Don't understand... HD4850 10W idle @ 68C

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Fri Aug 08, 2008 1:27 am

ATI's problem is that it doesn't matter what the reality of the card's lifetime as a function of temperature is.

Google could come out with a long-term study of a hundred thousand HD4850s showing that their mean time before failure idling at 80C is 25 years. The results would be rejected because they go against folks' preconceptions, as happened with the hard-drive study.

Perception is everything, facts are irrelevant. Alan Sugar once said "the customer is always right, always give the customer what they want", meaning that the customer is always right even when [s]he is wrong and what they want is stupid. Maybe Dave will learn that next time when he approves the fan speed tables for the BIOS in the 5850 :)

jaganath
Posts: 5085
Joined: Tue Sep 20, 2005 6:55 am
Location: UK

Post by jaganath » Fri Aug 08, 2008 2:57 am

So ATI magically came up with a solution to electromigration and interconnect degradation at high temperatures? Amazing. Why doesn't everyone else in the industry use it? You cling to that hard drive study like a sailor clinging to a bit of flotsam; you bang on about the "facts" but choose only the facts that suit your argument, i.e. the Google study. The incredibly short upgrade cycle of PC components, especially graphics cards, means the manufacturers simply do not have the time to do long-term burn-in tests to see how long they genuinely last; they just do a very quick, very high-temperature test and extrapolate the card's expected lifetime at more "normal" temperatures from those results. It's a rough-and-ready approach, and whatever uncertainties that method produces are seldom shown up, due to the aforementioned short shelf life before upgrade. But don't pretend manufacturers have done extensive burn-in testing of VGA cards at high temperatures for long periods of time, because they simply haven't.

CA_Steve
Moderator
Posts: 7650
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Post by CA_Steve » Fri Aug 08, 2008 7:28 am

My guess is that ATI defines "lifetime" in the consumer replacement cycle sense. So, if the cards end up with an MTBF of 2-3 years, then all is well.

What will be interesting to see is if AMD influences the ATI design methodology in a positive manner.

I'm not a fan of GPUs running at high temps... but I still have a passive 6600GT that runs at 80-90C under load in my secondary PC. It's not dead, yet. :D

dragmor
Posts: 301
Joined: Sun Jul 10, 2005 7:54 pm
Location: Oz

Post by dragmor » Sun Aug 10, 2008 7:24 pm

nafets wrote:But will continued high temps decrease the longevity of the video card. More than likely.
Maybe, but there is a usable life cycle as well, which with PC components is short: 5 to 10 years at best.

I've had a passive 9550 (now running in a relative's PC) since about when they came out (about 5 years ago?). I don't have AC, ambient temperatures in summer are around 35C to 40C, and the card idled around 90C with load at about 110C. It worked without error in those conditions till mid 2007, and from then to now in a relative's PC for the kids to play old games (ambient around 30C), with no complaints. The PC's probably on 6hrs a day, with different people in the house gaming.

Another thing is that electromigration problems might not show up as much in a GPU. A GPU flipping the colour of a pixel or something else might not be noticed, but a CPU could easily cause noticeable crashes.

quietnevbie
Posts: 18
Joined: Wed Dec 26, 2007 11:52 am

Post by quietnevbie » Tue Aug 12, 2008 11:17 am

Can I undervolt the 4850 in windows to test the limits?

Or is there a lowest possible voltage in the card BIOS that will still work with 160/500MHz?

Oh, and in all cases, is the ATITool artifact scanner still good for measuring video card stability? The latest version is from 2006...

tuz
Posts: 24
Joined: Tue Aug 12, 2008 4:51 am
Location: Melbourne

Post by tuz » Wed Aug 13, 2008 2:20 am

Other people seem to have success getting ATITool to run with their 4850, but it always hangs with my CPU @ 100% whenever I try to launch it. Updated graphics card drivers haven't fixed it; I have a Sapphire 4850, if you're wondering.

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Wed Aug 13, 2008 5:37 am

tuz wrote:Other people seem to have success getting ATITool to run with their 4850, but it always hangs with my CPU @ 100% whenever I try to launch it. Updated graphics card drivers haven't fixed it; I have a Sapphire 4850, if you're wondering.
I get the exact same problem, and I've got an HD3870; something must be up with our systems.

nafets
Posts: 89
Joined: Mon Feb 19, 2007 5:04 pm

My HD4870 + 2D IDLE results...

Post by nafets » Wed Aug 13, 2008 7:52 am

I had this whole post up on the TechPowerUp forums, but it got deleted somehow. *Update* It's also back on the TechPowerUp forums. It contains some information I already posted here on SPCR, and also some new stuff. I'm still working on a few things, and I'll keep this post updated...

.........................................

Ok, here is what I came up with during extensive testing with my HD4870.

Hardware used:
Sapphire HD4870 512MB

Software used:
Radeon Bios Editor v1.12 *New 8/10/08* Radeon Bios Editor v1.13 is now available. See the end of this post for updated details.
Gpu-Z v0.2.6 *New 8/12/08* Gpu-Z v0.2.7 is now available. See the end of this post for updated details.
ATIFlash 3.60
AMD GPU Clock Tool v0.9.8 (Referred to from now on as AGCT)

To use AGCT as a shortcut to set GPU/MEM clocks, use the following command line;
"AMDGPUClockTool.exe" -eng=x -mem=x

X is the clock rate in MHz.

This quickly applies the GPU and MEMORY clocks without opening up the AGCT program. You can have a shortcut for each state (2D/3DLP/3DHP), each with its own custom clocks, and apply them whenever necessary.
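If you end up juggling several of these shortcuts, the command lines are easy to generate. Here's a minimal Python sketch of the idea (the clock values are only examples; use your own stable values, and point the exe path at wherever your AGCT lives):

```python
# Build the AGCT shortcut target for a given GPU/MEM clock pair.
# The clocks below are only examples -- substitute your own stable values,
# and adjust the executable path to match your install.
def agct_command(eng_mhz, mem_mhz, exe='"AMDGPUClockTool.exe"'):
    """Return the command line that applies the given clocks via AGCT."""
    return f"{exe} -eng={eng_mhz} -mem={mem_mhz}"

# One shortcut per state (2D/3DLP/3DHP):
shortcuts = {
    "2D":   agct_command(160, 225),  # lowered idle clocks
    "3DHP": agct_command(750, 900),  # stock high-performance clocks
}
print(shortcuts["2D"])  # "AMDGPUClockTool.exe" -eng=160 -mem=225
```

Paste the generated string in as the shortcut's target, one shortcut per state.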

HD4870 BIOS Used:
Stock1.bin (This is the original BIOS my HD4870 came with)
Date: 06/09/08 16:25
Version String: 113-B50701-100
AtomBIOS Version: ATOMBIOSBK-ATI VER011.003.000.001.029254

2D: 500/900 @ 1.263V
3DLP: 500/900 @ 1.263V
3DHP: 750/900 @ 1.263V

Stock2.bin (This is a newer version of the above BIOS available on TechPowerUp)
Date: 06/17/08 11:20
Version String: 113-B50701-105
AtomBIOS Version: ATOMBIOSBK-ATI VER011.003.000.001.029335

2D: 550/900 @ 1.203V
3DLP: 550/900 @ 1.203V
3DHP: 750/900 @ 1.263V

I've named the two BIOSes as they are, just to make it easier to keep track of what is what.

.........................................

Idle power consumption and changes in voltage were measured with a Kill-A-Watt plug-through device.

.........................................

My testing is only in regards to the HD4870 and 2D GPU/MEM clocks, power consumption, and voltages. I do not do any video decoding, so any changes or adjustment to UVD settings in the BIOS (Clock Info settings 04, 05, and 06) have not been tested or implemented by me.

.........................................

Also, I think it's important to mention that I do not use, nor have installed, Catalyst Control Center (CCC). I am using the Catalyst 8.8 Beta drivers only (0807281307-8.52.2-080722a-066078E-ATI.5). For changing 3D settings I use the latest beta version of Ati Tray Tools (v1.4.7.1211). I am aware that there are ways to change GPU/MEM clocks and voltages by editing certain .XML files that are used with CCC, but I cannot comment on these methods.

.........................................

GPU Testing:

Before you mess around with any BIOS settings or start flashing different BIOSes it is highly recommended that you find the lowest possible, stable GPU clock for your HD4870. The simplest way to do this is to download AGCT and use it to test for yourself.

I found that I had no problems with stability or 2D performance with the GPU clock running at 160MHz. I settled at this value as the updated MSI BIOS for the HD4850 has a setting of 160MHz for 2D. Both the HD4850 and HD4870 use the same RV770 GPU core, so I consider this a safe setting for me.

-Method 1
Using AGCT to set the GPU clock, here are a few pictures of the temperatures and power consumption results;

Stock1.bin - 500/900 - 1.263V - 139W ------------------ Stock1.bin - 160/900 - 1.263V - 138W

Lowering the GPU clock from 500MHz to 160MHz yielded no temperature difference for me, and only 1W savings in power consumption.

When using AGCT to set the GPU clock, the setting stays locked in, regardless of whatever state you may be in (2D/3DLP/3DHP).

-Method 2
Using Radeon Bios Editor to set a lower GPU clock in the BIOS (Clock Info 01 + 07) is perfectly safe. Changing from 2D to 3DLP/3DHP states, and vice versa, works fine and no visual anomalies were detected.

When using an edited BIOS to set the GPU clock, the card will properly change settings based on whatever state you are in (2D/3DLP/3DHP).

-Bottom Line
Functionality and results of changing GPU clocks in the BIOS or via AGCT are exactly the same. Either method works great.

.........................................

MEMORY Testing: *New Information 8/13/08* See the end of this post for updated details.

Before you mess around with any BIOS settings or start flashing different BIOSes it is highly recommended that you find the lowest possible, stable MEMORY clock for your HD4870. The simplest way to do this is to download AGCT and use it to test for yourself.

I found that I had no problems with stability or 2D performance with the MEMORY clock running at 225MHz. Going lower than 225MHz resulted in "yellow snow" artifacting, and stability issues arose.

-Method 1
Using AGCT to set the MEMORY clock, here are a few pictures of the temperatures and power consumption results;

Stock1.bin - 500/900 - 1.263V - 139W ------------------ Stock1.bin - 500/225 - 1.263V - 101W

Lowering the MEMORY clock from 900MHz to 225MHz yielded a temperature difference of 6C to 11C, and 38W savings in power consumption.

As stated before, when using AGCT to set the MEMORY clock, the setting stays locked in, regardless of whatever state you may be in (2D/3DLP/3DHP). When changing the MEMORY clock there will be a single on-screen flash. Since the setting is locked in, you won't have repeated on-screen flashes after that.

-Method 2
Using Radeon Bios Editor to set a lower MEMORY clock in the BIOS (Clock Info 01 + 07) has a problem. When the video card changes states from 2D to 3DLP/3DHP or vice versa, a quick but noticeable on-screen flash is present. Simple things like opening or closing a window (say....Internet Explorer) will cause the card to quickly change states, with the on-screen flash happening with each state change. This is very annoying. Here is a picture that shows what I am talking about;

[screenshot]

The spikes next to the BIOS set 2D GPU and MEMORY clocks of 300MHz are just from opening and closing windows and using Gpu-Z. Clocks jump to the default BIOS set 3DHP speeds of 750/900, whenever this happens.

-Bottom Line
Functionality and results of changing MEMORY clocks in the BIOS or via AGCT are quite different. Using AGCT to set the MEMORY clock is very simple, safe and stable. There is only one on-screen flash, and the setting is locked in, with no further flashing. Using modified MEMORY clocks in the BIOS will result in repeated on-screen flashes, even when not gaming, and is highly unacceptable. You could run all three states at the same lower MEMORY clock, which would lower 2D power consumption, but I don't find that to be a reasonable solution.

.........................................

Before I move on to HD4870 voltage testing, here are pictures and temperature results of the combined, lowered GPU/MEMORY clocks while using AGCT;

Stock1.bin - 500/900 - 1.263V - 139W ------------------ Stock1.bin - 160/225 - 1.263V - 100W

Lowering the GPU clock from 500MHz to 160MHz and the MEMORY clock from 900MHz to 225MHz yielded a temperature difference of 7C to 12C, and 39W savings in power consumption.
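As a sanity check of the Kill-A-Watt readings above, the two savings are essentially additive, with nearly all of the benefit coming from the GDDR5 memory clock. A quick Python tally of the figures in this post:

```python
# Whole-system idle draw (W) from the Kill-A-Watt readings in this post.
stock    = 139  # 500/900 @ 1.263V
gpu_only = 138  # 160/900 @ 1.263V (GPU underclock alone)
mem_only = 101  # 500/225 @ 1.263V (MEMORY underclock alone)
combined = 100  # 160/225 @ 1.263V (both)

gpu_saving  = stock - gpu_only   # 1W
mem_saving  = stock - mem_only   # 38W
both_saving = stock - combined   # 39W

assert both_saving == gpu_saving + mem_saving  # the two savings add up
print(f"MEMORY underclock accounts for {mem_saving / both_saving:.0%} of the total")
```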

.........................................

VOLTAGE Testing: *New Information 8/13/08* See the end of this post for updated details.

I've learned from mat9v (poster at TechPowerUp forums), who got his information from W1zzard (Admin at TechPowerUp forums who made Gpu-Z), that the HD4870 has 4 power states [-0.2V, -0.1V, 0, +0.1V]. This corresponds to voltages of 1.063, 1.163, 1.263 (default), and 1.363. These are the figures I'm going by, as information on proper voltage ranges is hard to come by.
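In other words, the four offsets are applied to the default 1.263V. A small Python illustration of the mapping as reported here (note the Updates section at the end of this post later revises the actual usable range):

```python
# The four HD4870 power states, expressed as offsets from the default VDDC.
# Figures as quoted from mat9v/W1zzard above; later revised in the updates.
DEFAULT_VDDC = 1.263
OFFSETS = [-0.2, -0.1, 0.0, +0.1]

voltages = [round(DEFAULT_VDDC + off, 3) for off in OFFSETS]
print(voltages)  # [1.063, 1.163, 1.263, 1.363]
```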

-Method 1
I first tried using Radeon Bios Editor to set lower VOLTAGES in the BIOS (Clock Info 01 + 07), with the Stock1.bin BIOS, which came with my HD4870. I changed the default 1.263V to 1.063V. Saved the BIOS, flashed it to the video card, and rebooted.

Upon booting into Windows there was no noticeable temperature change, or power consumption difference. I figured the HD4870 wasn't responding to different VOLTAGE settings in the BIOS.

I then tried the newer BIOS that was available on TechPowerUp, the only one with different stock VOLTAGE settings for 2D and 3DLP. The Stock2.bin BIOS has a value of 1.203V instead of 1.263V. It also has a slightly higher 2D and 3DLP GPU clock of 550MHz, instead of 500MHz.

Using the Stock2.bin BIOS with its settings unchanged, here are a few pictures of the temperatures and power consumption results;

Stock1.bin - 500/900 - 1.263V - 139W ------------------ Stock2.bin - 550/900 - 1.203V - 132W

Lowering the VOLTAGE from 1.263V to 1.203V yielded a temperature difference of about 1C, and 7W savings in power consumption. Pretty good. Setting the GPU clock to 500MHz, instead of 550MHz, yielded no decrease in temperature or power consumption.

So with this new Stock2.bin BIOS it's now possible to change the VOLTAGES in the BIOS, with the HD4870 actually responding to them, for some strange reason. Now let's see how the HD4870 responds with 1.063V in 2D;

Stock1.bin - 500/900 - 1.263V - 139W ------------------ Stock2.bin - 550/900 - 1.063V - 124W

Lowering the VOLTAGE from 1.263V to 1.063V yielded a temperature difference of 2C to 4C, and 15W savings in power consumption. Excellent. Setting the GPU clock to 500MHz, instead of 550MHz, yielded no decrease in temperature or power consumption.

-Method 2
While AGCT supports changing VOLTAGES (VDDC, VDDCI, MVDDC, VDDQ), I didn't have any luck getting the HD4870 to respond to any of those command line settings.

It is possible however to revert back to the default 2D GPU and MEMORY clocks and the default 2D VOLTAGE listed in the BIOS (Clock Info 01 + 07), by opening the AGCT program and clicking on "Restore Default Clocks". Using the command line "-restore" setting does not work for the VOLTAGE, and only sets the card back to the default 2D GPU and MEMORY clocks listed in the BIOS.

-Bottom Line
Functionality and results of changing VOLTAGES in the BIOS or via AGCT are quite different. I had to use the Stock2.bin BIOS, as changing settings in the Stock1.bin BIOS, which came with my card, didn't work properly. Setting the 2D VOLTAGE to 1.063V in the Stock2.bin BIOS was simple, stable, and safe. I had no performance issues or stability problems running at the lower VOLTAGE. While the lower 2D VOLTAGE is nice, the improvement in power consumption and temperatures isn't as good as lowering the MEMORY clocks, something that cannot currently be done properly in the BIOS. While AGCT didn't set VOLTAGES properly for me, it is possible to use the program along with a modified BIOS to get the best of both worlds: lower 2D GPU/MEMORY clocks and lower VOLTAGES. I will now show you how.

.........................................

Wrap-Up and Final Thoughts:

So what did all this testing, endless blathering, and wasted time come up with? Well, if you're looking for a final fix (an all-in-one BIOS) for HD4870 2D power consumption with modified GPU/MEMORY/VOLTAGES, you won't find it. The real killer is the on-screen flashing that is present when using varying MEMORY clocks in the BIOS. You could just settle for lowered GPU clocks and lowered VOLTAGES in 2D, but that's only half the solution. If you really want to squeeze out the most savings, here is how I did it. This is not a set-it-and-forget-it solution, nor will it function ideally in any and all situations:

1. Start with the Stock2.bin BIOS.

2. You will need to edit the Clock Info settings (01, 02, 07, and 08) with your optimum lowered GPU/MEMORY settings plus a VOLTAGE of 1.063V. They will all be the same settings. For me this would be 160/225 @ 1.063V.

3. You will need to edit the Clock Info settings (03 and 09) with your optimum lowered GPU/MEMORY settings plus a VOLTAGE of 1.263V. Both will be the same setting. For me this would be 160/225 @ 1.263V. It should look something like this;

[screenshot]

4. Make two shortcuts in your Start menu or on your desktop. One of them will be an AGCT quick setting and one will be a link to the AGCT program. The AGCT quick setting will be your 3D settings, for gaming ("AMDGPUClockTool.exe" -eng=750 -mem=900). The other is for opening the AGCT program to use the "Restore Default Clocks" button.

5. When you boot up into Windows and get to the desktop, the HD4870 will be running at 160/225 (or whatever your optimum lowered clocks are) @ 1.063V. It will not change GPU/MEMORY clocks, as they are locked in. The VOLTAGE however will fluctuate between 1.063V and 1.263V, whenever you open or close windows, but without any on-screen flashing as was evident with modified MEMORY clocks in the BIOS.

6. When you want to game, click on the AGCT 3D shortcut, and after one on-screen flash, the HD4870 will be running at 750/900 (or whatever you set for gaming) @ 1.263V. I have seen that the VOLTAGE does not fluctuate during gaming, and I have run numerous benchmarks and logged quite a few hours gaming; for me, performance is exactly the same as before all this modified BIOS/AGCT stuff.

7. When you're done gaming, click on the AGCT program shortcut, and click on the "Restore Default Clocks" button. After one on-screen flash, the HD4870 will now revert back to 160/225 (or whatever your optimum lowered clocks are) @ 1.063V.

I have tested this setup for a number of days now, and it works great. It's not as simple and carefree as a BIOS with all the proper settings, but with the HD4870 and its GDDR5 memory, that just isn't possible yet, because of the lousy on-screen flashing. My total power consumption now at 160/225 @ 1.063V is 94W, compared to 139W at 500/900 @ 1.263V. A total savings of 45W. Not bad.

.........................................

Phew. What a pain in the ass. I should have just gotten a GTX 260. But seriously, I am open to any and all comments and criticisms. Maybe I missed something, or am wrong about my methods and testing. Post up your thoughts and experiences; I'm interested to see what others come up with. It would be nice if ATI had gotten this 2D power consumption stuff right in the first place, but they didn't. In time it might be fixed or improved, but for now, this is sort of a plausible solution...

.........................................

Updates:
Regarding *Radeon Bios Editor v1.13*
Many great new features have been added to the newest version of Radeon Bios Editor. One of which is allowing (unlocking) voltage settings for the Stock1.bin BIOS (and its various manufacturer variants based off of it). So the problem I reported at the beginning of the VOLTAGE testing section (HD4870 not responding to modified BIOS voltages) is now easily fixed. Simply put, you can now have the same functionality of the Stock2.bin BIOS, but with whatever Stock1.bin BIOS variant you may be using (ex........PowerColor, VisionTek, Gigabyte....etc). Add in your own custom GPU clocks, VOLTAGES, and a modified fan profile, and you're all set to go. Good stuff.

Regarding *Gpu-Z v0.2.7*
Many great new features have been added to the newest version of Gpu-Z. One of which is VRM monitoring for the HD4870. This allows you to see the Amps used, temperatures of the three digital VRM chips, and the current VDDC voltage. Previously you had to use a licensed beta copy of Everest to get VRM temps and Amps. Very nice!

One problem is that if you're using AGCT to set your GPU/MEMORY clocks, it will disable all of the above VRM monitoring. If you need to see those specific temperatures/readouts, simply reboot your PC and allow the HD4870 to run at the GPU/MEMORY clock speeds specified in the BIOS.

I haven't yet found a way around this that allows using AGCT while still getting the VRM monitoring to show up.

Regarding *MEMORY testing*
I was just dumbfounded after reading the TechPowerUp review of the HD4870 X2 2GB, mainly to do with its IDLE power consumption. System power consumption for the HD4870 was measured at 158W and the HD4870 X2 at 160W. It seems ATI successfully managed to tone down the GDDR5 memory clocks (and I'm guessing without any on-screen flashing). Considering that the HD4870 X2 is merely two HD4870s on one PCB plus the PLX chip, one wonders: how did they do it?

The answer is right in the review;
W1zzard wrote:In order to keep the power consumption down of their card AMD has implemented R600 style 2D/3D clocks in their card - the dynamic power management of the RV770 is not used at all.
On the HD 4850 and HD 4870 the GPU is configured in a way to dynamically adjust the clock frequencies based on GPU load, without any software intervention - this is disabled now. Instead the driver detects if the card is running in 2D or fullscreen (!) 3D and switches clocks accordingly. In 2D and windowed 3D the card will always run at 500 MHz / 500 MHz to conserve power. Also it seems CrossFire is disabled when running windowed 3D apps. I did a quick test with Crysis (same settings for both tests) and got 27.1 FPS in a window and 54.4 FPS in fullscreen 3D, quite a difference.
So as we've seen, and have known for a while now, the HD4870's dynamic power management (state switching based on GPU load), combined with the HD4870's retraining of its memory controller and GDDR5 memory (see this post, regarding GDDR5), is leading to this on-screen flashing problem.

This brings hope that ATI could possibly use the same method of state switching as the HD4870 X2 on the HD4870, which, combined with a properly set BIOS with lowered 2D GPU/MEMORY/VOLTAGES, would give us the lowered power consumption that the card should have had in the first place.

As far as running windowed 3D applications, one could just use AGCT to lock in higher GPU/MEM clocks, if necessary.

Regarding *VOLTAGE testing*
I was recently checking out settings with Ati Tray Tools and clicked on the System Information section. Reading through quite a bit of detailed information on my HD4870's current settings, I noticed the ATI OverDrive 5 VDDC voltage range. It's listed from a MIN of 1.083v to a MAX of 1.263v. Jackpot! Also in Ati Tray Tools, in the Overclocking section, 4 voltage states are shown in a drop-down box: 1.083v, 1.143v, 1.203v, and 1.263v. If that 1.203v looks familiar, it's from the Stock2.bin BIOS. :) This is pretty darn close to the first range of values I started with (1.063v, 1.163v, 1.263v, and 1.363v), with the exception of the one overvoltage.

It's fairly evident that the voltage steppings and ranges are set in the HD4870 BIOS by ATI or the video card manufacturers (I don't know which). So then I did some more testing (does it ever end?), to see what the differences in power consumption are for these voltages and others inside and outside the range (if the card even responds to them).

I used the Stock2.bin BIOS and altered the GPU/MEMORY clocks to 375/450. After a whole lot of BIOS flashing here's what I came up with;

Code: Select all

BIOS set VOLTAGE  |  Power Consumption  |  Gpu-Z VDDC Current
1.063 (-0.2v)        104W                  12.2A
1.083 (-0.18v)       104W                  12.2A
1.143 (-0.12v)       108W                  13.5A
1.163 (-0.1v)        110W                  14.9A
1.203 (-0.06v)       110W                  14.9A
1.217 (-0.046v)      114W                  14.9A
1.263 (Default)      114W                  14.9A
1.323 (+0.06v)       104W                  12.2A
1.363 (+0.1v)        104W                  12.2A
It's pretty clear from analyzing the results that the maximum VOLTAGE for the Stock2.bin BIOS is 1.263v and the minimum VOLTAGE is 1.083v. There are two steppings in between, at 1.143v and 1.203v. Any values set over or under this range are set at 1.083v. Any values in the range that are not one of the steppings run at whatever stepping is closest.
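That behaviour can be modelled as a clamp-plus-snap rule. This is a Python sketch of the rule as stated above, not anything ATI documents (and note the 1.217v row in the table doesn't quite fit it, so take it as an approximation):

```python
# Voltage steppings available in the Stock2.bin BIOS, per the table above.
STEPPINGS = [1.083, 1.143, 1.203, 1.263]

def applied_vddc(requested):
    """Model the rule described above: out-of-range values fall back to the
    minimum stepping; in-range values run at the nearest stepping."""
    if requested < STEPPINGS[0] or requested > STEPPINGS[-1]:
        return STEPPINGS[0]
    return min(STEPPINGS, key=lambda s: abs(s - requested))

print(applied_vddc(1.363))  # 1.083 (over range falls back to the minimum)
print(applied_vddc(1.150))  # 1.143 (snaps to the nearest stepping)
```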

I'm not sure why the amperage is the same for 1.263v and 1.203v. It could be an error in what is reported to Gpu-Z, or something that I'm unaware of. I retested those values a few times and got the same result. Interesting. I should note that the Amps do fluctuate during 2D operation (I have Gpu-Z open right now, and I can see the variations as I type this). I'd say this is from state switching, courtesy of ATI's PowerPlay.

Another thing to add is that Gpu-Z reports the same VDDC voltage of 1.2625v, regardless of what your HD4870 is running at. I'm guessing it's not reporting properly.

All in all, I'm pretty confident now as to what voltages are available to us with this Stock2.bin BIOS. In the future, newer BIOSes could have expanded voltage ranges, or there may be some unlocking of current BIOSes' voltage ranges. Time will tell.
Last edited by nafets on Wed Aug 13, 2008 12:18 pm, edited 1 time in total.

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Wed Aug 13, 2008 9:09 am

nafets wrote:For me this would be 160/225 @ 1.063V.
So, what kind of power consumption did you see with these clocks? Also, is the GPU voltage tied to the GDDR voltage? I haven't really read up on the 4800 series much. Could you have tried 160/900 with 1.063v?

edit: oh, and thanks for all the hard work!

quietnevbie
Posts: 18
Joined: Wed Dec 26, 2007 11:52 am

Post by quietnevbie » Wed Aug 13, 2008 10:29 am

Thank you for all this information, nafets. I wonder how much different a 4850 with GDDR3 is regarding all this tweaking..

..I'll check that AGCT & gpu-z 0.2.7

EDIT: I get the message "SetClock failed! Please check device configuration." from AGCT if I try to Set Clocks.. how can I change the clockspeed? =/
Last edited by quietnevbie on Wed Aug 13, 2008 10:55 am, edited 1 time in total.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Wed Aug 13, 2008 10:46 am

Thanks for all the info. It gives us hope that ATI may eventually fix the power issues, as it does at least prove that the cards can change voltage and have quite extensive facilities for monitoring.

davemuk
Posts: 30
Joined: Sun Jun 10, 2007 11:27 am

Post by davemuk » Wed Aug 13, 2008 12:11 pm

quietnevbie wrote:Thank you for all this information, nafets. I wonder how much different a 4850 with GDDR3 is regarding all this tweaking..
I would very much like to know that too, i.e. what's the lowest power consumption you can get a 4850 down to in 2D?

Dave

nafets
Posts: 89
Joined: Mon Feb 19, 2007 5:04 pm

Post by nafets » Wed Aug 13, 2008 12:48 pm

ryboto wrote:So, what kind of power consumption did you see with these clocks?
You must have missed it;

"My total power consumption now at 160/225 @ 1.063V is 94W, compared to 500/900 @ 1.263 being 139W. A total savings of 45W. Not bad."

This is in 2D (IDLE) state.
ryboto wrote:Also, is the GPU voltage tied to the GDDR voltage?
I have read on other forums (Xtremesystems) that when modifying the voltage in the BIOS, it not only changes the GPU core voltage (VDDC) but also the MEMORY voltage (MVDDC). The only way to verify this is with a digital multimeter, which I do not have.
ryboto wrote:I haven't really read up on the 4800 series much. Could you have tried 160/900 with 1.063v?
I can and I did. I updated my post with the proper voltage ranges; it's now 1.083v that is currently the lowest for the HD4870. Here is a small comparison for you.

500/900 @ 1.263 - 139W
160/900 @ 1.263 - 138W
500/900 @ 1.083 - 124W
160/900 @ 1.083 - 124W

As I showed before, going from 500MHz to 160MHz only yielded a 1W power consumption decrease. At a lower VOLTAGE (1.083v), the lowered GPU clock (160MHz) doesn't do much of anything.
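Put as arithmetic, those four readings show the voltage drop alone accounts for 15W, while the GPU underclock saves 1W at stock voltage and nothing at the lowered voltage. A quick Python restatement:

```python
# Idle draw (W) for the four GPU-clock/voltage combinations listed above.
draw = {
    (500, 1.263): 139,
    (160, 1.263): 138,
    (500, 1.083): 124,
    (160, 1.083): 124,
}

clock_saving_at_stock = draw[(500, 1.263)] - draw[(160, 1.263)]  # 1W
clock_saving_at_low   = draw[(500, 1.083)] - draw[(160, 1.083)]  # 0W
voltage_saving        = draw[(500, 1.263)] - draw[(500, 1.083)]  # 15W
print(clock_saving_at_stock, clock_saving_at_low, voltage_saving)  # 1 0 15
```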
quietnevbie wrote:Thank you for all this information, nafets. I wonder how much different a 4850 with GDDR3 is regarding all this tweaking..

..I'll check that AGCT & gpu-z 0.2.7

EDIT: I get the message "SetClock failed! Please check device configuration." from AGCT if I try to Set Clocks.. how can I change the clockspeed? =/
From what I've read the GDDR3 memory doesn't run much lower than 450-500MHz in 2D (IDLE) state, without some instability. Obviously this is on a case-by-case basis. Every card is different. The great thing is that the HD4850 can fully utilize a BIOS with modified GPU/MEMORY/VOLTAGE settings, with none of the problems that you see with an HD4870.

I don't think you'll see the kind of power consumption decrease seen with the HD4870 by underclocking the MEMORY on the HD4850. GDDR3 operates differently from GDDR5, and the extra VRM complexity and circuitry of the HD4870 adds to its increased power consumption.

I'm not familiar with how AGCT may or may not function with the HD4850. I do get "SetClock failed!" errors every now and then. Most of the time I can just close the AGCT program, then open it again, and it works fine.

Try using the AGCT shortcut method of changing GPU/MEM clocks listed in my post, and see if that works. Just have one shortcut for the test settings and one for the default settings.

If using Catalyst Control Center you should also be able to use ATI OverDrive 5 to raise or lower your GPU/MEMORY clocks. If and when you find a stable combination, you can edit your BIOS, and flash your card with the modified settings. Good luck...

Vicotnik
*Lifetime Patron*
Posts: 1831
Joined: Thu Feb 13, 2003 6:53 am
Location: Sweden

Post by Vicotnik » Wed Aug 13, 2008 1:27 pm

Great work nafets. :) Btw, ALT+PrintScrn captures just the active window, and PNG is way better than JPEG for those kinds of images (perfect quality, smaller size).

davemuk
Posts: 30
Joined: Sun Jun 10, 2007 11:27 am

Post by davemuk » Wed Aug 13, 2008 1:44 pm

nafets wrote:From what I've read the GDDR3 memory doesn't run much lower than 450-500MHz in 2D (IDLE) state, without some instability.
That's a shame; it seems reducing the memory speed returns the biggest power savings. Does 160/500 for a 4850 reduce its power consumption by much?

* Do you have any links about reducing 4850 power usage I could follow up on ?

TIA
Dave

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Wed Aug 13, 2008 2:07 pm

nafets wrote: 500/900 @ 1.263 - 139W
160/900 @ 1.263 - 138W
500/900 @ 1.083 - 124W
160/900 @ 1.083 - 124W

As I showed before, going from 500MHz to 160MHz only yielded a 1W power consumption decrease. When going to a lower VOLTAGE (1.083v), the lowered GPU clock (160MHz) doesn't do much of anything.
Interesting. There seems little point in going lower than 500MHz if the card is stable at that speed and voltage.

nafets
Posts: 89
Joined: Mon Feb 19, 2007 5:04 pm

Post by nafets » Wed Aug 13, 2008 3:09 pm

davemuk wrote:That's a shame; it seems reducing the memory speed returns the biggest power savings. Does 160/500 for a 4850 reduce its power consumption by much?

* Do you have any links about reducing 4850 power usage I could follow up on ?

TIA
Dave
Everything I know about the HD4850 is from what I've read on forums and in reviews, so I can't be certain of everything regarding the HD4850.

I really haven't seen much regarding actual power consumption figures of the older and newer HD4850 BIOSes. Most people do know that it runs rather hot with stock cooling (and stock fan speed settings), and isn't much of a power miser at 2D IDLE.

The default 2D IDLE clocks for the HD4850 are 500/750 @ 1.046v. Newer HD4850's with updated BIOSes are set at 160/500 @ 1.046v.

If the GPU core on the HD4850 operates anything like the HD4870 (which it probably does), reducing the GPU core clock from 500MHz to 160MHz in 2D IDLE does little to nothing for power consumption (maybe 1W savings) and temperatures.

The decrease in MEMORY clock from 750MHz to 500MHz isn't very big either, and as I said before, there probably isn't much savings to be had there.

The best bet for anyone with an HD4850 is to do the legwork yourself and test whether lower GPU and MEMORY clocks make any appreciable difference in power consumption and temperatures, as I did with the HD4870...

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Wed Aug 13, 2008 4:22 pm

nafets wrote:
ryboto wrote:So, what kind of power consumption did you see with these clocks?
You must have missed it:

"My total power consumption now at 160/225 @ 1.063V is 94W, compared to 500/900 @ 1.263 being 139W. A total savings of 45W. Not bad."

This is in 2D (IDLE) state.
ryboto wrote:Also, is the GPU voltage tied to the GDDR voltage?
I have read on other forums (Xtremesystems) that when modifying the voltage in the BIOS, it not only changes the GPU core voltage (VDDC) but also the MEMORY voltage (MVDDC). The only way to verify this is with a digital multimeter, which I do not have.
ryboto wrote:I haven't really read up on the 4800 series much. Could you have tried 160/900 with 1.063v?
I can and I did. I updated my post with the proper voltage ranges, and it's now 1.083v that is the lowest currently for the HD4870. Here is a small comparison for you.

500/900 @ 1.263 - 139W
160/900 @ 1.263 - 138W
500/900 @ 1.083 - 124W
160/900 @ 1.083 - 124W

As I showed before, going from 500MHz to 160MHz only yielded a 1W power consumption decrease. When going to a lower VOLTAGE (1.083v), the lowered GPU clock (160MHz) doesn't do much of anything.
Thanks for pointing all that out. I must confess, I was at work and skimmed through some of your post, guess I didn't do as good a job as I thought. It seems strange that the GDDR5 is what's causing the high power draw, considering all the talk about the power efficiency of GDDR5 over the previous generation.
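For a sense of scale, the 45W idle saving nafets reported (139W down to 94W) can be turned into a rough yearly figure. This back-of-envelope sketch assumes the machine idles 24/7 and an illustrative electricity price of €0.15/kWh; both are assumptions, only the wattages come from the thread:

```python
# nafets' figures: 139W stock 2D idle vs 94W at 160/225 @ 1.063v.
saving_w = 139 - 94            # watts saved at 2D idle (from the thread)
hours_per_year = 24 * 365      # assumes the card idles around the clock
kwh_per_year = saving_w * hours_per_year / 1000
price_per_kwh = 0.15           # assumed price in EUR, for illustration only

annual_cost_saving = round(kwh_per_year * price_per_kwh, 2)
print(kwh_per_year)        # 394.2
print(annual_cost_saving)  # 59.13
```

Real savings would be lower since no machine idles all year, but even a few hours a day adds up.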

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Thu Aug 14, 2008 7:22 am

Club 3D has released a new version of the HD 4850. Instead of the stock cooler, it seems to have an Arctic Cooling Accelero L1 OEM cooler with an 80mm fan.
pictures here: http://hard-pc.pl/czytaj/news/club3d-hd ... ro-l1.html

That looks pretty nice. What I noticed is that Club 3D hasn't opted to cool the VRMs in any way. Perhaps I should put a few aluminium heatsinks on there, just in case, since my retailer is going to get me one of those. 9800GTX+ cards are completely out of stock and 30-40€ more expensive.

I just wonder how fast that thing will spin. I hope it's sub 2500 RPM; a good quality 80mm fan is quiet at 2000 RPM...

davemuk
Posts: 30
Joined: Sun Jun 10, 2007 11:27 am

Post by davemuk » Thu Aug 14, 2008 10:09 am

thejamppa wrote:Club 3D has released a new version of the HD 4850. Instead of the stock cooler, it seems to have an Arctic Cooling Accelero L1 OEM cooler with an 80mm fan.
pictures here: http://hard-pc.pl/czytaj/news/club3d-hd ... ro-l1.html

That looks pretty nice. What I noticed is that Club 3D hasn't opted to cool the VRMs in any way. Perhaps I should put a few aluminium heatsinks on there, just in case, since my retailer is going to get me one of those. 9800GTX+ cards are completely out of stock and 30-40€ more expensive.

I just wonder how fast that thing will spin. I hope it's sub 2500 RPM; a good quality 80mm fan is quiet at 2000 RPM...
Powercolor are releasing a passive 4850

ReelMonza
Posts: 41
Joined: Mon Nov 05, 2007 8:20 am

Post by ReelMonza » Thu Aug 14, 2008 2:40 pm

thejamppa wrote:...I just wonder how fast that thing will spin. I hope it's sub 2500 RPM; a good quality 80mm fan is quiet at 2000 RPM...
I have one here, it spins at 2000RPM but it's quite noisy. It only gets "silent" at 1300~1500RPM

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Fri Aug 15, 2008 12:21 am

ReelMonza wrote:
thejamppa wrote:...I just wonder how fast that thing will spin. I hope it's sub 2500 RPM; a good quality 80mm fan is quiet at 2000 RPM...
I have one here, it spins at 2000RPM but it's quite noisy. It only gets "silent" at 1300~1500RPM
I can live with that then, as long as the fan doesn't have a bad-quality or bearing sound and isn't intrusive. My game rig has an Antec Big Boy at 21-23 dBA on low, a Corsair HX 620W at 22 dBA, a Nexus exhaust at 12v at 23 dBA, plus 2x Slipstreams undervolted to 800 RPM at around 19 dBA. If I count correctly, my computer is somewhere between 25 and 27 dBA overall, and maybe a little over 30 dBA at full tilt.

It's quiet enough not to be heard over 1 meter away in normal ambience when idling, and it doesn't get too noisy when gaming either; by then you're focused on something else or wearing headphones anyway.

So there's a very good chance the L1 won't be intrusively noisy. If it is, goodbye L1, hello S1 ^^

quietnevbie
Posts: 18
Joined: Wed Dec 26, 2007 11:52 am

Post by quietnevbie » Fri Aug 15, 2008 3:32 pm

nafets wrote:I'm not familiar with how AGCT may or may not function with the HD4850. I do get "SetClock failed!" errors every now and then. Most of the time I can just close the AGCT program, then open it again and it works fine.

Try using the AGCT shortcut method of changing GPU/MEM clocks, listed in my post, and see if that works. Just have one shortcut as the test settings and one as the default settings.

If using Catalyst Control Center you should also be able to use ATI OverDrive 5 to raise or lower your GPU/MEMORY clocks. If and when you find a stable combination, you can edit your BIOS, and flash your card with the modified settings. Good luck...
Hmm, I tried AGCT, AGCT command line and ATI Tray Tools overclocking options, but nothing can change the clockspeeds of my card. Perhaps it's because I don't have CCC installed or because of 4850 release drivers.

Oh well, I'll try some months later. Sounds promising that 4850 would not have similar problems with memory underclocking.

ReelMonza
Posts: 41
Joined: Mon Nov 05, 2007 8:20 am

Post by ReelMonza » Fri Aug 15, 2008 4:23 pm

Have you tried the latest beta?
ATi Tray Tools 1.5.8.1250 Beta is available at ComputerBase (registration needed)
I'm not sure if it supports the HD 4800 series :?

ReelMonza
Posts: 41
Joined: Mon Nov 05, 2007 8:20 am

Post by ReelMonza » Sat Aug 16, 2008 5:42 pm

ATT BETA 1.5.8.2250

Already implemented:
1. Temperature reading
2. Overclocking with voltage control (on the 4850 you can set one step higher voltage than the stock 3D voltage; tested by me with a multimeter)
3. FAN control

quietnevbie
Posts: 18
Joined: Wed Dec 26, 2007 11:52 am

Post by quietnevbie » Sun Aug 17, 2008 4:20 am

Yes, this ATT beta is the only utility that has allowed me to change clockspeeds. But I cannot underclock the core at all, because 500 is the lowest value, and I can only underclock the memory by 10MHz at a time because the lowest selectable value is the current clock minus 10MHz... and then I have to reopen the window to underclock further.

500/500 was stable and lowered the temperatures a bit. There was no flicker when changing the memory clock. This is with a Gigabyte 4850.

FartingBob
Patron of SPCR
Posts: 744
Joined: Tue Mar 04, 2008 4:05 am
Location: London
Contact:

Post by FartingBob » Sun Aug 17, 2008 1:35 pm

I'll keep an eye on this; it shouldn't be too long before it fully supports the 48xx series and I can ditch CCC.

nafets
Posts: 89
Joined: Mon Feb 19, 2007 5:04 pm

Post by nafets » Sun Aug 17, 2008 6:12 pm

quietnevbie wrote:Yes, this ATT beta is the only utility that has allowed me to change clockspeeds. But I cannot underclock the core at all, because 500 is the lowest value, and I can only underclock the memory by 10MHz at a time because the lowest selectable value is the current clock minus 10MHz... and then I have to reopen the window to underclock further.

500/500 was stable and lowered the temperatures a bit. There was no flicker when changing the memory clock. This with a Gigabyte 4850.
You need to change the Overclock and Downclock % values. This will allow you a larger range to set your HD4850 to.

To expand these limits go to Tools & Options -> General Options -> Advanced Tab. Set Overclock limit % to 200 and Downclock limit % to 100.

Obviously you can use lower values, to shorten the range...
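nafets' limit percentages seem to map to a clock range taken relative to the card's default clock. This sketch is my reading of how those settings behave, not documented ATT behaviour; the 500MHz default and the 200%/100% limits are the values from the posts above:

```python
# Hypothetical model of ATT's Overclock/Downclock limit percentages:
# the selectable clock range appears to be default * (1 - down/100)
# up to default * (1 + over/100).
def clock_range(default_mhz, overclock_limit_pct, downclock_limit_pct):
    low = default_mhz * (1 - downclock_limit_pct / 100)
    high = default_mhz * (1 + overclock_limit_pct / 100)
    return max(low, 0), high  # clamp: a negative clock makes no sense

# HD4850 default 500MHz core with the suggested 200%/100% limits:
low, high = clock_range(500, 200, 100)
print(low, high)  # 0.0 1500.0
```

With Downclock limit at 100% the slider can reach all the way down, which is consistent with quietnevbie later being able to try a ~50MHz core clock.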

quietnevbie
Posts: 18
Joined: Wed Dec 26, 2007 11:52 am

Post by quietnevbie » Mon Aug 18, 2008 6:22 am

That works, thanks.

So I have a 4850 512mb GDDR3. BIOS is 011.004.000.000.029193 (113-B50102-105)

Lowest possible memory clock is 500MHz for me. Lower than that quickly kills the video signal when running the ATT artifact tester. 472MHz is the lowest memory clock that doesn't show visual errors (471MHz does).

"Lowest" possible core clock is ~50MHz for me. Lower than that makes the screen jump a bit, though the ATT artifact tester finds no errors even with a 40MHz core, for example.

ATT showed 1.044v at default. With 50/500, 1.006v seems to be a stable voltage. 0.968v kills the video signal when I try the artifact tester (and it finds artifacts).

Rough estimates of card temperatures @ almost idle.
Default: 49C-50C-52C-48C (GPU-DISPIO-MEMIO-SHADERCORE)
50/500/1.006: 47C-47C-49C-45C

Cooling is an almost completely passive Accelero S1 rev 2.

edit:
Also, my ATI OverDrive 5 VDDC voltage range is Min 0.892v - Max 1.158v.

nafets
Posts: 89
Joined: Mon Feb 19, 2007 5:04 pm

Post by nafets » Mon Aug 18, 2008 7:43 am

Great results Quietnevbie. Thanks for the info!

I'm guessing you're seeing a noticeable drop in temps (and maybe power consumption) going from the default 500/750 @ 1.046v (or 160/500 @ 1.046v) to your custom 2D IDLE setting of 50/500 @ 1.006v.

Also, what is your exact range, with steppings? I'm guessing it goes in 0.036v increments. It's odd that you can't use the lowest voltage setting in your range with stability on your HD4850. I wonder why ATI would set such a range.

With my HD4870 the lowest voltage setting works fine. Interesting...

Post Reply