What's the current preferred software to underclock/volt?

They make noise, too.

Brian
Posts: 177
Joined: Wed Jan 18, 2006 3:41 pm
Location: Buffalo, NY

What's the current preferred software to underclock/volt?

Post by Brian » Tue May 08, 2007 4:49 am

There's loads of overclocking software out there; which one is good?

I've got a GeForce 7800GT, and I primarily want to reduce power consumption at idle, and rein in 3D clockspeed a bit.

Thanks.

tehfire
Posts: 530
Joined: Mon Jan 01, 2007 9:57 am
Location: US

Post by tehfire » Tue May 08, 2007 9:23 am

I personally use ATITool (yes, it works on nVidia cards). Another option is to get the coolbits registry hack and then over/underclock via the nVidia control panel.
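
If you go the coolbits route, it's just a DWORD value the driver looks for. Here's a rough Python sketch (the key path and the value 3 are what older ForceWare releases typically use, so treat them as placeholders and check a coolbits guide for your driver version):

Code:

# Sketch only: enable the hidden clock controls in the nVidia control panel.
# The registry branch and the CoolBits value vary with driver version.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # common location, may differ

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

The clock controls should then show up in the driver control panel (you may need to reboot).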

angelkiller
Posts: 871
Joined: Fri Jan 05, 2007 11:37 am
Location: North Carolina

Post by angelkiller » Tue May 08, 2007 9:37 am

With a BIOS mod, you can lower the 2D clock settings without having a program do it for you.

burebista
Posts: 402
Joined: Fri Sep 02, 2005 12:05 am
Location: Romania

Post by burebista » Tue May 08, 2007 11:36 am

RivaTuner. It's a gem. Dynamic underclocking/overclocking based on a trigger you define yourself (mine is hardware acceleration), fan control, and lots of on-screen info about the video card (clock, temperature, video memory usage) and the system (voltages, temperatures, CPU speed/load).
But keep in mind that underclocking without reducing voltages is less effective power/temperature-wise.

It's amazing; for me it's the RMClock equivalent for the GPU.

Brian
Posts: 177
Joined: Wed Jan 18, 2006 3:41 pm
Location: Buffalo, NY

Post by Brian » Tue May 08, 2007 1:02 pm

angelkiller wrote:With a BIOS mod, you can lower 2D clock settings, without having a program do it for you.
I actually tried NiBiTor and Mr. Yevdokimenko's "Nvidia BIOS Modifier", but neither one appears to detect my GeForce 7.

I've tried RivaTuner in the past, but I didn't like it because it doesn't let me underclock very far below stock 2D levels, and the interface is stuck in Luna mode. Icky.

ATiTool fits my decor better, and I'm running my GeForce 7 at 10MHz right now. Awesome. Later tonight, I'll measure what impact clockspeed and VCore have on system power draw.

But does ATiTool do undervolting? Because, as BureBista points out, that's where the big power savings are.

Brian
Posts: 177
Joined: Wed Jan 18, 2006 3:41 pm
Location: Buffalo, NY

Post by Brian » Tue May 08, 2007 4:01 pm

Hmm. I've read it on the SPCR forum before, but I didn't believe it until I saw it with my own eyes: Underclocking the GPU to 4% of stock saves only a single watt when idling at the desktop. Put another way, the GeForce 7 already does a good job of keeping GPU power consumption down at idle.

XBit says my card idles at 20W. I recorded a 1W drop when the GPU clock (and thus GPU power consumption) was dropped to 4% of its original value. Thus, components on the card other than the GPU use 19W at idle. That's a bit of a puzzle to me. I can't see any other power-hungry components: RAM, voltage regulators, clock generator. What's drawing all the power?
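
For what it's worth, here's the back-of-the-envelope arithmetic, assuming the clock-proportional part of idle power scales linearly with frequency (the 20W is XBit's number, the 1W drop is my wall reading):

Code:

# If only the clock-proportional part of idle power changes when the GPU clock
# drops to 4% of stock, a 1W drop implies that part was only ~1W to begin with.
idle_total_w = 20.0      # XBit's idle figure for the card
clock_fraction = 0.04    # 10MHz vs the 250MHz stock 2D clock
measured_drop_w = 1.0    # drop measured at the wall

clock_dependent_w = measured_drop_w / (1 - clock_fraction)  # ~1.04W at stock
everything_else_w = idle_total_w - clock_dependent_w        # ~18.96W
print(f"{clock_dependent_w:.2f} W clock-dependent, {everything_else_w:.2f} W other")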

btw, 3D power savings are nice, but they are only linear with clockspeed. I can't seem to alter the voltage at all.

Mr Evil
Posts: 566
Joined: Fri Jan 20, 2006 10:12 am
Location: UK
Contact:

Post by Mr Evil » Tue May 08, 2007 6:32 pm

Brian wrote:...I recorded a 1W drop when GPU clock (and thus GPU power consumption) was dropped to 4% of its original value. Thus, components on the card other than the GPU use 19W at idle...
When a chip is idle, most of the transistors are not switching and thus the clock speed has no effect. However, there are two sources of power consumption in transistors: switching losses and leakage. Leakage is not dependent on either clock speed or whether or not the transistors are switching, and so power consumption is not proportional to clock speed at low loads and will never approach zero.

The only way to reduce leakage is to reduce voltage. Do that, and you should be able to save a bit more of that 19W.
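
A crude way to picture it (the constants below are made-up placeholders, not measurements of any real card):

Code:

# First-order model: total power = switching + leakage.
def gpu_power(v, f_hz, alpha_c=2e-9, i_leak_a=5.0):
    switching_w = alpha_c * v ** 2 * f_hz  # scales with clock and V^2
    leakage_w = v * i_leak_a               # independent of clock; the leakage
                                           # current itself also falls as V drops
    return switching_w + leakage_w

# Underclocking 250MHz -> 10MHz barely moves the idle total,
# while lowering the voltage cuts both terms at once.
print(gpu_power(1.4, 250e6), gpu_power(1.4, 10e6), gpu_power(1.2, 250e6))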

jhhoffma
Posts: 2131
Joined: Mon Apr 25, 2005 10:00 am
Location: Grand Rapids, MI

Post by jhhoffma » Wed May 09, 2007 6:25 am

If you have a VGA connection, the RAMDAC is constantly running to refresh the screen.

Also, just because there's nothing moving on screen doesn't mean the transistors aren't doing any work... displaying a static screen still means pushing as many pixels as a moving one.

kike_1974
Posts: 171
Joined: Sun Sep 24, 2006 10:34 am
Location: Spain

Post by kike_1974 » Wed May 09, 2007 1:04 pm

It's more than the RAMDAC refreshing the screen. If it were only that, a GeForce 7800 GT would draw about the same power at idle as a 7300 LE, and I'm sure that's not the case.

I think it's more related to current leakage. GPU voltages are fairly high (~1.4V). Like Mr Evil pointed out, I think reducing the voltage would bring a significant decrease in power draw.

The problem is that, on most cards, the voltage is almost impossible to change without a hardware voltmod. I hope nVidia and ATI will allow software voltage control on graphics cards in the near future, similar to what CPUs already have (C&Q, SpeedStep, RMClock, CrystalCPUID...).

Brian
Posts: 177
Joined: Wed Jan 18, 2006 3:41 pm
Location: Buffalo, NY

Post by Brian » Wed May 09, 2007 3:04 pm

It seems the only logical thing to do would be to drop the GPU clockspeed to the threshold of acceptable 2D performance - which I find to be 100MHz - then drop the voltage to the threshold of stability. I reckon most cards take this approach right out of the box. However, nVidia appear to think 250MHz is the correct clockspeed for 2D, and they're being understandably conservative when it comes to voltage at 250MHz.

Hopefully the current generation of GPUs, which draw as much power as a Netburst, will stimulate some innovation regarding power management on desktop graphics cards.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Thu Aug 16, 2007 2:56 pm

Sorry for the bump but I too have been experimenting in this direction. I can confirm that dropping an 8800 Ultra down to 100MHz does not reduce idle temps at all. I am unable to measure power draw.

There are two things I am working on: memory speed and voltage. NiBiTor seems to be able to adjust both for 2D/3D modes. I tried it and flashed my card (rubber pants time) but the system would not boot properly. It got as far as the Windows desktop and froze. I flashed back to the original BIOS and everything was fine again.

The 8800 Ultra BIOS allows free setting of core/memory speeds, and voltages from the stock 1.35V down to 1.1V.

Has anyone had any success modifying their card downwards with NiBiTor? If the voltage mod can be done, it could be a significant power/heat saving.

WR304
Posts: 412
Joined: Tue Sep 20, 2005 1:21 pm
Location: UK

Post by WR304 » Thu Aug 16, 2007 3:25 pm

What speeds did you use for the BIOS flash? On my 8800GTS 640MB you can drop the core to 100MHz through the BIOS, but the shader stays at 300MHz and the memory at 400MHz.

You can't drop all three settings to 100MHz. :(

Dropping the shader clocks too low caused bad lockups and instability with my card. :(

When it's running in 2D mode the temperature of my card is always at least 4-5C lower than when it's idling at its 3D clocks (core 648MHz / shader 1620MHz / memory 1000MHz).

[Screenshot: 8800GTS 640MB 2D idle GPU temperature recording in RivaTuner]

If you didn't get it to boot into Windows, are you sure it's running in 2D mode? :(

Did it show as running at those speeds in RivaTuner? If you're using Windows Vista Aero or have SpeedFan running, the card would still be at its 3D clocks.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Fri Aug 17, 2007 2:33 am

Hi WR304.

I tried 200/1200/500 for core/shader/memory, but have tried other values.

I run XP Pro with the classic interface so it should stay firmly in 2D mode.

What I find strange is that the BIOS allows volt and memory speed modding, but the nVidia overclocking control panel and ATITool do not. Neither allows undervolting, and both tie the 2D and 3D memory clocks together.

Unfortunately the screen freezes (I can move the mouse pointer but not click anything) as soon as the desktop appears after logging in, so I could not run RivaTuner. I will experiment some more today.

WR304
Posts: 412
Joined: Tue Sep 20, 2005 1:21 pm
Location: UK

Post by WR304 » Fri Aug 17, 2007 4:31 am

Are you using the original BIOS? Does it show a green integrity check in NiBiTor?

Can you underclock the card below stock speeds at all in software through Rivatuner?

If you can show underclocking is possible for your card then it's just a case of finding where the limit is. :)

The core and shader clocks are linked together. If there's too big a difference between them in the BIOS settings, it could be causing the card to come up with internal errors?

http://www.madshrimps.be/?action=gethow ... howtoID=72

[Image: GF8800 clock/shader link table, taken from the Madshrimps.be overclocking guide]

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Fri Aug 17, 2007 8:54 am

WR304 wrote:Are you using the original BIOS? Does it have a green integrity in Nibitor?
Yes and yes.
WR304 wrote:Can you underclock the card below stock speeds at all in software through Rivatuner?
Yes, I can. I found something interesting: altering core speed made almost no difference to temps, maybe 0.5-1C at 150MHz vs the stock 612MHz. However, lowering the memory clock from 1080MHz to 500MHz dropped 8C off the idle temps!

Performance at 500MHz mem and 200MHz core in Windows is indistinguishable from stock speeds. One thing I didn't try was HD video though.
WR304 wrote:The core and shader clocks are linked together. If you have too big a difference between them in the BIOS settings it could be causing the card to come up with internal errors?
Yes, perhaps. I did try lowering just the memory clock as well, but that didn't work either. I also tried the BIOS start-up defaults of 200MHz core, 1200MHz shader and 500MHz memory and had no joy. Clearly those settings work in RivaTuner/ATITool and when the system boots, but not with NiBiTor.

I am currently talking to XFX about RMAing the card anyway, because it idles at about 67C and hits 95C under load in a P182 with a 120mm fan directly feeding it cool air. A 7800GT only idles at 40C in the same machine.

WR304
Posts: 412
Joined: Tue Sep 20, 2005 1:21 pm
Location: UK

Post by WR304 » Fri Aug 17, 2007 10:46 am

For DVD playback the card should return to its 3D clocks. :)

Have you tried comparing your BIOS revision with the ones found here:

http://www.mvktech.net/component/option ... 0/page,13/

I'm not sure I'd try actually flashing the card with a different BIOS but there could be some different versions of the XFX 8800 Ultra BIOS.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Fri Aug 17, 2007 1:56 pm

Thanks, will take a look at those.

One thing I have noticed is that the default BIOS has only one performance profile. Sure enough, the default clocks for both 2D and 3D are the same. In order to set up special 2D settings, I have to enable the other modes, which could be screwing the BIOS image up.

It seems strange that no tools can adjust these things in real-time in Windows.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Sat Aug 18, 2007 2:08 am

Okay, finally had some luck with modified BIOSes.

I have been unable to adjust 2D clock speeds via the BIOS. Any setting other than stock seems to just freeze when it hits the desktop. The exact same settings work fine in ATITool/RivaTuner. Same core, same shader, same mem.

What I did manage to do is lower the voltage on the card. Stock is 1.35V, I dropped the BIOS to 1.2V and it seems stable. Of course, there is no easy way to tell if it is actually running at 1.2V because no software can measure it, but temps did drop around 2C.
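
If the 1.2V setting really is being applied, simple V² scaling of switching power suggests roughly a 20% cut in that part of the draw (leakage should fall even more), so a couple of degrees sounds plausible:

Code:

# Quick estimate only: switching power scales roughly with V^2.
v_stock, v_mod = 1.35, 1.20
print(f"switching power ratio: {(v_mod / v_stock) ** 2:.2f}")  # ~0.79, i.e. ~21% less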

The main heat factor seems to be memory. As I mentioned earlier, dropping it to 500MHz gave an 8C drop in temps. However, it seems to be impossible to have separate memory clocks for 2D and 3D modes. Altering one always alters the other as well, and the driver/RivaTuner does not switch between speeds when changing modes.

beefy6969
Posts: 11
Joined: Sat Aug 11, 2007 7:58 pm

Post by beefy6969 » Sun Aug 19, 2007 11:11 am

I did not get a response on the other thread, so once again...

Ok, I have an 8800GTS 640MB. I'm using ATITool. I want to underclock it.

I can underclock the core fine, but why can't I underclock the memory? The memory clock is the same in both 2D and 3D when I move the slider.

This is what I have so far..
2D = 300/950
3D Low = 400/950
3D High = 600/950

Can anyone confirm that reducing the core to less than 200MHz will save me a few watts, and is it worth it? I'm running Vista with the Aero theme.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Sun Aug 19, 2007 12:57 pm

Reducing the core will not save much. In my tests it dropped the temperatures less than 1C, too small to accurately measure in fact. It looks like the GPU does a good job of idling anyway.

The only major saving to be had is by underclocking the memory, but as you have found this is problematic. Adjusting memory speed affects both 2D and 3D modes for some reason. It makes no sense - you can switch memory speed on the fly, so why not allow the driver to ramp up memory speed as it does with core speed when changing to 3D mode?

WR304
Posts: 412
Joined: Tue Sep 20, 2005 1:21 pm
Location: UK

Post by WR304 » Sun Aug 19, 2007 1:54 pm

beefy6969 wrote:I did not get a response on the other thread, so once again...

Ok, I have an 8800GTS 640MB. I'm using ATITool. I want to underclock it.

I can underclock the core fine, but why can't I underclock the memory? The memory clock is the same in both 2D and 3D when I move the slider.
See my reply here:

viewtopic.php?t=40686&start=120

It should work if you set up a pair of profiles in ATITool.

Flandry
Posts: 84
Joined: Wed Sep 21, 2005 8:59 pm
Location: IHTFP, MA

Post by Flandry » Mon Sep 10, 2007 11:46 am

I wish I had found this thread earlier - I was hoping to put an 8600 GTS in my HTPC and underclock and undervolt it when not gaming. Most of the replies I've had about the feasibility of undervolting indicated that the BIOS VID setting does nothing. Do you have any more to report, MoJo?

I'm almost considering trying a Volt mod on the card. :shock:

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Mon Sep 10, 2007 11:23 pm

I tried modding the BIOS to do the volt mod but it did not seem to make any difference. I stopped short of using a multimeter to check, but certainly the temperatures didn't drop off at all.

With an 8800 Ultra you can get maybe 8C off if you drop the core and memory speeds as far as they will go. I don't know about the 8600.

kagsp
Posts: 2
Joined: Mon Sep 24, 2007 2:40 am

Post by kagsp » Mon Sep 24, 2007 3:18 am

Another personally motivated HTPC concern brings me to the SPCR forum yet again! I just received a VR-series Biostar 8600GT. This board has a special voltage regulator and an app called V-Ranger. Preliminary tests indicate it does undervolt. I'll return with more data, like voltage ranges and system power measurements.
