GF 6600GT AGP - My cooling adventures

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

Ian Brumby
Posts: 12
Joined: Wed Jan 05, 2005 10:12 pm
Location: Canberra, Australia

GF 6600GT AGP - My cooling adventures

Post by Ian Brumby » Wed Jan 05, 2005 11:11 pm

I bought an Albatron GeForce 6600GT AGP version to replace a Radeon 9500. I'd seen the benchmarks on Tom's Hardware and the 6600GT certainly looked like the best card in my price range. However, the review failed to mention how noisy these cards are! I read the review on Anandtech after buying the card. 52 dB! Ouch!

Picture of the Albatron card (NB: non-reference cooling)
http://www.albatron.com.tw/english/it/v ... pro_id=127

So I started to look for a quiet cooling solution, which is when I found this great site. I saw a post containing an e-mail from Zalman saying the ZM80D-HP fitted the 6600GT AGP version, so I ordered one (along with the Zalman fan).

Since no-one has posted their actual results from trying this, I thought I'd share my experience. I'd like to post a photo of it, but I don't currently have a digital camera.

Installation was reasonably easy. However, because the mounting holes on the AGP version of the card are rotated, the heatsink bases and heatsinks end up rotated about 30 degrees. This means the heatsinks stick out past the back and the side of the case, so I can't close it. But even with the case open my system was running a lot quieter.

It would be good if I could get longer plate springs for the heatsink bases so that they don't have to be mounted at 30 degrees. This would be all Zalman would need to do to produce a good cooling solution for the 6600GT AGP.

The Zalman fan isn't optimally positioned due to the rotation; however, the lower RPM setting (1400 RPM - 20 dB) was sufficient to keep the heatsinks cool.

Initially I ran the system for only a few minutes before powering it down and testing the temperatures of the components. The heatsinks and heatpipes were cool. The memory was cool. The AGP bridge chipset was VERY hot. Albatron don't use the reference cooling solution. The reference cooling solution, based on the pictures I've seen, just has a big heatsink over the AGP bridge - but with no fan. The Albatron card has a much smaller heatsink, though it probably gets some cooling from the GPU's fan.

I didn't like how quickly the AGP bridge heated up, so I got a Zalman Northbridge heatsink and mounted that on top of the AGP bridge chip. A quick test revealed better performance, but it was still quite hot. A heatsink that big does block the two PCI slots under the AGP slot, but I don't mind. I then added a SilenX 120mm case fan (1600 RPM - 14 dB) to the system, positioned to cool the VGA card. This made a big difference to the heat level of the AGP bridge.

Now that I was happy that everything was keeping cool, I ran the system in a non-stressful manner for 3 days straight. No problems. After powering down the system and checking the temperatures, I found all heatsinks to be cold (not even warm).

Okay! Now some stress tests. I asked the video card to detect optimal frequencies. It detected 590 MHz/1080 MHz. Not bad! However, after a full day at this speed it started to display slight glitches in 3D-intensive games. Clocking it back a bit fixed that.

Now I want to be able to close up my case. I'm wondering if using a Northbridge heatsink for the GPU (combined with the 120mm fan) would be enough. It seems to me that the Zalman ZM80D-HP is great if your mounting holes aren't rotated and you don't need to use a fan, but it seems like overkill with a fan, since the heatsinks aren't even getting warm. Any opinions on this?

Also, I just used the thermal grease that came with the ZM80D-HP. What are people's opinions on this vs. Arctic Silver?

Ian.

meglamaniac
Posts: 380
Joined: Thu Jul 15, 2004 12:44 pm
Location: UK

Post by meglamaniac » Thu Jan 06, 2005 1:33 am

Looks like it's over to me to do the

Welcome to SPCR!

thing then :)


Let's see now.
First of all a big thanks for posting your results - a fair few people (myself included) had wondered whether the Zalman would be usable on the AGP version of the 6600GT, and now we have some info!
In the end I am going for a PCI Express system, so I can use a PCIe 6600GT with standard mounting holes, but I'm sure this will be of use to others.

That's an interesting note about the springs - maybe Zalman will get around to providing an updated version E of the cooler with such modifications... you never know.

I would be cautious about using just a NB cooler on a card like that. While it should be OK on the desktop, during 3D games you may find it overheating (again, you'll be able to tell by glitches/artifacts and can quit out before you do permanent damage).
Have you considered the Zalman VF700? I googled a bit and found a post here confirming it fits the 6600GT - it could be a reasonable solution for your case.

Arctic Silver is superior to standard silicone-based grease, but it's best applied to CPUs, which generate many more watts of heat. The 6600GT is quite a cool card for its power, mostly because it's manufactured at 110nm. I doubt you'd see much benefit from using A.S. on the card, although others may disagree with me!

SometimesWarrior
Patron of SPCR
Posts: 700
Joined: Thu Mar 13, 2003 2:38 pm
Location: California, US
Contact:

Post by SometimesWarrior » Thu Jan 06, 2005 2:06 am

Thanks for the write-up, Ian! I'm buying a 6600GT-AGP too, and I'll have to face the same dilemma about how to silently cool it.

What about using thermal adhesive to attach the Zalman's GPU block to the card? That way, you could orient the block however you want, regardless of screw holes. You would probably want to find another way to support the weight of the heatsink, since that's a big slab of aluminum tugging on the GPU (the Zalman website says the total weight is 350g).

The weight on its own might not be a problem for thermal adhesive; it's the big, easy-to-accidentally-hit heatsink plates that pose the biggest hazard. I've had a 300g copper heatsink glued to my graphics card for a year and a half, and it only came off once when I gave it a sharp twist. It's survived numerous car trips, too.

I didn't even use adhesive on the center of the GPU: just a tiny bit of superglue around the sides of the GPU shim, with a glob of Arctic Silver in the center for better heat dissipation. So, if you use the whole GPU area for attaching the Zalman to the video card, and you don't mishandle the card, then the block should stay attached forever (for better or for worse! ;)). Honestly, you can probably remove the heatsink after such an installation, but it requires nerves of steel, a sharp cutting tool, and/or freezing your video card to embrittle the adhesive. Many old posts on this forum have described the procedure, since older video cards always had glued-on heatsinks. It may sound scary, but video cards are surprisingly resilient (speaking from personal experience), and you will build a strong relationship with your card as a result. :lol:

Speaking of Arctic Silver, I used to think the stuff was a big deal, but then I did a little test (inadvertently... I didn't have my normal heatsink-install kit of Arctic Silver, 99% isopropyl alcohol, and razor blade). I took a heatsink with a really nice Arctic Silver 3 install, removed the heatsink, wiped away the residue with my thumb and tissue paper, then re-attached the heatsink using a glob of the generic white goop. The CPU temperature only went up a few degrees, even though I broke just about every rule there is for heatsink installation. :) So, my opinion is that AS won't make much difference for your video card.

Kozure
Posts: 49
Joined: Wed Jan 05, 2005 7:19 am
Location: Toronto, ON, Canada

Post by Kozure » Thu Jan 06, 2005 7:47 am

From a quick glance at the Zalman site, I couldn't quite figure out what the difference is between the various ZM80-HP models (A, C, D) other than colour, but this article has a good breakdown:

http://www.rojakpot.com/default.aspx?lo ... 123&var2=0

Out of curiosity, it looks like the RAM heatsinks are glued onto the RAM. Is that the case, or is thermal tape used?

nici
Posts: 3011
Joined: Thu Dec 16, 2004 8:49 am
Location: Suomi Finland Perkele

Post by nici » Thu Jan 06, 2005 7:56 am

The memory heatsinks are installed with thermal tape.

SometimesWarrior
Patron of SPCR
Posts: 700
Joined: Thu Mar 13, 2003 2:38 pm
Location: California, US
Contact:

Post by SometimesWarrior » Tue Jan 11, 2005 1:07 pm

Hey Ian, just in case you missed it:

Field Report: MSI NX6600GT TD128 w/Zalman ZM-80D-HP

This guy used the rectangular base block rather than the square one, which made it possible for him to mount the heatsink on straight. I wonder if the mounting holes are in the same place for the Albatron and the MSI TD128?

Kozure
Posts: 49
Joined: Wed Jan 05, 2005 7:19 am
Location: Toronto, ON, Canada

Post by Kozure » Tue Jan 11, 2005 1:41 pm

SometimesWarrior wrote:Hey Ian, just in case you missed it:

Field Report: MSI NX6600GT TD128 w/Zalman ZM-80D-HP

This guy used the rectangular base block rather than the square one, which made it possible for him to mount the heatsink on straight. I wonder if the mounting holes are in the same place for the Albatron and the MSI TD128?
Oh, I should clarify - the GPU on my card (the NX6600GT TD128, not the NX6600GT VTD128 with video in/video out) is orthogonal to the card - that is, it isn't on a 45 to all the other components. I used the rectangular heatsink base instead of the square one because the retention arms didn't reach quite far enough to the mounting holes for a stable fit.

I think (think) that it might be possible to use either of the two heatsink bases because the retention arms can pivot in their slots. HOWEVER, GPUs mounted on an angle may present a problem. Mine was parallel to the card sides - many of the 6600 chipset cards are apparently on 45s.

The full manual from MSI here: http://us-download.msi.com.tw/support/m ... s_v3.0.zip shows the GPU and RAM layouts of all the various MSI 6600 and 6800 series cards. Some are orthogonal, some are skewed at 45s. The NX6600VT2D256/128s and the NX6600s (non-GT version) are examples of this 45 degree angle skew mount.

The heatsink base of the ZM-80D-HP must be mounted parallel to the card slot, but I don't see why the base couldn't sit parallel on a skewed GPU, as long as the plate is in contact with the core.

Best to check with Zalman directly - all I can confirm from my own experience is that the ZM-80D-HP will work with a MSI NX6600GT TD128.

Ian Brumby
Posts: 12
Joined: Wed Jan 05, 2005 10:12 pm
Location: Canberra, Australia

Post by Ian Brumby » Tue Jan 11, 2005 5:27 pm

@#!#! I should have got the MSI card. I thought all 6600 AGPs were mounted at 45 degrees.

I'm using the rectangular base, and the arms just don't reach when you mount the base parallel.

I'm currently just leaving the case open. I'm thinking of putting the factory heatsink (120g aluminium) back on and attaching a case fan to it, like you did.

Ian.

luminous
Patron of SPCR
Posts: 717
Joined: Sat Oct 04, 2003 6:31 am
Location: UK

Post by luminous » Wed Jan 12, 2005 2:48 am

I may get shot for saying this, so bear with me.....

I noticed that some of you were thinking about getting a 6600GT AGP card. Where I live, the cost of the card plus some form of other cooling takes it to the price of the passive Gigabyte 6800 (discussed in other threads in this forum). If you get the Gigabyte you should get the same performance, but with a guaranteed passive cooler.

Anyway, welcome to SPCR!! I am very glad that you have managed to solve the 52dBA problem (my god that is loud).

meglamaniac
Posts: 380
Joined: Thu Jul 15, 2004 12:44 pm
Location: UK

Post by meglamaniac » Wed Jan 12, 2005 7:04 am

This is why it's not worth getting a 6800 - it's barely any faster. Most of the results are ties, and those that are wins are not by a lot.
If you scroll back a few pages through there you'll see the 6600GT coming out in front over several tests as well.

On average:
A 6600GT in the UK is £150
A Zalman ZM80D-HP is £30
A Zalman ZM-OP1 optional fan is £7
A 6800 is £220

So even with the 6600GT + Zalman AND the optional fan (not strictly needed for the 6600GT, but we'll assume you're playing safe) that comes to £187.
So a 6800 costs another £33 ($62 US) for no performance gain whatsoever, on average.
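
For reference, here's a quick sketch of that sum (just Python with the UK prices listed above plugged in; the variable names are purely illustrative):

[code]
# Totalling the quiet 6600GT option vs. a 6800 (prices in GBP, as quoted above).
card_6600gt = 150
zalman_zm80d_hp = 30
zalman_zm_op1_fan = 7
card_6800 = 220

quiet_6600gt_total = card_6600gt + zalman_zm80d_hp + zalman_zm_op1_fan  # 187
difference = card_6800 - quiet_6600gt_total                             # 33

print("Quiet 6600GT setup: %d GBP, 6800: %d GBP, difference: %d GBP"
      % (quiet_6600gt_total, card_6800, difference))
[/code]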

So yeah, you got shot.
But in a friendly way, and only because I've researched this A LOT for my upcoming system :)

luminous
Patron of SPCR
Posts: 717
Joined: Sat Oct 04, 2003 6:31 am
Location: UK

Post by luminous » Wed Jan 12, 2005 7:32 am

I paid £205 for mine at overclock.co.uk. I've just looked there again and their prices have risen to £215.

So yes, it is a little more expensive. Yes, there is very little, if any performance gain. But it is an out of the box solution that is totally passive and warrantied.

It is an alternative worth considering for those who were not aware of its existence. You can save a little by getting a 6600GT, but you may need a fan, which will mean it's not as quiet. Anyway, just thought you should know of the alternatives.

JazzJackRabbit
Posts: 1386
Joined: Fri Jun 18, 2004 6:53 pm

Post by JazzJackRabbit » Wed Jan 12, 2005 11:11 am

meglamaniac wrote:This is why it's not worth getting a 6800 - it's barely any faster. Most of the results are ties, and those that are wins are not by a lot.
If you scroll back a few pages through there you'll see the 6600GT coming out in front over several tests as well.

On average:
A 6600GT in the UK is £150
A Zalman ZM80D-HP is £30
A Zalman ZM-OP1 optional fan is £7
A 6800 is £220

So even with the 6600GT + Zalman AND the optional fan (not strictly needed for the 6600GT, but we'll assume you're playing safe) that comes to £187.
So a 6800 costs another £33 ($62 US) for no performance gain whatsoever, on average.

So yeah, you got shot.
But in a friendly way, and only because I've researched this A LOT for my upcoming system :)
I have no intention of starting a flame war, but you should have done your research better. Quick specs on the GPUs: the 6600GT has 8x3 pipelines, while the 6800 has 12x5 and most of the time unlocks to 12x6 without any problems - that's a 50% increase over the 6600GT. For memory, the 6600GT has a 128-bit interface clocked at 1000MHz, while the 6800 has a 256-bit interface clocked at 700MHz, which is 40% more bandwidth than the 6600GT. Even without looking at the actual tests it should be clear that there is just no way the 6600GT is as fast as the 6800. As for the actual tests, Anandtech is a very poor site to look to for reviews; as any hardware website grows in size, the quality of its articles goes down, be it Anandtech, Tom's Hardware or any other big, banner-filled website. If you want to look for reviews, look here - http://www.digit-life.com/articles2/dig ... x0411.html . Especially at high resolutions like 1600x1200, where the video card becomes the bottleneck, you'll see that, for example, in FarCry the 6800 is 30% faster than the 6600GT. The reason you don't see much difference right now is that the CPU is the bottleneck, especially at lower resolutions, but if you set both cards to, say, 1280x1024 with 4xAA and 8xAF, the 6800 will win by a fair margin.
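
If anyone wants to check the 40% figure, here's a rough back-of-the-envelope Python sketch (assuming the bus widths and effective memory clocks quoted above; actual cards vary by vendor):

[code]
# Peak memory bandwidth = (bus width in bits / 8) bytes per transfer * effective clock.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return bus_width_bits / 8.0 * effective_clock_mhz * 1e6 / 1e9

gf6600gt = bandwidth_gb_s(128, 1000)  # ~16.0 GB/s
gf6800 = bandwidth_gb_s(256, 700)     # ~22.4 GB/s

print("6600GT: %.1f GB/s, 6800: %.1f GB/s, advantage: %.0f%%"
      % (gf6600gt, gf6800, (gf6800 / gf6600gt - 1) * 100))  # ~40%
[/code]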

The only question is whether saving 30 pounds is worth 30% less performance in games, modding the card with a heatsink (thus voiding the warranty), and possibly having to install an additional fan, which will only add to the overall noise. IMHO (if £220 is for the Gigabyte card), those 30 pounds are very well worth the performance increase, warranty and time saved by not installing an after-market heatsink (even assuming it wouldn't need a fan).

meglamaniac
Posts: 380
Joined: Thu Jul 15, 2004 12:44 pm
Location: UK

Post by meglamaniac » Wed Jan 12, 2005 12:54 pm

I remain unconvinced; the majority of reviews and benchmarks I can find show the cards tied.
As to the warranty issue, if you enable the locked pipes you also void your warranty - as you do if you overclock. Warranties only cover unmodified cards, and that covers any hardware, firmware and software modifications.

In my experience, anyone rich enough to be running a monitor requiring 1600x1200 has more than enough money to buy a 6800GT or Ultra. That'd be a 21" monitor or a 20"/21" panel. While some 19" monitors will support 1600x1200, there's little point, as the dot pitch is such that you actually end up with a worse picture in terms of sharpness and clarity than if you run at a lower res - you'd also get somewhat eyestrained trying to make it out.

In terms of my build I have fairly specific requirements. I need dual DVI-I for my panels, which is available on several 6600GTs but lacking on any 6800s that I can see (certainly not on ones that wouldn't need the cooler modified anyway), and my panels operate at 1280x1024, so higher resolutions are not an issue.

Ian Brumby
Posts: 12
Joined: Wed Jan 05, 2005 10:12 pm
Location: Canberra, Australia

Post by Ian Brumby » Wed Jan 12, 2005 1:54 pm

Over here the prices are:
6600 GT (PCI-E) AU$270
6800 (PCI-E) AU$580

6600 GT (AGP) AU$325
6800 LE (AGP) AU$385
6800 (AGP) AU$450

For PCI Express, the 6600 GT + Zalman is a clear winner.
For AGP, if you can afford the Gigabyte 6800 then it is worth the extra.

For the majority of users who don't care about noise, the 6600 GT is a winner on both platforms. If you want to mention the "possibility of unlocking pipes" with the 6800, you should also include the fact that the 6600 GT has better overclocking potential than the 6800. nVidia could have released a 6600 Ultra, but that would have meant the 6800 losing in a lot of benchmarks to the 6600 Ultra; that leftover headroom means some lucky users have got stable 20% overclocks.

Not trying to start a flame war though. :lol: I agree - for SPCR readers with AGP, go with the 6800!

SometimesWarrior
Patron of SPCR
Posts: 700
Joined: Thu Mar 13, 2003 2:38 pm
Location: California, US
Contact:

Post by SometimesWarrior » Thu Jan 13, 2005 5:50 am

I got near-20% stable overclocks with my MSI 6600GT-AGP out of the box, both for core and memory, which corresponded to a 15% improvement in 3DMark03 scores. Also, since I was able to silence the card for the cost of a spare fan, it was still a good deal for me: US$210 at Newegg, vs. US$280 for the passive Gigabyte.

I thought about buying a Gigabyte anyway. Some benchmarks show ties between 6600GT and 6800, but certain tests, such as those with 4xFSAA, can show the 6800 in the lead by 20% (FarCry especially). 6600GT could probably match that with overclocking, but the 6800 can be OC'd too. Since the Gigabyte passive card wouldn't fit in my system, the choice was made for me, but others may not be so lucky. :)

luminous
Patron of SPCR
Posts: 717
Joined: Sat Oct 04, 2003 6:31 am
Location: UK

Post by luminous » Thu Jan 13, 2005 9:20 am

Yes, the Gigabyte card was bigger than I expected. It turned out not to be a problem, but nonetheless I was surprised.

OC'ing always varies between cards. Some people have managed to take the standard 6800 from 325/700 to 400/800 and unlock all the pixel pipelines and vertex shaders. I've not been so lucky - no unlocking for me, and I only managed 375/770. I'm still playing with my settings, so I may get a little quicker.

JazzJackRabbit
Posts: 1386
Joined: Fri Jun 18, 2004 6:53 pm

Post by JazzJackRabbit » Thu Jan 13, 2005 4:19 pm

meglamaniac wrote:I remain unconvinced; the majority of reviews and benchmarks I can find show the cards tied.
As to the warranty issue, if you enable the locked pipes you also void your warranty - as you do if you overclock. Warranties only cover unmodified cards, and that covers any hardware, firmware and software modifications.

In my experience, anyone rich enough to be running a monitor requiring 1600x1200 has more than enough money to buy a 6800GT or Ultra. That'd be a 21" monitor or a 20"/21" panel. While some 19" monitors will support 1600x1200, there's little point, as the dot pitch is such that you actually end up with a worse picture in terms of sharpness and clarity than if you run at a lower res - you'd also get somewhat eyestrained trying to make it out.

In terms of my build I have fairly specific requirements. I need dual DVI-I for my panels, which is available on several 6600GTs but lacking on any 6800s that I can see (certainly not on ones that wouldn't need the cooler modified anyway), and my panels operate at 1280x1024, so higher resolutions are not an issue.
Like I said, the majority of reviews don't go into enough detail; only a handful of sites run AA and AF benchmarks to push the cards to the limit. And you don't have to run games at 1600x1200 to see the difference: in any memory-intensive mode, like 4xAA, the 6800 will perform much better than the 6600GT simply because it has 40% more memory bandwidth. Of course, if you don't need AA you'll be fine with the 6600GT, but AA and AF drastically improve picture quality, so if the price difference is fairly marginal between a 6600GT plus mods and the Gigabyte 6800, I would take the Gigabyte in a heartbeat.

Anyway, you said you needed dual DVI; in that case a modded 6600GT would probably be the best choice for you. That's fine with me. However, you cannot say the 6600GT is as fast as the 6800 - that's simply untrue, and that's all I wanted to say in my original post.

spacey
Posts: 81
Joined: Mon Aug 11, 2003 10:31 am
Location: Ontario, Canada

Fanmate 2 added to my XFX 6600 GT AGP

Post by spacey » Sun Jan 16, 2005 7:51 am

View my gallery of the mod

Image

Surprisingly, it makes little to no difference in cooling in my Shuttle; idle temps are the same when the fan is turned down as when it's screaming at full speed, probably because of the relatively higher idle temps. But nothing dangerous - I used the nVidia temperature monitor to make sure idle temps are below 50, and I just turn it up to full speed when I game. But I do enjoy a good night's sleep.

Edward Ng
SPCR Reviewer
Posts: 2696
Joined: Thu Dec 11, 2003 9:53 pm
Location: Scarsdale, NY
Contact:

Post by Edward Ng » Sun Jan 16, 2005 8:31 am

OMFG IS THAT VIDEO CARD SITTING RIGHT ON THE CARPET!?@#$%

NOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!@#$%$!@#$%@#$%

luminous
Patron of SPCR
Posts: 717
Joined: Sat Oct 04, 2003 6:31 am
Location: UK

Post by luminous » Sun Jan 16, 2005 9:42 am

Heh heh heh. You will get away with it 99% of the time. But yes, static is a big issue with carpets. You should never put sensitive electronics on them.

JohnMK
Posts: 159
Joined: Sun Aug 25, 2002 7:37 pm
Location: Seattle
Contact:

Post by JohnMK » Sun Jan 16, 2005 10:24 am

Thanks for the pictures spacey. Don't put computer components on carpets in the future. :) Just trying to help . . .

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sun Jan 16, 2005 1:44 pm

Edward Ng wrote:OMFG IS THAT VIDEO CARD SITTING RIGHT ON THE CARPET!?@#$%

NOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!@#$%$!@#$%@#$%
:(

I'm scared now; I'm moving to a house with carpets that are very prone to static. I was worried about that before, but now even more so.

Tyrdium
Posts: 272
Joined: Sat Jun 12, 2004 7:29 am
Location: Boston, MA
Contact:

Post by Tyrdium » Sun Jan 16, 2005 2:01 pm

Meh, just get a static mat. I scavenged mine from some Sun upgrade kit that was being thrown out over at BU (no parts, just the mat, strap, and manual). :P

Ian Brumby
Posts: 12
Joined: Wed Jan 05, 2005 10:12 pm
Location: Canberra, Australia

Post by Ian Brumby » Mon Jan 17, 2005 3:44 am

Update. I've removed the Zalman ZM80D-HP after seeing this:
http://forums.silentpcreview.com/viewtopic.php?t=18201

I've put back the stock heatsink. I used Arctic Silver 5 this time. Since the GPU is generating 50W (same as my CPU!) I thought I might as well use it. I've removed the stock fan from the heatsink.

Instead of bolting a fan onto the heatsink, I've made a custom slot fan. I've taken one of the PCI back plates and attached a 120mm fan onto it. I've then screwed the back plate into a PCI slot under the card. I can now close my case.

This solution so far seems excellent.
It's very quiet.
It's very cheap.
It's very easy.
Cooling performance seems good too.
It blocks a few slots though. :)

My card doesn't have a temperature sensor, but I've attached a probe to the heatsink. 34.5C right now typing this message. 45C under load.

SometimesWarrior
Patron of SPCR
Posts: 700
Joined: Thu Mar 13, 2003 2:38 pm
Location: California, US
Contact:

Post by SometimesWarrior » Mon Jan 17, 2005 6:29 pm

You know, that 50W figure I mentioned in my 6600GT-AGP silencing thread... I'm not so sure I believe it myself. The heatsink just isn't getting hot enough. I mean, the heatsink is getting less airflow than my Athlon XP CPU, which has a huge Thermalright heatsink on it, so shouldn't it be hotter than the CPU heatsink? Maybe the heatsink mounting on my vidcard is really bad, which would make the heatsink colder and the GPU hotter... but I suspect it's the 50W figure that's the problem, not the heatsink.

I wish my card had a temperature monitor so I could investigate some more... maybe I'll play with the heatsink mounting and see if it's possible to get a better contact with the GPU.

Edward Ng
SPCR Reviewer
Posts: 2696
Joined: Thu Dec 11, 2003 9:53 pm
Location: Scarsdale, NY
Contact:

Post by Edward Ng » Mon Jan 17, 2005 7:05 pm

I just said the same thing in a different thread; I don't believe that 50 watt figure whatsoever. A different VGA power draw test found the 12-pipe 6800, made on 130nm, to draw under 41 watts. How can the 6600GT draw a whole 50 watts?

Tyrdium
Posts: 272
Joined: Sat Jun 12, 2004 7:29 am
Location: Boston, MA
Contact:

Post by Tyrdium » Mon Jan 17, 2005 7:08 pm

50 watts is for total power consumption, not heat output, no?

Edward Ng
SPCR Reviewer
Posts: 2696
Joined: Thu Dec 11, 2003 9:53 pm
Location: Scarsdale, NY
Contact:

Post by Edward Ng » Mon Jan 17, 2005 7:11 pm

Under 41.0 watts total power consumption on a 12-pipe, 130nm 6800. How can the 110nm, 8-pipe 6600GT be so leaky that it would consume 10 watts more? That's highly unlikely, particularly considering I'm able to stress my own 6600GT, overclocked (575/1075), as hard as I like with a VM-101 cooling it and have zero problems. I just don't see where all that power would be going if it's not heat.

EDIT: Okay, somebody linked me to that test. That's pretty damn leaky of the 110nm process right there. :?

Ian Brumby
Posts: 12
Joined: Wed Jan 05, 2005 10:12 pm
Location: Canberra, Australia

Post by Ian Brumby » Mon Jan 17, 2005 7:22 pm

X-bit Labs claim 39W for the 6800 and 48W for the 6600 GT. The 39W figure is in line with your measurements, so it's not as though their tests are totally screwed.

Under load my CPU heatsink is 5 degrees cooler than the VGA heatsink. It's obviously a much bigger heatsink though.

Edward Ng
SPCR Reviewer
Posts: 2696
Joined: Thu Dec 11, 2003 9:53 pm
Location: Scarsdale, NY
Contact:

Post by Edward Ng » Mon Jan 17, 2005 7:26 pm

Er, I don't have any measurements on a 6800; I've never even touched or used one. I've got a 6800GT and a 6600GT, and the 6800GT is water cooled.

Wow, that 110nm process is seriously leaky. :(

Post Reply