6800 - powerdraw. How *can* this be cooled?
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
I did some basic maths.
The 6800 Ultra has got ca. 222 million transistors (give or take, I'm going by memory).
A P4 EE has got a good 50 million transistors less than that.
In terms of cooling, it's a little easier, as rather than a 0.13 micron process (such as the P4), the NVIDIA chip is based on 0.15 micron manufacturing process.
Ergo, more surface area, ergo easier to cool.
However, since a Prescott 3.4 generates a good 100W of heat ... what does this say about the GeForce 6800 Ultra? (Not commenting on ATI's baby yet, as there are no reviews up on it.) SURELY, the 6800 must generate more than that?
Having chewed the numbers a little, I am not entirely certain how this thing *CAN* be cooled. Going by the reviews from THG, Anandtech and so on, one gets a good look at their cooler. OK, so it takes up a PCI slot's worth ... but still.
This worries me somewhat, especially considering the lengths one has to go to in order to cool 100W of CPU.
Now, to cool (presumably) that much again on a graphics card ...? Can anything "out now" do this at any reasonable volume? (Preferably without being watercooled - I've got my own reasons for not wanting watercooling).
Just a thought. How *CAN* one cool such a toaster?
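The "more surface area, ergo easier to cool" argument can be roughly quantified. Here's a back-of-envelope sketch; the transistor counts are the rough figures from the post, and the assumption that die area scales with transistor count times the square of the feature size is a crude approximation, not measured die data:

```python
# Back-of-envelope comparison of the 6800 Ultra (NV40) vs. P4 EE die area,
# assuming area ~ transistor_count * (feature_size)^2.
# Transistor counts are the post's rough figures, not official die sizes.

nv40_transistors = 222e6   # ~222 million, 0.15 micron process
p4ee_transistors = 172e6   # "a good 50 million less", 0.13 micron process

nv40_feature = 0.15  # microns
p4ee_feature = 0.13  # microns

# Relative die area (NV40 / P4 EE) under the crude scaling assumption
area_ratio = (nv40_transistors / p4ee_transistors) * (nv40_feature / p4ee_feature) ** 2
print(f"NV40 die area is roughly {area_ratio:.2f}x that of the P4 EE")

# At equal total power, power density (W per unit area) scales inversely
density_ratio = 1 / area_ratio
print(f"...so at equal wattage, its power density is ~{density_ratio:.2f}x")
```

By this crude estimate the GPU die would be roughly 70% larger, so even at a similar total wattage the heat flux per square millimetre is lower - which is what makes it "easier to cool" per watt, even if the total heat is higher.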
BTW, I posted this in a separate thread, as http://forums.silentpcreview.com/viewtopic.php?t=11847 goes on about the power needs.
I'm more concerned with the actual theoretical heat output I'm seeing. I've not yet seen any sites discuss the actual heat generation of the NVIDIA samples, so it's a bit tricky to say.
Also - does anyone have any idea if ATI's is likely to run cooler? I've no transistor count for it yet, but I believe it will have fewer than its competitor, and so should be easier to cool.
While my personal preference lies with NVIDIA, I might well have to go for ATI, in the event that they have less of a fusion reactor going on.
Of course, the "rumours" (?) that they're also going to need 2 Molex connectors don't make that seem particularly likely.
Hmmm ... definitely, I think, one should point a few CEOs from the "big two" GPU manufacturers to SPCR.
Something like "Centrino for GPUs" would be a good idea (I know it's far-fetched, and I know the difficulties in it, but ... you never know).
Full size CPU heatsink + ducted flow throwing the air out of the back of the case? For a 30 C delta-T (i.e. about 50 C exit temp - max you can probably run at stably) you need a fan capable of pushing about 50 CFM when the card is at full power. That means at least a 92mm fan if you're keeping it quiet...
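The airflow arithmetic behind a figure like that starts from the bulk heat balance P = ρ · cp · V̇ · ΔT. A quick sketch (the 120 W card power is an assumed input; note the ideal bulk-airflow minimum comes out far below 50 CFM - the gap is heatsink inefficiency and the margin needed to keep local die temperatures down, not an error in either number):

```python
# Minimum bulk airflow needed to carry away P watts with an air temperature
# rise of dT, from P = rho * cp * V_dot * dT. This is the theoretical floor;
# a real heatsink needs several times this to keep die temperatures sane.

RHO_AIR = 1.2              # kg/m^3, density of air at ~20 C
CP_AIR = 1005.0            # J/(kg*K), specific heat of air
M3S_PER_CFM = 0.000471947  # cubic feet per minute -> m^3/s

def min_airflow_cfm(power_w: float, delta_t: float) -> float:
    """Ideal minimum airflow (CFM) to remove power_w with an air rise of delta_t."""
    v_dot = power_w / (RHO_AIR * CP_AIR * delta_t)  # volumetric flow, m^3/s
    return v_dot / M3S_PER_CFM

# Assumed 120 W card, 30 C exhaust rise as in the post
print(f"{min_airflow_cfm(120, 30):.1f} CFM ideal minimum")
```

The ideal minimum is around 7 CFM; quoting ~50 CFM builds in the fact that only a fraction of the air actually picks up heat from the fins.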
It is supposed to draw about 120 watts, that is huge.
It uses GDDR3, which is supposed to have much lower power consumption than DDR2 or DDR1.
So in fact, its core will be a lot hotter than a 9800XT's or 5950U's.
I'd bet its core uses ~100 watts, that's almost as much as a Prescott or a heavily overclocked Northwood/Barton (my Barton is ~110-120 watts).
I don't think anything below a waterblock will be enough to cool such a beast quietly.
Hell, Leadtek's stock cooler looks almost like an SLK900U....
pdf27
That's my point exactly. Throwing a 100W CPU into your system is "OK", because you've got a fat HSF sitting on it.
Can you show me how I attach a Thermaltake SP-94 to an AGP card, exactly?
apocalypse80
I doubt it's the GDDR3 which sucks the power. Rather, it's the 222 million transistors on the GPU.
Where's the figure of 120 watts from, if I might inquire?
Even so it scares me, in a way, as it's more than my darn 3.4 Prescott/PreHott.
You've seen Leadtek's stock cooler? Where? How? Any pictures?
I've only seen the "NVIDIA Sample" stuff...
Arrrggghhh!1 I saw one of these this morning and let it pass, but two in one day is more than I can take.
shathal wrote:
pdf27
Can you show me how I attach a Thermaltake SP-94 to an AGP card, exactly?
It's a Thermalright SP94.
Thermalright = Well-designed, cutting-edge, form-follows-function heatsinks.
Thermaltake = Crappy, noisy, ugly, riced-up, cheap knock-offs of other products, gussied up with lights and other assorted bling to appeal to a certain segment of the market.
If it wasn't that Thermalright makes such nice stuff and Thermaltake makes such garbage, I wouldn't care.
What I meant by memory consumption was:
It was measured by THG (I think) that the 6800U draws ~20 watts more than the 5950U. However, since GDDR3 (used by the 6800U) draws less power than the DDR1 (used by the 5950U), the core of the 6800U has to draw >20 watts more than the 5950 core (I'd bet ~30 watts).
120 watts total consumption was (if I remember correctly) in NVIDIA's specs.
Therefore I draw the conclusion that the core of the 6800U will draw ~100 watts (but then again, I may be talking out of my b*tt).
Also, I have no idea if 120 watts is max power, or "thermal design power" (which would probably be worse).
And here's the Leadtek, scary if you ask me.
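That core-power inference is just subtraction, but it's worth making explicit. A sketch - the ~20 W card-level delta is the THG figure as remembered in the post, and the ~10 W memory saving is purely a hypothetical number I've put in to illustrate the logic:

```python
# apocalypse80's reasoning: if the whole 6800U card draws ~20 W more than the
# 5950U while its GDDR3 memory draws *less* than the 5950U's DDR1, the core
# must account for more than the 20 W card-level delta.
# Both figures below are guesses/illustrations, not measured values.

card_delta_w = 20      # 6800U card draws ~20 W more than 5950U (THG, per the post)
memory_saving_w = 10   # hypothetical: GDDR3 draws ~10 W less than DDR1

# The core has to cover the card delta AND make up the memory saving
core_delta_w = card_delta_w + memory_saving_w
print(f"6800U core draws at least ~{core_delta_w} W more than the 5950U core")
```

Any plausible memory saving pushes the core delta above 20 W, which is how the post lands on "I'd bet ~30 watts" for the core-to-core difference.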
Well, for a start, get away from the mindset that the weight of the cooler has to be carried by the card. That'll probably require some modifications to the case to support the cooler, but should be possible. More of an issue is the number of PCI slots it would take up, unless you connect it with heatpipes (come to think of it, is a flexible heatpipe technically possible? That would solve the problem neatly).
shathal wrote:
pdf27
That's my point exactly. Throwing a 100W CPU into your system is "OK", because you've got a fat FHS sitting on it.
Can you show me how I attach a Thermalright SP-94 to an AGP card, exactly?
This is why I personally think the future of silent computing is either watercooled or using heatpipes - you've got increasing amounts of heat generated in a very small area, and air as a working fluid is just too limited without a huge heatsink to spread it. This trend will probably get worse, and while improved heat sink design and use of ducting will keep things under control for a while, there are physical limits you will start to hit eventually where the air flow required is just too high to do it silently.
The BTX standard will help somewhat here. Firstly, the cooling will be on the upper side of the card, so the fans will be facing away from the PCI ports. Secondly, the hot air will be blown out of the case, and the air used by the GPU heatsink will have a constant temperature (after passing through the CPU heatsink).
I actually believe that BTX will make things much worse for VGA cooling.
The "cold" air that will be fed to the VGA HS will actually be hot, after going through the HS of a 100+ watt CPU.
Substitute a 150 watt CPU and you have a hairdryer blowing over your VGA...
I don't think that will help...
We have to face it, BTX was only created to keep Preshots cold; Intel doesn't care about anything else that might be inside the case.
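How hot that "hairdryer" air actually gets can be estimated with the same bulk heat balance: the average temperature rise of air passing over a CPU is ΔT = P / (ρ · cp · V̇). A sketch, where the 50 CFM duct flow is my assumed figure; note the *well-mixed average* rise is modest, and the hairdryer effect comes from poorly mixed exhaust jets straight off the fins, which run much hotter than this average:

```python
# Average temperature rise of duct air after absorbing P watts from the CPU:
# dT = P / (rho * cp * V_dot). Assumes all CPU heat goes into the duct air
# and the flow is well mixed; real exhaust jets are locally much hotter.

RHO_AIR = 1.2              # kg/m^3
CP_AIR = 1005.0            # J/(kg*K)
M3S_PER_CFM = 0.000471947  # CFM -> m^3/s

def air_temp_rise(power_w: float, flow_cfm: float) -> float:
    """Average temperature rise (C) of flow_cfm of air absorbing power_w."""
    v_dot = flow_cfm * M3S_PER_CFM  # m^3/s
    return power_w / (RHO_AIR * CP_AIR * v_dot)

# Assumed 50 CFM through the BTX duct
for cpu_w in (100, 150):
    print(f"{cpu_w} W CPU: air leaves ~{air_temp_rise(cpu_w, 50):.1f} C warmer")
```

So even in the best case the GPU heatsink starts several degrees behind, and in practice the air arriving at it is far less uniform than this average suggests.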
Don't quite agree.
It's been my experience that Intel doesn't so much "not care" as it might simply not "be aware" that this is an issue. Ultimately, all of their efforts to get systems to run cooler are based around their own CPUs - true.
They're not an altruistic organisation, after all, but a company that needs/wants to make profit.
However, with the roadmap seemingly heading towards "Centrino on the desktop", things should improve a lot, cooling-wise. I also see a problem in "selling" this to Intel as a problem, to be honest.
"It's up to the graphics manufacturers to make sure they're running cool." is what I'd expect the answer to be. Unless Intel would suddenly (for example) buy up a big chunk of ATI or NVIDIA (merely mentioning the two top dogs, as they're the most obvious heat "culprits") I doubt Intel will really be "into" caring overly about graphics cards.
AMD obviously I'm not mentioning much, because their say in the industry is not that great, compared to Intel's muscle (a.k.a. "mucho dollar").
Disappointing? Maybe.
I certainly hope that SOME sort of "revolution" comes along for cooling graphics cards.
If a 6800 is drawing 120W THIS year, then it's pretty safe to assume that in ca. 1.5-2 years even mid- to low-end cards will draw that much, and ergo it's a much wider issue.
I doubt, however, that much will happen before this hits some sort of "critical mass", knowing the industry from my own experiences.
- Shathal.
shathal -- There is a broad consensus among the PC component makers I've discussed BTX with: they agree with apocalypse80. Putting everything else (esp. the VGA) downstream of the CPU-heated airflow is not exactly a systems approach. It's downright dumb IMO.
apocalypse80 wrote:
I actually believe that BTX will make things much worse for VGA cooling.
The "cold" air that will be fed to the VGA HS will actually be hot, after going through the HS of a 100+ watt CPU.
Substitute a 150 watt CPU and you have a hairdryer blowing over your VGA...
I don't think that will help...
We have to face it, BTX was only created to keep Preshots cold; Intel doesn't care about anything else that might be inside the case.
As for your comment...
They're not an altruistic organisation, after all, but a company that needs/wants to make profit.
No kidding!!!
Yep - I fully agree.
MikeC wrote:
shathal -- There is a broad consensus among the PC component makers I've discussed BTX with: they agree with apocalypse80. Putting everything else (esp. the VGA) downstream of the CPU-heated airflow is not exactly a systems approach. It's downright dumb IMO.
Point hairdryer at graphics card. Real clever that. Or not.
Makes me somewhat dubious of the expected lifetime of BTX, assuming that in 2 years' time even mid/low-end graphics cards will be putting out 100+ W of heat.