The new Nvidia GTX 280 : 240W TDP. Are silencers doomed?
Hi,
I've been checking the latest news on the upcoming Nvidia GTX 280 (formerly called the 9900 GTX), which is supposed to be out on June 18th, just two days after the release of the AMD/ATI 4870 series.
http://www.tweaktown.com/news/9487/nvid ... ped_naked/
All the specs are very impressive. Except for one thing...
The TDP of the card is 240W.
Unofficial sources say that Coolermaster is currently working on the cooler for this... this THING!
Imagine... the 8800 Ultra's TDP was 185W, and that was toasty enough for everybody. Many people on Nvidia boards have complained about overheating problems with the 8800 series. What can we expect from this?
a) A furnace?
b) One component that will create a fire inside our rigs?
I'm kind of disappointed, especially since the 8800 Ultras were built at 90nm.
These new GTX 280s are made at 65nm, and even so, the die is the biggest I've ever seen: about 576 mm².
At least for the design in the pictures the cooler is expected to exhaust the air outside the case.
But my guess is that with that huge die size, Thermalright, Arctic and Zalman will have a new challenge. With a TDP of 240W, I think it is very unlikely that we'll get an aftermarket cooling solution without exhaust.
I'd say that with a non-exhaust solution, if we ever put this... ABOMINATION inside our rigs, our other components will eventually overheat and die.
Also, forget about silence, I think we will see Delta fans all around this thing.
LOL!
I'm starting to hate NVIDIA. With all the global warming we are facing, and trying to keep up with energy costs, now I can see a bunch of spoiled brats putting three of these things in SLI and burning their rigs with 2000W power supplies... LMAO!
At least AMD is winning on energy conservation. I sincerely hope they grab the crown this time. I'm no fan of either ATI or NVIDIA, but I'm just waiting for the day I can get rid of my Ultra and get a truly evolved solution.
Opinions? I would like to see the consensus.
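On the energy-cost angle: it's easy to put a rough dollar figure on a 240W card versus a 157W one. A minimal sketch, assuming three hours of full-load gaming a day and $0.12/kWh (both figures are my own illustrative assumptions, not from any review):

```python
# Back-of-the-envelope yearly electricity cost for a card's load power.
# The hours-per-day and $/kWh figures are purely illustrative assumptions.

def annual_cost(load_watts, hours_per_day=3, dollars_per_kwh=0.12):
    kwh_per_year = load_watts / 1000 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

print(f"240W card: ${annual_cost(240):.2f}/year")  # rumoured GTX 280 TDP
print(f"157W card: ${annual_cost(157):.2f}/year")  # reported HD 4870 TDP
```

Under those assumptions the gap is around ten dollars a year per card, so the real pain of the TDP is heat and noise, not the power bill.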
To be fair, even the stock 8800 Ultra cooler is very quiet, and it only ramps up under serious load (most games don't touch it).
To be honest I'm not that bothered about the noise level when maxed out (within reason), because that condition only occurs when playing high-end games which make plenty of noise anyway.
The important part is how loud the cooler is on the desktop, when I care about noise. In that area, the current 8800 cards are actually quite good, even if they do idle at about 60-65C. ATI are definitely way ahead in that area, but as I have not heard the 3870 cooler I can't make a comparison.
Yeah, but if you purchase a new Nvidia motherboard with your expensive graphics card, it will be load = 240W, idle = 0W, since the new Nvidia boards can turn off the discrete card and use the onboard graphics.
But if you have an Intel chipset, then you're out of luck.
The next-gen AMD/ATI cards released in June are supposed to be ~110W and ~140W at load and ~10W at idle on any motherboard.
I wouldn't worry about the high-end monsters that serve one purpose: Generating buzz.
What I look forward to, is the new production process. The last change brought us the 9600GT, which is twice as fast as the previous readily available NVidia GPU with easy passive cooling (8600GTS), while having almost the same power draw. Even without large architecture changes, the smaller production process should provide another contender in this area.
Modo wrote:I wouldn't worry about the high-end monsters that serve one purpose: Generating buzz.
What I look forward to is the new production process. The last change brought us the 9600GT, which is twice as fast as the previous readily available NVidia GPU with easy passive cooling (8600GTS), while having almost the same power draw. Even without large architecture changes, the smaller production process should provide another contender in this area.
Yep! I'm looking for the GTX 2xx equivalent of the 9600GT @ 55nm.
If power draw is high, I'll stick with my 8800GTS/HR-03 Plus and wait another refresh.
I just want to play Crysis eventually, and Stalker: Clear Sky at full res.
I'd say it's overplaying things a tad to suggest that the GTX 280 spells doom for silencers. For those who want silence and a GTX 280, well, they've got a challenge on their hands. For the rest of us who want silence but don't want a GTX 280, it doesn't make much of a difference, does it?
It's quite entertaining that AMD/ATI somehow come out of this round smelling of roses because their graphics card only pulls 140W.
I have my doubts that the cut-down versions of this card will be any more sensible. I mean, you could disable half of it and you'd still be in power loony-land. As for shrinks of this, they'll be a way off yet.
The upcoming die-shrunk G92b (9900GTX I think it's called) looks like it might be quite interesting, it has the potential to be a new sweet-spot and it should support HybridPower too.
nutball wrote:It's quite entertaining that AMD/ATI somehow come out of this round smelling of roses because their graphics card only pulls 140W.
I honestly don't see a problem with this. As long as idle/2D power is very low and load power is reasonable (so a 400W PSU will cope with it), heat/noise production in gaming mode isn't a big issue for me. Games are noisy anyway.
Well, maybe if you do a lot of 3D modelling or CAD or something it might be an issue I guess. Even then, I bet that doesn't heat the card up that much.
It seems like an ideal solution to me. Power when you need it, but otherwise low heat/power/noise.
I wasn't sure between this and the 4800 series from ATI. Now it's no contest; I just hope that ATI uses the same mounting holes as my S1. They probably won't, just to annoy people (really, why is there any need for more than one type of mounting across all cards??), and we'll have to wait for a good passive cooler.
FartingBob wrote:I wasn't sure between this and the 4800 series from ATI. Now it's no contest; I just hope that ATI uses the same mounting holes as my S1.
Modo wrote:What makes you think an ATI offering of similar capabilities will consume (much) less power?
This.
114W TDP for the 4850 and 157W TDP for the 4870.
Obviously we'll have to see benchmarks before we decide how efficient they are compared to current cards, but if the 240W TDP rumour is true for the 280, that's 83W more than its rival. Even if it outperforms it in benchmarks (I'd say likely), it wouldn't be worth the extra heat IMO. You have no chance of cooling a 240W card passively unless you live in the Arctic circle.
FartingBob wrote:Obviously we'll have to see benchmarks before we decide how efficient they are compared to current cards, [snip]
And to compare them against each other on an FPS-per-watt scale.
FartingBob wrote:Even if it outperforms it in benchmarks (I'd say likely), it wouldn't be worth the extra heat IMO.
Agreed. Hence my comment about a 9600GT replacement.
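That FPS-per-watt comparison is just a ratio; a minimal sketch of the idea (the frame rates and wattages below are placeholders I made up, not benchmark results):

```python
# FPS-per-watt as a simple efficiency metric. The FPS and wattage numbers
# below are invented placeholders for illustration, not benchmark results.

def fps_per_watt(avg_fps, load_watts):
    return avg_fps / load_watts

card_a = fps_per_watt(60, 240)  # hypothetical big-TDP card
card_b = fps_per_watt(50, 157)  # hypothetical smaller-TDP card
print(f"A: {card_a:.3f} FPS/W, B: {card_b:.3f} FPS/W")
```

With these made-up numbers the slower, cooler card wins on efficiency even while losing on raw frame rate, which is exactly the distinction the metric is meant to surface.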
Now that it's here, I'll revive this thread. I'll start off by posting:
I'm dubious about Tom's Hardware and their sound analysis, but that's the first thing I came across. With ginormous dies, perhaps DHT coolers can actually make use of more heatpipes now. Next thing you know, we'll have soundcard companies suing Nvidia for lost revenue because they deafened all their customers.
It's hard to believe it would be that high at idle, especially considering the new GTX cards have much lower idle power draw (and why would idle be so much higher on a GTX 280 than a GTX 260...).
Most reviews seem to agree that it's very loud at full load, though.
However, isn't that the case for any standard GPU cooler?
The GTX 280 doesn't seem to reach its TDP, by the way (probably not a surprise); still, it gets over 200W, a fair amount :O (many of the places reporting power usage are testing overclocked versions).
What turns me off on the GTX isn't really the excessive power use; that seems reasonable considering the card's performance.
It's the insane price. It should cost $150 less just to be somewhat competitive (still paying a premium for getting the fastest).
Just wondering one thing, partly related to the GTX 280:
What is 'easiest' to cool quietly (not passive)? One card drawing 200W or two cards drawing 100W each?
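One way to frame that question: the cooler a card needs is set by the required heatsink thermal resistance, which scales inversely with power, so each 100W card can get by with a much less aggressive (and potentially quieter) heatsink than the single 200W card, at the cost of running two fans. A rough sketch (the 90°C die target and 40°C in-case ambient are assumed values, not measurements):

```python
# Required heatsink thermal resistance to hold a die at a target temperature:
#   R = (T_target - T_ambient) / P   [degrees C per watt]
# The 90C target and 40C in-case ambient are assumed values for illustration.

def required_resistance(power_watts, t_target=90.0, t_ambient=40.0):
    return (t_target - t_ambient) / power_watts

print(required_resistance(200))  # one 200W card needs 0.25 C/W
print(required_resistance(100))  # each 100W card only needs 0.50 C/W
```

Halving the power doubles the allowable thermal resistance, which usually translates into slower fans for the same temperature.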
Simple: Don't buy a card you can't or won't cool quietly.
Beasts like the GTX 280 always have a market; if you don't like it, just make sure you're not it. If you're serious about cooling this one quietly, I'd advise water-cooling. It should handle it easily (with some quality components).
I'm definitely going to try WC some day. I've even got two high-end WC pumps lying around (1xIwaki RD-30 and 1xIwaki RD-20). Just need to find a good radiator like this one (definitely check it out!).
CA_Steve wrote:I've read the various reviews posted this week... and am bemused by the results shown.
- insane price
- insane power in 3D mode
- very little performance benefit vs other cheaper/lower-power/much quieter solutions.
Until this chip gets shrunk from 65nm to 55nm, this is one useless product.
Either you're jealous or not thinking.
There's obvious power in this thing, and plenty of room to improve from drivers. It owns things like the 8800GTX, GTS 512, etc. -- with CRAP drivers.
aSASa wrote:Either you're jealous or not thinking. There's obvious power in this thing, and plenty of room to improve from drivers.
That was a counterproductive comment. Obviously it outperforms an 8800GTX, and so it should, for it is twice the price! And what exactly is CA_Steve supposed to be jealous of: having a small nuclear reactor in his PC? You will learn quickly that SPCR is not the kind of place that tolerates "l33t/haxor" kiddies who can't communicate in a civilised and respectful way.
NVidia outran their process when they developed this thing. Huge chip = low yields and few good dies per wafer, leading to a very high price as well as the crazy-high power requirement. As a demonstration vehicle to show processing power, it's nice. The wider memory path fixes a known bottleneck. But until they shrink it to 55nm, it's just a fanboy product.
I look at this stuff from two points of view:
- What's the state of technology?
- What would I recommend to friends for their builds?
In its current incarnation, it doesn't pass muster on the second point. None of my gaming friends have 30" monitors or need to show off a high Crysis benchmark... most are at 1680 x 1050 (22") and some at 1920 x 1200 (24"). At these resolutions, a $130 card generally gets the job done. If the user needs more horsepower, two $130 cards in SLI can kick butt.
I love this Tech Report article. A pair of 9600GT cards (costing about $260 these days) will hold their own vs most high end cards that sell for $400+. And, you can go passive or nearly silent with the 9600GT.
So, jealous? Not hardly. Not thinking? Try again.
I don't think the huge die size significantly raises the cost of production. Most of the cost is due to the R&D, as well as the premium charged for the latest and greatest. One thing's for sure, though: we can expect price drops on every other card. I learned this the painful way when I was forced to get a 7800GTX shortly before the 7900GT came out. I paid $500 for it back then and literally lost $200 in depreciation overnight.
widowmaker wrote:I don't think the huge die size significantly raises the cost of production. Most of the cost is due to the R&D, as well as the premium charged for the latest and greatest. One thing's for sure, though: we can expect price drops on every other card. I learned this the painful way when I was forced to get a 7800GTX shortly before the 7900GT came out. I paid $500 for it back then and literally lost $200 in depreciation overnight.
A picture is worth a thousand words...
(From Anandtech)
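The gap that picture illustrates can be estimated with the classic dies-per-wafer approximation. A sketch, taking the 576 mm² figure from this thread, assuming a standard 300mm wafer, and using 256 mm² as an assumed die size for a smaller competing chip (the formula ignores yield and scribe-line losses):

```python
import math

# Classic dies-per-wafer approximation (ignores defect yield and scribe lines):
#   DPW ~= pi * (d/2)^2 / A  -  pi * d / sqrt(2 * A)
# d = wafer diameter in mm (300mm assumed), A = die area in mm^2.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(576))  # huge GT200-class die
print(dies_per_wafer(256))  # assumed smaller competing die
```

Under these assumptions the smaller die gets roughly 2.5x the candidates per wafer before yield is even considered, which is the point the chart is making.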
That just proves that die size has no significant impact. If it did, the smaller die would logically cost significantly less. If you look at Nvidia's income statements, you'll see that their research expenses are more than double the production, selling, admin, and general expenses combined. (For the quarterly period ending 2008-04-27.)
http://finance.google.com/finance?fstyp ... ASDAQ:NVDA
Financial statements paint a bigger picture.
That is true. As the Anandtech article points out, there will be a significant difference per wafer. That, however, has little effect at the larger scale. Suppose you need to produce 100 chips. Smaller dies might let you make them all with two wafers, while larger dies may require five. We'll assume the production cost more than doubles. That still doesn't compare to the research and development costs. In the end it will affect the selling price of each unit, but probably by at most a few dollars.
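The trade-off being debated here (R&D amortization vs. per-die silicon cost) can be made concrete with a toy model. Every figure below is invented purely to show the shape of the calculation; none are Nvidia's actual numbers:

```python
# Toy per-unit cost model: fixed R&D amortized over volume, plus the wafer
# cost divided across good dies. All figures are invented illustrations.

def unit_cost(rnd_total, units_sold, wafer_cost, good_dies_per_wafer):
    rnd_per_unit = rnd_total / units_sold
    silicon_per_unit = wafer_cost / good_dies_per_wafer
    return rnd_per_unit + silicon_per_unit

# Same R&D budget and volume; the big die just yields fewer good dies/wafer.
big_die   = unit_cost(200e6, 1e6, 5000, 60)   # few good dies per wafer
small_die = unit_cost(200e6, 1e6, 5000, 180)  # 3x the good dies per wafer
print(round(big_die, 2), round(small_die, 2))
```

With these invented inputs the amortized R&D dominates both unit costs, while the yield difference still moves the silicon cost by tens of dollars per chip, so both sides of the argument in this thread have a piece of the truth depending on the actual numbers.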
Wafers cost an arm and a leg. The more wafers you need to produce your GPUs, the more costly they will be. Currently the GTX 280 needs significantly more wafers than ATI's HD 48x0 line, so the possible revenue from a GTX 280 is a lot smaller... A GTX 280 costs the manufacturer $385-425 each, just the components and all. What does the ATI part cost? Well under $300. So it has quite a lot of impact...