The new Nvidia GTX 280: 240W TDP. Are silencers doomed?

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

oscar3d
Posts: 202
Joined: Sat Dec 10, 2005 11:35 am
Location: California

The new Nvidia GTX 280: 240W TDP. Are silencers doomed?

Post by oscar3d » Thu May 22, 2008 2:51 pm

Hi:

I've been checking the latest news on the upcoming Nvidia GTX 280 (formerly called the 9900 GTX), which is supposed to be out on June 18th, just two days after the release of the AMD/ATI 4870 series.

http://www.tweaktown.com/news/9487/nvid ... ped_naked/

All the specs are very impressive. Except for one thing...

The TDP of the card is 240W.

Unofficial sources say that Coolermaster is currently working on the cooler for this... this THING!

Imagine... the 8800 Ultra's TDP was 185W, and that was toasty enough for everybody. Many people on Nvidia boards have complained about overheating problems with the 8800 series. What can we expect from this?

a) A furnace?
b) One component that will create a fire inside our rigs?


I'm kind of disappointed, especially since the 8800 Ultra was built at 90nm.

These new GTX 280s are made at 65nm, and even so, the die is the biggest I've ever seen: about 576 mm².

At least in the design shown in the pictures, the cooler is expected to exhaust the air outside the case.

But my guess is that with that huge die size, Thermalright, Arctic and Zalman will have a new challenge. With a TDP of 240W, I think it is very unlikely that we'll get an aftermarket cooling solution without exhaust.

I say that without an exhaust solution, if we ever put this... ABOMINATION inside our rigs, our other components will eventually overheat and die. :-)

Also, forget about silence; I think we will see Delta fans all around this thing.

LOL!

I'm starting to hate NVIDIA. With all the global warming we are facing, and trying to keep up with energy costs, now I can see a bunch of spoiled brats putting 3 of these things in SLI and burning their rigs with 2000W power supplies... LMAO!

At least AMD is winning on energy conservation. I sincerely hope they grab the crown this time. I'm no fan of either ATI or NVIDIA; I'm just hoping for the day I can get rid of my Ultra and get a really evolved solution.

Opinions? I would like to see the consensus.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Thu May 22, 2008 3:11 pm

To be fair, even the stock 8800 Ultra cooler is very quiet, and it only ramps up under serious load (most games don't touch it).

To be honest I'm not that bothered about the noise level when maxed out (within reason), because that condition only occurs when playing high-end games which make plenty of noise anyway.

The important part is how loud the cooler is on the desktop, when I care about noise. In that area, the current 8800 cards are actually quite good, even if they do idle at about 60-65C. ATI are definitely way ahead in that area, but as I have not heard the 3870 cooler I can't make a comparison.

Mikey
Friend of SPCR
Posts: 156
Joined: Mon Aug 14, 2006 8:14 pm

Post by Mikey » Thu May 22, 2008 4:27 pm

To be even more fair, the 9800GTX cooler, which has to deal with less heat than the 8800GTX/Ultra, is louder than both! :P

dragmor
Posts: 301
Joined: Sun Jul 10, 2005 7:54 pm
Location: Oz

Post by dragmor » Thu May 22, 2008 9:16 pm

Yeah, but if you purchase a new Nvidia motherboard with your expensive graphics card, it will be: load = 240W, idle = 0W, since the new Nvidia boards can turn off the discrete card and use the onboard graphics.

But if you have an Intel chipset, then you're out of luck.

The next-gen AMD/ATI cards releasing in June are supposed to be ~110W and ~140W at load, and ~10W at idle on any motherboard.

Modo
Posts: 486
Joined: Wed Apr 16, 2008 3:32 am
Location: Poland

Post by Modo » Thu May 22, 2008 10:07 pm

I wouldn't worry about the high-end monsters that serve one purpose: Generating buzz.

What I look forward to is the new production process. The last change brought us the 9600GT, which is twice as fast as the previous readily available NVidia GPU with easy passive cooling (the 8600GTS), while having almost the same power draw. Even without large architecture changes, the smaller production process should provide another contender in this area.

aztec
Posts: 443
Joined: Mon Dec 12, 2005 5:01 am
Location: Foster City, CA

Post by aztec » Thu May 22, 2008 11:40 pm

Modo wrote:I wouldn't worry about the high-end monsters that serve one purpose: Generating buzz.

What I look forward to is the new production process. The last change brought us the 9600GT, which is twice as fast as the previous readily available NVidia GPU with easy passive cooling (the 8600GTS), while having almost the same power draw. Even without large architecture changes, the smaller production process should provide another contender in this area.
Yep! I'm looking for the GTX 2xx equivalent of the 9600GT @55nm.

If power draw is high, I'll stick with my 8800GTS/HR-03 Plus and wait another refresh.

I just want to eventually play Crysis, and Stalker: Clear Sky, at full res. :D

Elvellon
Posts: 104
Joined: Sun Dec 09, 2007 1:19 am
Location: Moscow, Russia
Contact:

Post by Elvellon » Fri May 23, 2008 12:12 am

Now let's just wait one generation for the slightly and moderately cut-down shrinks at a reasonable price point (like the 8800GT and 9600GT were to the 8800GTX/Ultra).

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Fri May 23, 2008 12:31 am

I'd say it's overplaying things a tad to suggest that the GTX 280 spells doom for silencers. For those who want silence and a GTX 280, well, they've got a challenge on their hands. For the rest of us, who want silence but don't want a GTX 280, it doesn't make much of a difference, does it :)

It's quite entertaining that AMD/ATI somehow come out of this round smelling of roses because their graphics card only pulls 140W.

I have my doubts that the cut-down versions of this card will be any more sensible. I mean, you could disable half of it and you'd still be in power loony-land. As for shrinks of this chip, they'll be a way off yet.

The upcoming die-shrunk G92b (9900GTX, I think it's called) looks like it might be quite interesting; it has the potential to be a new sweet spot, and it should support HybridPower too.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Fri May 23, 2008 3:34 am

nutball wrote:It's quite entertaining that AMD/ATI somehow come out of this round smelling of roses because their graphics card only pulls 140W.
I honestly don't see a problem with this. As long as idle/2D power is very low and load power is reasonable (so a 400W PSU will cope with it), heat/noise production in gaming mode isn't a big issue for me. Games are noisy anyway.

Well, maybe if you do a lot of 3D modelling or CAD or something it might be an issue, I guess. Even then, I bet that doesn't heat the card up that much.

It seems like an ideal solution to me. Power when you need it, but otherwise low heat/power/noise.

FartingBob
Patron of SPCR
Posts: 744
Joined: Tue Mar 04, 2008 4:05 am
Location: London
Contact:

Post by FartingBob » Fri May 23, 2008 7:03 am

I wasn't sure between this and the 4800 series from ATI. Now it's no contest; I just hope that ATI uses the same mounting holes as my S1. They probably won't, just to annoy people (really, why is there any need for more than one type of mounting across all cards??), and we'll have to wait for a good passive cooler.

Modo
Posts: 486
Joined: Wed Apr 16, 2008 3:32 am
Location: Poland

Post by Modo » Fri May 23, 2008 9:10 am

FartingBob wrote:I wasn't sure between this and the 4800 series from ATI. Now it's no contest; I just hope that ATI uses the same mounting holes as my S1.
What makes you think an ATI offering of similar capabilities will consume (much) less power?

FartingBob
Patron of SPCR
Posts: 744
Joined: Tue Mar 04, 2008 4:05 am
Location: London
Contact:

Post by FartingBob » Fri May 23, 2008 9:32 am

Modo wrote:
FartingBob wrote:I wasn't sure between this and the 4800 series from ATI. Now it's no contest; I just hope that ATI uses the same mounting holes as my S1.
What makes you think an ATI offering of similar capabilities will consume (much) less power?
This.
114W TDP for the 4850 and 157W TDP for the 4870.
Obviously we'll have to see benchmarks before we decide how efficient they are compared to current cards, but if the 240W TDP rumour is true for the 280, that's 83W more than its rival. Even if it outperforms it in benchmarks (I'd say likely), it wouldn't be worth the extra heat IMO. You have no chance of cooling a 240W card passively unless you live in the Arctic Circle.
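
A quick back-of-the-envelope check on that passive-cooling claim (just a sketch; the temperature limits and cooler figure below are my assumptions, not measurements):

Code:
# Thermal resistance a passive cooler would need for a 240W card.
# All numbers are assumptions for illustration.
gpu_power_w = 240.0      # rumoured GTX 280 TDP
t_gpu_max_c = 90.0       # assumed safe GPU temperature
t_case_air_c = 40.0      # assumed air temperature inside the case

required_c_per_w = (t_gpu_max_c - t_case_air_c) / gpu_power_w
print(f"Required cooler resistance: {required_c_per_w:.2f} C/W")  # ~0.21 C/W

# Big passive GPU heatsinks with no direct airflow are closer to
# 1 C/W -- several times too weak, so passive really is out.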

Modo
Posts: 486
Joined: Wed Apr 16, 2008 3:32 am
Location: Poland

Post by Modo » Fri May 23, 2008 9:45 am

FartingBob wrote:Obviously we'll have to see benchmarks before we decide how efficient they are compared to current cards,[snip]
And to compare them against each other on an FPS per watt scale.
FartingBob wrote:Even if it outperforms it in benchmarks (I'd say likely), it wouldn't be worth the extra heat IMO.
Agreed. Hence my comment about a 9600GT replacement. :)
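
Something like this quick script, say (the TDPs are the figures quoted in this thread; the FPS values are invented placeholders, not benchmark results):

Code:
# Toy FPS-per-watt comparison. TDPs from this thread; FPS numbers
# are made up purely to illustrate the metric.
cards = {
    "GTX 280": {"tdp_w": 240, "fps": 60},  # hypothetical FPS
    "HD 4870": {"tdp_w": 157, "fps": 50},  # hypothetical FPS
}
for name, card in cards.items():
    print(f"{name}: {card['fps'] / card['tdp_w']:.3f} FPS per watt")

# A card can win on raw FPS and still lose on FPS/W if its power
# draw grows faster than its performance.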

widowmaker
Posts: 239
Joined: Sat Mar 29, 2008 7:05 pm
Location: Toronto Ontario

Post by widowmaker » Tue Jun 17, 2008 2:13 pm

Now that it's here, I'll revive this thread. I'll start off by posting:

[image: Tom's Hardware noise measurements for the GTX 280]

I'm dubious about Tom's Hardware and their sound analysis, but that's the first thing I came across. With ginormous dies, perhaps DHT coolers can actually make use of more heatpipes now. Next thing you know, we'll have soundcard companies suing Nvidia for lost revenue because they deafened all their customers.

Shadout
Posts: 117
Joined: Tue Apr 01, 2008 6:04 pm
Location: Denmark

Post by Shadout » Tue Jun 17, 2008 2:39 pm

It's hard to believe it would be that high at idle, especially considering the new GTX cards have much lower idle power usage (and why would it be so much higher at idle on the GTX 280 than on the GTX 260...).

Most reviews seem to agree that it's very loud at full load, though.

However, isn't that the case for any standard GPU cooler?

The GTX 280 doesn't seem to reach its TDP, by the way (probably not a surprise); still, it gets over 200W, a fair amount :O (many of the places that report power usage are testing overclocked versions).

What turns me off on the GTX isn't really the excessive power use; that seems reasonable considering the card's performance.
It is the insane price. It should cost $150 less just to be somewhat competitive (you'd still be paying a premium for getting the fastest).


Just wondering one thing, partly related to the GTX 280.
Which is 'easiest' to cool quietly (not passively): one card drawing 200W, or two cards drawing 100W each?

FartingBob
Patron of SPCR
Posts: 744
Joined: Tue Mar 04, 2008 4:05 am
Location: London
Contact:

Post by FartingBob » Tue Jun 17, 2008 2:43 pm

From a performance standpoint they are very much a letdown: around the same as the 9800GX2 for hundreds of dollars more, with loud, beefy coolers, and more power needed at load than my whole computer draws.

aSASa
Posts: 130
Joined: Fri May 16, 2008 12:54 pm
Location: Milwaukee, Wisconsin

Post by aSASa » Tue Jun 17, 2008 2:52 pm

Mikey wrote:To be even more fair, the 9800GTX cooler, which has to deal with less heat than the 8800GTX/Ultra, is louder than both! :P
False: they use the same fans as the 8800GTS 512.

krille
Posts: 357
Joined: Thu Mar 23, 2006 4:56 am
Location: Sweden

Post by krille » Tue Jun 17, 2008 3:56 pm

Simple: Don't buy a card you can't or won't cool quietly.

Beasts like the GTX 280 always have a market; if you don't like it, just make sure you're not it. If you're serious about cooling this one quietly, I'd advise water cooling. It should be able to handle it easily (with some quality components).

I'm definitely going to try WC some day. I've even got two high-end WC pumps lying around (1x Iwaki RD-30 and 1x Iwaki RD-20). I just need to find a good radiator like this one (definitely check it out!). :D

CA_Steve
Moderator
Posts: 7651
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Post by CA_Steve » Tue Jun 17, 2008 3:57 pm

I've read the various reviews posted this week....and am bemused by the results shown.
- insane price
- insane power in 3D mode
- very little performance benefit vs other cheaper/lower power/much quieter solutions.

Until this chip gets shrunk from 65nm to 55nm, this is one useless product.

Modo
Posts: 486
Joined: Wed Apr 16, 2008 3:32 am
Location: Poland

Post by Modo » Tue Jun 17, 2008 9:23 pm

CA_Steve wrote: - very little performance benefit vs other cheaper/lower power/much quieter solutions.
Until NVidia fixes the drivers, that is. The performance decreases at some low resolutions and settings indicate that there is still much to optimize.

aSASa
Posts: 130
Joined: Fri May 16, 2008 12:54 pm
Location: Milwaukee, Wisconsin

Post by aSASa » Tue Jun 17, 2008 10:33 pm

CA_Steve wrote:I've read the various reviews posted this week....and am bemused by the results shown.
- insane price
- insane power in 3D mode
- very little performance benefit vs other cheaper/lower power/much quieter solutions.

Until this chip gets shrunk from 65nm to 55nm, this is one useless product.
Either you're jealous or not thinking.

There's obvious power in this thing, and plenty of room to improve via drivers. It owns things like the 8800GTX, GTS 512, etc. -- with CRAP drivers.

jaganath
Posts: 5085
Joined: Tue Sep 20, 2005 6:55 am
Location: UK

Post by jaganath » Wed Jun 18, 2008 1:09 am

aSASa wrote:Either you're jealous or not thinking.

There's obvious power in this thing, and plenty of room to improve via drivers. It owns things like the 8800GTX, GTS 512, etc. -- with CRAP drivers.
That was a counterproductive comment. Obviously it outperforms an 8800GTX, and so it should, for it is twice the price! And what exactly is CA_Steve supposed to be jealous of, having a small nuclear reactor in his PC? You will learn quickly that SPCR is not the kind of place that tolerates "l33t/haxor" kiddies who can't communicate in a civilised and respectful way.

CA_Steve
Moderator
Posts: 7651
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Post by CA_Steve » Wed Jun 18, 2008 7:06 am

NVidia outran their process when they developed this thing. Huge chip = low yields and few good dies per wafer, leading to a very high price as well as the crazy-high power requirement. As a demonstration vehicle to show processing power, it's nice. The wider memory path fixes a known bottleneck. But until they shrink it to 55nm, it's just a fanboy product.
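
To put rough numbers on "few good dies per wafer" (the die area is from this thread; the wafer size is the standard 300mm, and the defect density is purely my guess for illustration):

Code:
import math

# Gross dies per wafer (standard approximation) plus a simple
# Poisson yield model. Defect density is an assumed figure,
# not a published foundry number.
die_area_mm2 = 576.0    # GTX 280 die size quoted in this thread
wafer_dia_mm = 300.0    # standard 300mm wafer
d0_per_cm2 = 0.4        # assumed defect density

radius = wafer_dia_mm / 2
gross = (math.pi * radius**2 / die_area_mm2
         - math.pi * wafer_dia_mm / math.sqrt(2 * die_area_mm2))
yield_frac = math.exp(-d0_per_cm2 * die_area_mm2 / 100)  # mm^2 -> cm^2

print(f"gross dies: {gross:.0f}, yield: {yield_frac:.0%}, "
      f"good dies: {gross * yield_frac:.0f}")
# Roughly 95 gross dies at ~10% yield -> on the order of 9-10 good
# dies per wafer under these assumptions; a die half the size on the
# same wafer would give several times more good dies.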

I look at this stuff from two points of view:
- What's the state of technology?
- What would I recommend to friends for their builds?

In its current incarnation, it doesn't pass muster on the second point. None of my gaming friends have 30" monitors or need to show a high Crysis benchmark... most are at 1680 x 1050 (22") and some at 1920 x 1200 (24"). At these resolutions, a $130 card generally gets the job done. If the user needs more horsepower, two $130 cards in SLI can kick butt.

I love this Tech Report article. A pair of 9600GT cards (costing about $260 these days) will hold their own vs. most high-end cards that sell for $400+. And you can go passive or nearly silent with the 9600GT.

So, jealous? Not hardly. Not thinking? Try again.

widowmaker
Posts: 239
Joined: Sat Mar 29, 2008 7:05 pm
Location: Toronto Ontario

Post by widowmaker » Wed Jun 18, 2008 8:35 am

I don't think the huge die size significantly raises the cost of production. Most of the cost is due to R&D, as well as the premium charged for the latest and greatest. One thing's for sure, though: we can expect price drops on every other card. I learned this the painful way when I was forced to get a 7800GTX shortly before the 7900GT came out. I paid $500 for it back then and literally lost $200 in depreciation overnight.

CA_Steve
Moderator
Posts: 7651
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Post by CA_Steve » Wed Jun 18, 2008 9:08 am

widowmaker wrote:I don't think the huge die size significantly raises the cost of production. Most of the cost is due to R&D, as well as the premium charged for the latest and greatest. One thing's for sure, though: we can expect price drops on every other card. I learned this the painful way when I was forced to get a 7800GTX shortly before the 7900GT came out. I paid $500 for it back then and literally lost $200 in depreciation overnight.
A picture is worth a thousand words...
(From Anandtech)

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Wed Jun 18, 2008 9:29 am

Just wait for ATI's new cards to become available. It's looking like they should offer excellent performance and reasonable power requirements.

widowmaker
Posts: 239
Joined: Sat Mar 29, 2008 7:05 pm
Location: Toronto Ontario

Post by widowmaker » Wed Jun 18, 2008 9:35 am

CA_Steve wrote:A picture is worth a thousand words...
(From Anandtech)
That just proves that die size has no significant impact. If it did, logically the smaller die would cost significantly less. If you look at Nvidia's income statements, you see that their research expenses are more than double their production, selling, admin, and general expenses combined (for the quarterly period ending 2008-04-27).

http://finance.google.com/finance?fstyp ... ASDAQ:NVDA

Financial statements paint a bigger picture.

Cistron
Posts: 618
Joined: Fri Mar 14, 2008 5:18 am
Location: London, UK

Post by Cistron » Wed Jun 18, 2008 9:50 am

Die size influences production capacity, though. If you can squeeze more processors onto one wafer, the output will increase.

widowmaker
Posts: 239
Joined: Sat Mar 29, 2008 7:05 pm
Location: Toronto Ontario

Post by widowmaker » Wed Jun 18, 2008 10:00 am

That is true. As the Anandtech article points out, per wafer there will be a significant difference. That, however, will have no large effect on the larger scale. Suppose you need to produce 100 chips. Smaller dies might allow you to create them all with 2 wafers, but larger dies may result in 5 wafers. We'll assume the production cost more than doubles. That still doesn't compare to the research and development costs. In the end it will affect the selling price of each unit, but probably by at most a few dollars.
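
Spelling that arithmetic out (the wafer price below is a pure placeholder, since real contract prices weren't public, so take the absolute dollar figures accordingly):

Code:
import math

# The scenario above: 100 chips, with the small die yielding 50 good
# dies per wafer and the big die 20. Wafer price is an assumed
# placeholder, not a known figure.
chips_needed = 100
good_dies_per_wafer = {"small die": 50, "big die": 20}
wafer_price_usd = 5000.0  # placeholder assumption

for name, dpw in good_dies_per_wafer.items():
    wafers = math.ceil(chips_needed / dpw)
    silicon_per_chip = wafers * wafer_price_usd / chips_needed
    print(f"{name}: {wafers} wafers, ${silicon_per_chip:.0f} silicon per chip")

# The gap scales linearly with whatever a wafer really costs, so the
# per-unit impact depends entirely on that assumption.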

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Wed Jun 18, 2008 11:44 am

Wafers cost an arm and a leg. The more wafers you need to produce your GPUs, the more costly it will be. Currently the GTX 280 needs significantly more wafers than ATI's HD 48x0 line, so the possible revenue from the GTX 280 is a lot smaller... A GTX 280 costs $385-425 each for the manufacturer, just the components and all... What does the ATI part cost? Well under $300. So it has quite a lot of impact...

Post Reply