next gen nVidia to pull 200+ W

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

mond
Posts: 97
Joined: Thu Jun 05, 2003 7:58 am
Location: Europe
Contact:

next gen nVidia to pull 200+ W

Post by mond » Sat Apr 30, 2005 12:40 am


benx
Posts: 31
Joined: Thu Mar 03, 2005 10:34 am

Post by benx » Sat Apr 30, 2005 2:36 am

It's from The Inquirer so probably bullshit ^^ but then again, who cares about 225W if you can have photorealism :D

sgtpokey
*Lifetime Patron*
Posts: 301
Joined: Sat Dec 14, 2002 11:29 pm
Location: Dublin, CA / Liverpool UK

Post by sgtpokey » Sat Apr 30, 2005 4:59 am

This sounds like one of those things The Inquirer might actually get right; they do get things right sometimes. They just never verify anything and will print almost anything.

Regarding the who cares comment:

This is the silentpc forum, so I imagine a lot of people here care, given that power is related to heat and heat is related to noise. I still consider myself a gamer but I'd balk at such an output.

The only interesting thing to me is this: given that the power of these GPUs keeps increasing, I can imagine that the next gen video cards will ALL have some kind of GPU thermal throttling system, where the full power of the GPU won't be turned on unless needed (a GPU Cool'n'Quiet).

Both ATI and NVIDIA already do that with their notebook parts; it would make sense for them to just extend those capabilities to their desktop lines.
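The throttling idea can be illustrated with a toy policy. To be clear, everything below is invented for the example — the clock states, load thresholds, and temperature numbers are not anything ATI or NVIDIA actually ship:

```python
# Toy sketch of a "Cool'n'Quiet for GPUs" policy: pick the lowest clock
# state that can handle the current load, and step down if the die
# overheats. All names and numbers here are made up for illustration.

CLOCK_STATES_MHZ = [150, 300, 450, 600]  # idle ... full 3D

def pick_clock(gpu_load_pct, temp_c, temp_limit_c=100):
    """Return a core clock in MHz for the given load, throttling on heat."""
    # Scale the clock roughly with demand: a mostly idle desktop
    # doesn't need the full 3D clock (or the noise that comes with it).
    if gpu_load_pct < 20:
        state = 0
    elif gpu_load_pct < 50:
        state = 1
    elif gpu_load_pct < 80:
        state = 2
    else:
        state = 3
    # Emergency throttle: step down while the die is too hot.
    while temp_c >= temp_limit_c and state > 0:
        state -= 1
        temp_c -= 10  # crude assumption: each step sheds ~10 C
    return CLOCK_STATES_MHZ[state]
```

With something like this, the card would only run (and sound) like a high-end card while a game actually demands it.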

benx
Posts: 31
Joined: Thu Mar 03, 2005 10:34 am

Post by benx » Sat Apr 30, 2005 5:23 am

if you want a silent pc you are not going to buy a high end gamer's card ;)

Shining Arcanine
Friend of SPCR
Posts: 502
Joined: Sat Oct 23, 2004 2:02 pm

Post by Shining Arcanine » Sat Apr 30, 2005 5:34 am

benx wrote:if you want a silent pc you are not going to buy a high end gamer's card ;)
I did. I have an ATI AIW 9700 Pro, and it was a high end gaming graphics card in 2002. :P

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sat Apr 30, 2005 12:19 pm

I wonder if environmentalists will be up in arms about these power gluttons?

DGK
Posts: 70
Joined: Sun May 18, 2003 3:59 pm

Post by DGK » Sat Apr 30, 2005 12:59 pm

The graphics card power escalation is starting to get ridiculous. AMD with their A64 line and Intel with their Dothans have shown that with a smart design you can keep the power requirements low. Since the top of the line graphics cards from Nvidia and ATI are over 400 bucks, at least one of them should be designing cards with more energy efficient processors. They need to consider energy requirements early in the design stage, and it's obvious that they have not done this.

I am still using a 9700 Pro, but the next time I buy a graphics card I will probably have to put my monitor, speakers, printer etc. on an extension cord because I will be worried about putting too much of a load on one circuit. I am only semi-kidding with this thought.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Sat Apr 30, 2005 1:11 pm

Over 200W, I don't think so, but their top 7xxx card may require up to 160W, according to an ITRS prediction. That's crazy enough, IMO.
It may require over 10A on the 12V1 line for just one card. For SLI, you would need a PSU with 24A on the 12V1 line to be stable. And AFAIK the best PSUs now only have 18A on 12V1.
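Tzupy's amperage figures follow directly from I = P / V. Here is the arithmetic; the assumption that roughly 144W of a 160W card comes off the 12V1 rail is mine, for illustration, not something stated in the thread:

```python
# Ohm's-law-level check on the 12V1 numbers: current = power / voltage.
# The 144W-per-card split is an assumption for illustration; the rest
# of a 160W card's draw would come through the PCIe slot itself.

def amps_on_12v(watts_from_12v, volts=12.0):
    """Current a load pulls from the 12V rail."""
    return watts_from_12v / volts

per_card = amps_on_12v(144)   # 12.0 A, i.e. "over 10A" for one card
sli_pair = 2 * per_card       # 24.0 A for two such cards in SLI
print(per_card, sli_pair)
```

Which is why an 18A 12V1 rail would come up well short for an SLI pair of such cards.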

darthan
Posts: 237
Joined: Sat Apr 30, 2005 1:28 pm
Location: San Francisco

Post by darthan » Sat Apr 30, 2005 1:48 pm

It isn't impossible to have a gamer card and very near silence. I have a 9800Pro with the passive Zalman ZM80C-HP cooler and my card works without any trouble. If you look at these articles:
http://www.xbitlabs.com/articles/video/ ... rcons.html
http://www.xbitlabs.com/articles/video/ ... power.html
they show that even the 6800 Ultra doesn't draw hugely more power than a 9800Pro. If you used something like the newer ZM80D cooler, you could probably put it on even a 6800 Ultra and it might just work. I don't know if a fan would be needed, but if it was, a very undervolted Panaflo or something like that would be all you need.

Of course, if the next gen cards need 160 watts then we'll all have to watercool. I can't imagine what the standard cooling system will look (and sound *shudders*) like.

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sat Apr 30, 2005 2:22 pm

darthan wrote:I can't imagine what the standard cooling system will look (and sound *shudders*) like.
My first thought was something like a VGA Silencer (except with a loud fan, of course), but that probably wouldn't help the card itself all that much. Maybe it'll be the opposite of a silencer, pulling air in from the outside. Or maybe it will have lots of heatpipes and huge loud fans. Or all of the above.

tay
Friend of SPCR
Posts: 793
Joined: Sat Dec 06, 2003 5:56 pm
Location: Boston, MA
Contact:

Post by tay » Sat Apr 30, 2005 2:54 pm

The current cards that require a connector don't pull their power from the bus, but the Inq are too stupid to take that into account. So it has a >75W requirement. This is not exactly shocking.

wim
Posts: 777
Joined: Wed Apr 28, 2004 5:16 am
Location: canberra, australia

Post by wim » Sat Apr 30, 2005 5:31 pm

DGK wrote:The graphics card power escalation is starting to get ridiculous. AMD with their A64 line and Intel with their Dothans have shown that with a smart design you can keep the power requirements low. Since the top of the line graphics cards from Nvidia and ATI are over 400 bucks, at least one of them should be designing cards with more energy efficient processors. They need to consider energy requirements early in the design stage, and it's obvious that they have not done this.
right on DGK!
when i first heard about Dothan and 21W my immediate first thought was "why wasn't it always done like this?!", even for desktops. i can only think that a dramatically low power draw, now evidently entirely possible, was simply not a consideration and they didn't care. or at least it was a low priority and not a serious design goal.

perhaps because so many people (idiots) think more power = better, not more power = inefficient. e.g. "my new video card is so powerful, it's 200W!". sadly, the same story goes for cars

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sat Apr 30, 2005 6:57 pm

wim wrote: perhaps because so many people (idiots) think more power = better, not more power = inefficient. e.g. "my new video card is so powerful, it's 200W!". sadly, the same story goes for cars
I wonder if Intel could successfully use that for a marketing campaign, actually saying "our chips must be better, just look how much more power they use!"

dano
Posts: 27
Joined: Sat Feb 26, 2005 4:16 pm
Location: Dallas, Texas

Post by dano » Sat Apr 30, 2005 9:34 pm

No one brags about low gas mileage; if you're referring to horsepower, though, that's different.

Tibors
Patron of SPCR
Posts: 2674
Joined: Sun Jul 04, 2004 6:07 am
Location: Houten, The Netherlands, Europe

Post by Tibors » Sat Apr 30, 2005 10:18 pm

dano wrote:No one brags about low gas mileage,
Apparently you have never seen advertisements for compact cars in Europe.
(Why does this "discussion" about gas crop up so often? My computers don't need no gas.)

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sat Apr 30, 2005 10:18 pm

But Intel could use the spin that it heats the room. I think they could run an ad campaign like that somewhere very cold, but it could really backfire if news of it spread everywhere else.

Inefficiency isn't really a problem in the winter, but in the summer it hits you twice.

It would be neat if someone could trick Intel into pulling this little stunt.
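The "hits you twice" point can be put into numbers. Assuming an air conditioner with a coefficient of performance around 3 — a typical figure, not something from this thread:

```python
# In summer, every watt a PC dissipates into the room also has to be
# pumped back out by the air conditioner, which costs extra electricity.
# A COP of ~3 means the AC spends about 1W to move 3W of heat.

def total_summer_watts(pc_watts, ac_cop=3.0):
    """Wall-socket draw of the PC plus the AC power needed to remove its heat."""
    ac_watts = pc_watts / ac_cop
    return pc_watts + ac_watts

print(total_summer_watts(225))  # a 225W card effectively draws ~300W in summer
```

In winter the waste heat at least offsets some heating, but in summer you pay for it twice.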

EvoFire
Posts: 265
Joined: Tue Dec 21, 2004 10:06 pm
Location: Vancouver, BC
Contact:

Post by EvoFire » Sat Apr 30, 2005 10:41 pm

What are the typical wattages consumed by current cards anyway? For like 9600's, 9800's, 6600's, X800's, 6800's? I just wanna compare the power used today and tomorrow.

I think gas is always brought up here because it's the same idea as the wattage used by computer components. People only started caring about efficiency recently, for both cars and computers.

Also, a question: why does GPU process technology seem to be behind CPU tech? Nvidia just started 130nm, and last time I looked ATi was still on 150nm? At the same time, they seem to be using much faster memory chips and systems; why is that?

And also, why does it seem that GPUs can take a harder beating in temperatures than CPUs do?

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sat Apr 30, 2005 10:52 pm

EvoFire wrote:Also question about why does it seem that GPU technologies seem to be behind CPU tech? Nvidia just started 130nm and last time I looked, ATi was still on 150nm?? At the same time, they seem to be using a lot faster memory chips and systems, why is that?
No, ATi's been using 130nm since the 9600's (though they stuck with 150nm for the 9800's), Nvidia's been using 130nm since the FX5600, and they're both starting to use 110nm now, although it's apparently making the 6600's very leaky.

http://www.rojakpot.com/showarticle.asp ... =88&pgno=0

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Sat Apr 30, 2005 10:59 pm

EvoFire wrote:Also question about why does it seem that GPU technologies seem to be behind CPU tech? Nvidia just started 130nm and last time I looked, ATi was still on 150nm??
Intel and AMD own their own fabs, NVIDIA and ATI don't. That gives the CPU guys a lead.
At the same time, they seem to be using a lot faster memory chips and systems, why is that?
Because they need huge memory bandwidth whereas CPUs are limited by other things.

Comparing power consumption of CPUs to GPUs doesn't really get you very far IMO; they are very different beasts and they perform very different tasks. One might ask why hard drives use so much more power than CD-ROM drives and wouldn't it be great if it weren't like that... but the answer is uninformative.

The next generation of cards will *not* use 100-200 watts if you buy the low-end variants.

EvoFire
Posts: 265
Joined: Tue Dec 21, 2004 10:06 pm
Location: Vancouver, BC
Contact:

Post by EvoFire » Sun May 01, 2005 12:00 am

mathias wrote:
EvoFire wrote:Also question about why does it seem that GPU technologies seem to be behind CPU tech? Nvidia just started 130nm and last time I looked, ATi was still on 150nm?? At the same time, they seem to be using a lot faster memory chips and systems, why is that?
No, ATi's been using 130nm since 9600's (though they stuck with 150nm for 9800's), Nvidia's been using 130nm since FX5600, and they're both starting to use 110nm now, although it's apparently making the 6600's very leaky.

http://www.rojakpot.com/showarticle.asp ... =88&pgno=0
I didn't know that... or probably didn't remember after reading; I have a rather bad memory. And the 6600's are leaky? First time I've heard that, though it might be true, I'm not sure.

nutball wrote:
EvoFire wrote:Also question about why does it seem that GPU technologies seem to be behind CPU tech? Nvidia just started 130nm and last time I looked, ATi was still on 150nm??
Intel and AMD own their own fabs, NVIDIA and ATI don't. That gives the CPU guys a lead.
At the same time, they seem to be using a lot faster memory chips and systems, why is that?
Because they need huge memory bandwidth whereas CPUs are limited by other things.

Comparing power consumption of CPUs to GPUs doesn't really get you very far IMO, they are very different beasts and they perform very different tasks. One might ask why hard-drives use so much more power than CD-ROM drives and wouldn't it be great if it wasn't like that ... but the answer is uninformative.

The next generation of cards will *not* use 100-200 watts if you buy the low-end variants.
I didn't know they didn't have their own fabs. One would think they'd be making their own chips, since they have so much revenue...

Isn't more bandwidth always better? Since DDR3 is already out and video cards are using it, why didn't Intel skip DDR2 and go straight to DDR3?

Maybe I'm just simple minded, or I'm just not knowledgeable, but I like to compare things that are similar. Technically, GPUs and CPUs are fairly similar, aren't they? Boiling it all down, they are both there doing calculations, albeit at different speeds. Also, the thing with CD-ROMs and HDDs: they all spin, but the difference would be well known, as HDDs have multiple platters, multiple heads to move, and are made of heavier material, right? Maybe I'm just being stupid, hahaha.

Splinter
Posts: 245
Joined: Mon Jul 05, 2004 9:01 pm

Post by Splinter » Sun May 01, 2005 1:18 am

CPUs and GPUs are very very different.

If you do a bit of googling, you'll find something far more in-depth, but in very basic terms, CPUs are designed as general workhorses. They can do whatever calculations you want; they'll take whatever you throw at them. GPUs only do very specific things: they work with triangles and pixels, shading and texturing. They can't do anything but graphics, they're only designed to do graphics, and they do it well.

Imagine a pickup truck with a winch and a great big 700hp engine. It can do a lot: it can haul heavy stuff, drive pretty fast, pull things around, tow things. That's your CPU.

Then imagine a nice fast Ferrari. Maybe only a 400hp engine, but it can drive circles around that pickup. It can't really do much else, though. That's your GPU.

DG
Posts: 424
Joined: Thu Feb 26, 2004 3:40 pm
Location: EU

Post by DG » Sun May 01, 2005 3:01 am

I don't think that 200W cards will become a reality in the (near) future. How will they cool such a beast? Air cooling can't handle it. Maybe watercooling... :?

nici
Posts: 3011
Joined: Thu Dec 16, 2004 8:49 am
Location: Suomi Finland Perkele

Post by nici » Sun May 01, 2005 5:52 am

It will probably never happen; I'm just guessing here, but I would think that computers have pretty much reached the highest power draw they will ever have. Newer CPUs are faster but still use less power, HDDs are likely to move towards smaller sizes soon, PSUs are getting more efficient... I would imagine GPUs will follow in a while, because if the heat dissipation continues to increase, it will soon be impossible to cool the cards with something that only takes up one slot. I'm also hoping graphics card manufacturers will incorporate something similar to Cool'n'Quiet sometime in the near future...

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Sun May 01, 2005 6:19 am

AFAIK ATi may also launch a very hot running next-gen GPU. They bought the technology to manufacture GPUs in a more CPU-like manner, which should allow for higher frequencies, but they will probably have to limit the number of pipes in order to avoid a meltdown. ;-)
For nVidia, a 32-pipe (general purpose, both for VS and PS) GPU, running at about 600 MHz, manufactured at 0.11 micron, should dissipate 120-150W. I just hope the lower power version (like the vanilla 6800 of today) won't be so hungry...
At worst, we'll need two extra slots for the cooling solution. I wonder how the owners of DFI SLI mobos will handle that... ;-)

tay
Friend of SPCR
Posts: 793
Joined: Sat Dec 06, 2003 5:56 pm
Location: Boston, MA
Contact:

Post by tay » Sun May 01, 2005 8:04 am

A couple of things :

- 200 watt cards are not coming in the next generation. Does the existence of PCIe 6-pin plugs mean we have 150 watt cards? No!!! So the existence of two PCIe plugs does not mean we will have 200W cards. Now that that's cleared up...

- Nv/ATI are not manufacturers. As such they are limited by TSMC/UMC/IBM etc. Interfacing to another manufacturer's process hinders moving to the latest and greatest. GPU makers have also been burned by manufacturing issues killing their output and availability.

- In addition, GPUs get redesigned to a greater extent and have more logic transistors than CPUs. The NV40 is quite different from the NV30, which in turn is different from the NV20. The Pentium D is still closely related to the 1.4 GHz P4, and the A64 is a close relative of the K7. Redesigning for an unknown process isn't as easy as for a known one, which is one of the reasons process is lagging.

- Clock speeds are limited by issues like large transistor count, process, heat, and power. Theoretically a GPU can be pipelined more than a CPU, so it should not be clock limited; it is less complicated at a basic level and has more repeated elements. With modern GPUs becoming more like CPUs, this distinction is gone.

- WRT memory, a GPU benefits more from the latest and greatest than a CPU does (maybe more data crunching, lower cache hit ratio), so you see more advanced memory on GPUs.
Last edited by tay on Sun May 01, 2005 9:38 am, edited 1 time in total.

alglove
Posts: 363
Joined: Fri Feb 06, 2004 11:21 am
Location: Houston, TX, USA

Post by alglove » Sun May 01, 2005 9:27 am

Tibors wrote:
dano wrote:No one brags about a low gas milage,
Apparently you have never seen advertisements for compact cars in Europe.
Sorry to wander off-topic here, but by low gas mileage, dano means the "miles per gallon" figure used in the US, not "liters per 100 km". That confused me for a little while the last time I went to Europe.

EvoFire, I have read in the rumor mill that AMD is considering extending the use of DDR, skipping DDR2, and going straight to DDR3 or XDR, as you have suggested. However, the GDDR3 currently in use by graphics cards is a type of DDR optimized for graphics cards.

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sun May 01, 2005 9:37 am

EvoFire wrote: I didn't know that... or probably didn't remember after reading, I have a rather bad memory. And 6600's leaky?? First time I've heard that, though, it might be true, I'm not sure.
It runs hotter than the more powerful 6800NU's, almost as hot as 6800GT's, despite using a finer process. Seems pretty obvious to me.
EvoFire wrote:Isn't the more bandwidth the better?? Since DDR3 is already out and video cards are using it, why didn't Intel skip DDR2 and go straight to DDR3?
I'm pretty sure DDR2 is completely different from GDDR2. It might have similarities with it, and with GDDR3, but it's a separate specification.
EvoFire wrote:Also the thing with CD rom and HDDs, they all spin, but the difference would be well know as HDDs have mutiple platters, mutiple heads to move, and are of a heavier material, right?
I'm pretty sure the main difference is that a hard drive's internals are sealed and securely attached together.
Splinter wrote:GPUs only do very specific things. They work with triangles and pixels. Shading and texturing. They can't do anything but graphics, they're only designed to do graphics, and they do it well.
I heard that GPU's do mostly or entirely floating point operations, and that they could be made to do things other than graphics rendering.

EvoFire
Posts: 265
Joined: Tue Dec 21, 2004 10:06 pm
Location: Vancouver, BC
Contact:

Post by EvoFire » Sun May 01, 2005 10:39 am

Splinter: I understand now, thank you for your excellent analogy.
DG wrote:I don't think that 200W cards would become a reality in the (near) future. How they will cool such a beast? Air cooling can't handle it. Maybe watercooling...
I don't think it will happen either, and at least not in the current configuration where the heat has nowhere to go.
nici wrote:It will probably never happen, im just guessing here, but i would think that computers have pretty much reached the highest power-draw they will have. Newer CPUs are faster but still use less power, HDDs are likely to move towards smaller sizes soon, PSUs are getting more efficient.. I would imagine GPUs would follow this in a while because if the heat-dissipation continues to increase, it swill soon be impossible to cool the cards with something that only takes up one slot. Im also hoping graphics card manufacturers to incorporate something similar to Cool n´Quiet sometime in the near future..
I doubt that. Intel is still pushing the thermal envelope of the P4 processors. Now with dual-core chips, they can only release lower clockspeed versions due to heat restrictions. I think if they ever did 3.8GHz on a Pentium D, it would reach excesses of 180W.
alglove wrote:EvoFire, I have read in the rumor mill that AMD is considering extending the use of DDR, skipping DDR2, and going straight to DDR3 or XDR, as you have suggested. However, the GDDR3 currently in use by graphics cards is a type of DDR optimized for graphics cards.
I guess that's cool. There doesn't seem to be much advantage for AMD in moving to DDR2 anyway... DDR2 isn't much faster than DDR.
mathias wrote:It runs hotter than more powerful 6800NU's, almost as hot as 6800GT's, despite using a finner process. Seems pretty obvious to me.
Oh, you learn something new everyday.
mathias wrote:I'm pretty sure DDR2 is completely different from GDDR2. It might have similarities with it, and with GDDR3, but it's a separate specification.
hmm... another thing learned, I never knew that; I thought it would be interchangeable. I have been wondering why graphics memory runs at so much higher clock speeds without seeming to raise the card's price that much, while computer memory costs a fortune.
mathias wrote:I'm pretty sure the main difference is that hard drives internals are isolated and securely attached together.
I dunno, I have no knowledge in that category; I'm just stating what I personally think would affect power usage, based on my high school physics.

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sun May 01, 2005 12:59 pm

Regarding what the cooler for such a hot card would be like, here's how I think the back of it would look:

Image

(the orange stuff is heatpipes, the inside is a radial fan. Also, the fins would be a lot more densely packed)

EvoFire
Posts: 265
Joined: Tue Dec 21, 2004 10:06 pm
Location: Vancouver, BC
Contact:

Post by EvoFire » Sun May 01, 2005 2:24 pm

Nice drawing Mathias, well... it gets the point across, hahaha, and even not very experienced users like me can understand it.
