Nvidia G80 with external PSU
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
In the light of this news: http://www.theinquirer.net/default.aspx?article=34319
...let's assume this turns out to be true: would it then be a good choice to switch back to much less powerful PSUs (around 300 W) to power the rest of the system, maybe even completely passive ones?
Let me know what you think.
Actually I think this is just another rumour, it IS the Inquirer.
I don't believe nVidia wants to power their own cards - the benefit would be too small.
And internal power supplies are better anyway - in efficiency maybe, maybe not; but in space and practicality, they certainly are.
Personally I really think external PSUs are a bad idea.
Even though it is the Inquirer, their rumours about graphics cards are often correct. Furthermore, Anandtech posted similar information several weeks ago: the new cards may draw that much power, and external PSUs may become a reality.
Anyway, my question was more in the direction of whether passive PSUs will become more viable if this rumour turns out to be true.
What kind of PSU would be necessary if the graphics card were powered externally in the future? 250 W, 300 W, 400 W?
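For what it's worth, here is the back-of-envelope arithmetic behind that question, sketched in Python. Every wattage below is my own illustrative assumption for a 2006-era system, not a measurement:

```python
# Rough DC load estimate for a system whose graphics card is powered
# externally. All wattages are illustrative assumptions, not measurements.
loads_w = {
    "CPU": 95,             # mid-range dual core at full load
    "motherboard": 30,
    "RAM": 10,
    "HDD + optical": 25,
    "fans, USB, misc": 10,
}

total_w = sum(loads_w.values())   # combined DC draw without the GPU
headroom = 1.5                    # margin for load peaks and capacitor aging
recommended_w = total_w * headroom

print(f"Estimated load: {total_w} W")
print(f"Recommended PSU rating: ~{recommended_w:.0f} W")
```

Under those assumptions a quality 250-300 W unit would already have comfortable headroom, which is roughly fanless-PSU territory.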
Last edited by Lux! on Thu Sep 14, 2006 8:04 am, edited 1 time in total.
I agree with Aris.
Not to sound like a total fruit, but as weather patterns continue to worsen throughout the world due to carbon emissions, cutting down on energy usage will become a large priority in almost everything we currently take for granted, and manufacturers will be forced to comply with this either through customer demand or governmental regulation.
First generation technologies will probably be power hogs, but the more powerful the GPUs get, the smaller the fab process becomes, which means less power consumption, less heat, less cost to the manufacturer, and spiffier processors for us.
Let's hope it comes true. A smaller fab process also brings problems, such as current leakage, but those are hopefully not insurmountable.
If the G80 has 48 pipes and twice the die area of the G71, then one can expect 150-180 W.
That's if nVidia doesn't try to boost the frequency, which I doubt is possible at 90 nm.
Maybe at 80 nm they could try 750 MHz, we'll see.
Which means it's going to be VERY difficult to cool quietly, so the external PSU issue is of secondary importance.
I believe that a triple slot cooler, AC Silencer style, could be used to quietly cool a 180 W part.
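The 150-180 W guess above is simple scaling arithmetic: at the same process node and clock, dynamic power grows roughly with active die area. A minimal sketch, where the G71 board-power range is my assumption and the 2x area figure comes from the post:

```python
# Back-of-envelope GPU power scaling: at the same process node and clock
# frequency, dynamic power scales roughly with active die area.
g71_power_range_w = (75, 90)   # assumed G71 (7900 GTX) board power, watts
area_ratio = 2.0               # G80 die area relative to G71, per the post

g80_low_w, g80_high_w = (p * area_ratio for p in g71_power_range_w)
print(f"Estimated G80 board power: {g80_low_w:.0f}-{g80_high_w:.0f} W")
```

That reproduces the 150-180 W range; any frequency bump on top of it would push the estimate higher still.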
My previous post was based on 'info' from The Inquirer, so not very reliable.
I just saw some new 'info' on G80, from vr-zone IIRC:
* Unified Shader Architecture
* Support FP16 HDR+MSAA
* Support GDDR4 memories
* Close to 700M transistors (G71 - 278M / G70 - 302M)
* New AA mode : VCAA
* Core clock scalable up to 1.5GHz
* Shader Performance : 2x Pixel / 12x Vertex over G71
* 8 TCPs & 128 stream processors
* Much more efficient than traditional architecture
* 384-bit memory interface (256-bit+128-bit)
* 768MB memory size (512MB+256MB)
* Two models at launch : GeForce 8800GTX and GeForce 8800GT
* GeForce 8800GTX : 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. US$649
* GeForce 8800GT : 6 TCPs chip, 320-bit memory interface, fan cooler. US$449-499
Take it with a large grain of salt...
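If the 384-bit GDDR4 interface is real, the peak bandwidth is easy to work out. The effective transfer rate below is purely my assumption, since the rumour gives no memory clock:

```python
# Theoretical peak bandwidth of the rumoured 384-bit GDDR4 interface.
bus_width_bits = 384
effective_rate_gtps = 2.0   # assumed GDDR4 effective transfer rate, GT/s

bandwidth_gbs = bus_width_bits / 8 * effective_rate_gtps  # bytes/transfer * rate
print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")
```

At 2 GT/s that is 96 GB/s, a big jump over a 256-bit card at the same data rate, which would help explain the odd 256-bit + 128-bit split.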