New nVidia GeForce 6800 -- DUAL power connectors

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

New nVidia GeForce 6800 -- DUAL power connectors

Post by wumpus » Mon Apr 12, 2004 7:39 am

Carrying this over from another thread-- it deserves its own topic. nVidia is about to release its newest video card, the GeForce 6800 "ultra":

http://www.theinquirer.net/?article=15269

http://www.nvidiot.com/nv40.jpg

Did anyone else notice this card has TWO molex power connectors on the end? There is talk of nVidia "requiring" a 480W power supply, which I think is crap, but I could see this pushing us over 300W in real-world gaming situations pretty easily.
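
(For a rough sense of where a 300W+ figure could come from, here is a back-of-the-envelope sketch; every wattage below is an assumption for illustration, not a measurement from any review.)

Code:
# Rough system power budget at full load. All figures are guesses for
# illustration only -- none are measured values.
components_watts = {
    "cpu_p4_3.2_full_load": 110,   # assumed CPU draw under load
    "geforce_6800_ultra": 100,     # assumed card draw (hence the two molex plugs)
    "motherboard_and_chipset": 35,
    "ram": 10,
    "hard_drives_x2": 30,
    "optical_fans_misc": 20,
}

total_dc = sum(components_watts.values())
print(f"Estimated DC draw at full load: {total_dc} W")

# The PSU is not 100% efficient, so the AC draw at the wall is higher.
assumed_efficiency = 0.70   # assumed, typical of 2004-era PSUs under load
print(f"Estimated AC draw at the wall: {total_dc / assumed_efficiency:.0f} W")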

DG
Posts: 424
Joined: Thu Feb 26, 2004 3:40 pm
Location: EU

Post by DG » Mon Apr 12, 2004 3:22 pm

I don't know about the 2 molexes, but I know that's a BIG cooler!!! It's gonna be a very hot board, I think; at least the memory chips will be... Very hard to silence it. :(

PretzelB
Posts: 513
Joined: Tue Feb 11, 2003 6:53 am
Location: Frisco, TX

Post by PretzelB » Mon Apr 12, 2004 5:04 pm

The dual power connectors concern me. There have been many debates about how much power systems actually need today, but this new video card appears to change things a bit. I'd hate to get one of the ARM systems with a 300W PSU, or one of the PSUs recommended by this site, and not be able to upgrade when the new video cards come out.

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ
Contact:

Post by Rusty075 » Mon Apr 12, 2004 6:57 pm

The stat that jumped out at me: Transistor count. The 6800 is going to have 220 million transistors. By comparison a P4 Northwood has 54 million, and the Prescotts have 125 million.

But I don't understand the 2 molex thing. Most people are just going to plug them both into plugs on the same string. Even if they don't, in (nearly) every PSU out there all the molexes come off the same line anyway. Are they going to require a revision to the ATX spec, and call for a dedicated VGA line?

And ATI has announced that its X800 XT will be available with 512 megs of DDR3.

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Mon Apr 12, 2004 11:27 pm

Rusty075 wrote:But I don't understand the 2 molex thing. Most people are just going to plug them both into plugs on the same string. Even if they don't, in (nearly) every PSU out there all the molexes come off the same line anyway. Are they going to require a revision to the ATX spec, and call for a dedicated VGA line?
At the rate the power consumption of these damned graphics cards is increasing, I'd be more worried about having to have my house rewired and my own electricity sub-station! :)

It certainly calls the focus of the BTX spec (cooling power-greedy CPUs) into question, IMO.

1398342003
Posts: 324
Joined: Wed May 07, 2003 10:35 pm
Location: Surrey, B,C

Post by 1398342003 » Tue Apr 13, 2004 9:00 am

This kind of thing is bad, especially since the case air is already heated by the CPU (as per the BTX spec).

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ
Contact:

Post by Rusty075 » Tue Apr 13, 2004 1:05 pm

The PCI-X spec also paves the way for higher-wattage VGAs. In the AGP spec they can only pull 25 watts from the slot before they have to go to accessory power plugs... with PCI-X the limit is 60 watts.
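
(To see what those slot limits mean in practice, here is a small sketch splitting an assumed card draw between the slot and the auxiliary plugs; the card wattage is a made-up example, and the 25W/60W figures are simply the ones quoted above.)

Code:
def aux_power_needed(card_watts: float, slot_limit_watts: float) -> float:
    """Watts that must come from auxiliary plugs once the slot budget is used up."""
    return max(0.0, card_watts - slot_limit_watts)

ASSUMED_CARD_WATTS = 110  # hypothetical draw for a top-end 2004 card

for bus, slot_limit in [("AGP", 25), ("PCI-E x16 (figure quoted above)", 60)]:
    aux = aux_power_needed(ASSUMED_CARD_WATTS, slot_limit)
    print(f"{bus}: {slot_limit} W from the slot, {aux:.0f} W from aux connectors")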

1398342003
Posts: 324
Joined: Wed May 07, 2003 10:35 pm
Location: Surrey, B,C

Post by 1398342003 » Tue Apr 13, 2004 2:45 pm

^^^ :(

But those ones will have only 3 molex connectors.

Jan Kivar
Friend of SPCR
Posts: 1310
Joined: Mon Apr 28, 2003 4:37 am
Location: Finland

Post by Jan Kivar » Wed Apr 14, 2004 3:53 am

Rusty075 wrote:The PCI-X spec also paves the way for higher-wattage VGAs. In the AGP spec they can only pull 25 watts from the slot before they have to go to accessory power plugs... with PCI-X the limit is 60 watts.
PCI-E, not PCI-X... :D 8)

Cheers,

Jan

Fabool
Posts: 141
Joined: Tue Jun 10, 2003 12:00 pm
Location: Finland

Post by Fabool » Wed Apr 14, 2004 4:48 am

http://www.warp2search.net/modules.php? ... &sid=17433

Links to reviews.
I don't think there's any chance of cooling this thing passively...

Interesting to see what ATi has up its sleeve.

sneaker
Posts: 133
Joined: Tue Dec 24, 2002 8:37 pm
Location: Australia

Re: New nVidia GeForce 6800 -- DUAL power connectors

Post by sneaker » Wed Apr 14, 2004 5:40 am

wumpus wrote:Carrying this over from another thread-- it deserves its own topic.
Yeah, in the Hot & Noisy VGA forum ;)

The most interesting thing to come out of this card is the power consumption testing performed by Tom's Hardware:

http://www.tomshardware.com/graphic/200 ... 00-19.html

For example, a Radeon 9800XT demands 33W more from the mains than a 9600XT when doing next to nothing in Windows. Power consumption tests for various PC components (CPUs and video cards in particular) in various system states are something we need to see more of.
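
(One caveat when reading wall-socket numbers like those: the PSU's efficiency sits between the mains and the card, so the DC difference at the card is smaller than the AC difference measured at the wall. A minimal sketch, with the efficiency value being an assumption:)

Code:
def dc_delta_from_ac_delta(ac_delta_watts: float, psu_efficiency: float) -> float:
    """Approximate extra DC power drawn by a component, given the extra AC power
    measured at the wall and an assumed PSU efficiency."""
    return ac_delta_watts * psu_efficiency

# 33 W more at the mains (9800XT vs 9600XT idling in Windows, per the article),
# with an assumed ~70% efficient PSU:
print(f"~{dc_delta_from_ac_delta(33, 0.70):.0f} W extra on the DC side")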

Gekkani
Posts: 116
Joined: Sun Aug 11, 2002 3:26 pm
Contact:

Post by Gekkani » Wed Apr 14, 2004 5:50 am

I don't understand Tom's Hardware... According to their test, a P4 3.2 & GeForce 6800 system demands 288W at full load.

But at the end of the article we can read:


"To those of you ready to camp out in front of your store to get your hands on one of these cards - be warned! The first barrier on your way to graphics ecstasy is the power requirement calling for a PSU with at least 480Watts."

:?:
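
(One possible way to reconcile the two numbers, offered only as an illustration: 288W is AC at the wall, while a PSU label describes DC capacity spread across several rails, and vendors pad their recommendation for no-name units and for headroom. A sketch under those assumptions, with every figure below being a guess:)

Code:
# Rough reconciliation of "288 W at the wall" with "get a 480 W PSU".
# Every number here is an assumption for illustration, not a spec.
ac_full_load = 288          # measured at the mains by Tom's Hardware
assumed_efficiency = 0.70   # assumed 2004-era PSU efficiency under load
dc_full_load = ac_full_load * assumed_efficiency   # ~202 W actually delivered

derating_for_cheap_psus = 0.6   # assume a no-name unit delivers ~60% of its label
headroom_factor = 1.3           # margin for drive spin-up, aging, hot ambient

recommended_label = dc_full_load * headroom_factor / derating_for_cheap_psus
print(f"DC load ~{dc_full_load:.0f} W -> suggested PSU label ~{recommended_label:.0f} W")
# lands in the 400-500 W range, i.e. the same ballpark as the quoted 480 W figure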

Cyberpukish
Posts: 59
Joined: Sun Jan 18, 2004 5:35 am

New photos

Post by Cyberpukish » Wed Apr 14, 2004 6:48 am

http://www.theinquirer.net/?article=15325
More pictures are out! That heatsink looks big.

Cyberpukish
Posts: 59
Joined: Sun Jan 18, 2004 5:35 am

Post by Cyberpukish » Wed Apr 14, 2004 6:56 am

Hmmm, looking at the screenshots, I must say the quality of the nVidia rendering is really poor. Check out the pictures where they compare the 6800 Ultra, 5950 Ultra, and the 9800XT. After looking at all the screenshots and straining my eyes (not to mention all the clicking back and forward on the browser buttons), ATi still renders and filters better than the nVidia card. What are your opinions?

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

Post by wumpus » Wed Apr 14, 2004 8:07 am

I don't agree, and even if I did, I doubt that kind of "back and forth 5 times until I convince myself there's a difference between these two images" testing would ever translate into anything discernible in actual gameplay.

I must admit the performance is very impressive. The card really is literally 2x as fast as the current 9800XT in games that use pixel shaders extensively, Halo for example. And it's at least 1.5x faster in a lot of other games, which is not exactly chopped liver either.

I sense an upgrade on my horizon. At least the power profile won't be that much worse than my current 9800 pro (slightly overclocked).


Cyberpukish
Posts: 59
Joined: Sun Jan 18, 2004 5:35 am

Post by Cyberpukish » Wed Apr 14, 2004 8:18 am

I agree that when playing a game we tend not to notice the quality; however, looking at the FarCry screenshots, you can see obvious differences. I didn't even have to try. I hope it's the drivers' fault and not the card's, as Tom's Hardware suggested. I may end up getting the 9800 Pro instead of the 5900XT since the quality difference is quite drastic... For my next upgrade I will wait for ATi's solution to come out and fight with nVidia's, and then hopefully a price war will follow ;))

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Wed Apr 14, 2004 8:29 am

Cyberpukish wrote:I agree that when playing a game we tend not to notice the quality; however, looking at the FarCry screenshots, you can see obvious differences. I didn't even have to try. I hope it's the drivers' fault and not the card's, as Tom's Hardware suggested. I may end up getting the 9800 Pro instead of the 5900XT since the quality difference is quite drastic... For my next upgrade I will wait for ATi's solution to come out and fight with nVidia's, and then hopefully a price war will follow ;))
FarCry is clearly the odd-one-out in all of the tests I've seen. There must be something screwy going on ... somewhere.

With the other ones -- well, to be honest I've never really seen the point of comparing AF/AA settings in fast-action FPS games using static screenshots (unless you want to sit and admire your beautifully rendered giblets, that is). The differences are more evident in-game in slower-paced titles, and no static screenshot really tells the story IMO.

frosty
Posts: 636
Joined: Fri Jun 06, 2003 9:40 am
Location: USA

Post by frosty » Wed Apr 14, 2004 9:14 am

I think this will go over well with hard-core gamers as usual, and they won't worry about heat/noise. Not me; I am content to play at lower res with no sound. No sound helps framerates, and it lets me know in real time if someone is breaking into my home to kill me.

Fabool, me too; I'm waiting for the ATI comeback. I love the video card wars!

1911user
Posts: 109
Joined: Sat Mar 13, 2004 12:08 pm
Location: Oklahoma

Post by 1911user » Wed Apr 14, 2004 5:24 pm

I just read the 6800 review at Anandtech. The fact that it requires 2 molex connectors says a lot. Using just 1 power connector would have exceeded the ATX spec :shock: My 9700 Pro almost seems wimpy by comparison, but I'm not paying $500 for a new video card and installing a 500W PSU. The bar has certainly been raised for video card performance. If they truly price the slightly degraded version at $300 as planned, 9800 prices are coming down this summer.

I'd like to try it with Far Cry, which I'm currently playing. Rig#1 with the 9700 Pro at 9800 Pro speeds only rates a "medium" using the autosetup utility. Most of the graphics options also autoset at medium, but the game looks very good anyway. With 512MB of RAM, the game still stutters occasionally. Another 512MB stick arrives tomorrow, then the real fun should begin. 1GB had better be enough for games released this year. If you like intelligent shooters, this is a must-have game.

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

Post by wumpus » Wed Apr 14, 2004 5:55 pm

nutball wrote:FarCry is clearly the odd-one-out in all of the tests I've seen. There must be something screwy going on ... somewhere.
Oh, THAT'S what you meant. Yeah. The GeForce 5xxx definitely had problems with this game due to its weak pixel shaders/precision; however, the 6xxx series has shader/precision power out the wazoo, so I'm guessing that in this case the devs have defaulted to some reduced-precision mode on all nVidia FX-series hardware. Makes sense, since the game was released a few weeks ago, prior to the GeForce 6xxx launch.

I feel very confident this can be resolved with a patch.

apocalypse80
Posts: 90
Joined: Mon Feb 09, 2004 4:16 pm
Location: Greece

Post by apocalypse80 » Wed Apr 14, 2004 6:16 pm

Must......resist......the......urge......to.......upgrade..........

For all that power, it might as well have 12 connectors :D .
Nvidia really made a great card, no more FX ****.
If it wasn't for those IQ screenshots, I'd already be gathering money.
I've played FarCry, both on my 9800pro and on a friend's 5950U.
The difference in image quality was not small at all; it looked incredibly better on the ATI (not to mention the fps difference: 9800pro fps = 1.5x 5950U fps).
However, the NV40 is almost on par with ATI in IQ.
And with all that power, they might as well take away the "optimisations".

No way will this be cooled passively (except perhaps with a Prescott HS...), but I think a waterblock would do the trick :wink:.
As for the consumption, I bet they were thinking about no-name PSUs when they came up with it (I'd rather have a good brand 350W than a no-name 600W).

Cyberpukish
Posts: 59
Joined: Sun Jan 18, 2004 5:35 am

Post by Cyberpukish » Wed Apr 14, 2004 8:38 pm

1911user wrote:I just read the 6800 review at Anandtech. The fact that it requires 2 molex connectors says a lot. Using just 1 power connector would have exceeded the ATX spec :shock: My 9700 Pro almost seems wimpy by comparison, but I'm not paying $500 for a new video card and installing a 500W PSU. The bar has certainly been raised for video card performance. If they truly price the slightly degraded version at $300 as planned, 9800 prices are coming down this summer.

I'd like to try it with Far Cry, which I'm currently playing. Rig#1 with the 9700 Pro at 9800 Pro speeds only rates a "medium" using the autosetup utility. Most of the graphics options also autoset at medium, but the game looks very good anyway. With 512MB of RAM, the game still stutters occasionally. Another 512MB stick arrives tomorrow, then the real fun should begin. 1GB had better be enough for games released this year. If you like intelligent shooters, this is a must-have game.
You should check this out: http://www.xbitlabs.com/articles/video/ ... 40_38.html Even the mighty 6800 Ultra couldn't handle this game... Are sim games both CPU and GPU intensive?

1911user
Posts: 109
Joined: Sat Mar 13, 2004 12:08 pm
Location: Oklahoma

Post by 1911user » Wed Apr 14, 2004 9:15 pm

Cyberpukish wrote:
You should check this out: http://www.xbitlabs.com/articles/video/ ... 40_38.html Even the mighty 6800 Ultra couldn't handle this game... Are sim games both CPU and GPU intensive?
Wow, you have to go down to 800x600 for a decent framerate with the current top video cards :shock: And I thought Far Cry was a system killer.

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

Post by wumpus » Wed Apr 14, 2004 9:19 pm

Well, if a game is 100% CPU-limited, you could put in an infinitely fast video card and it really wouldn't make any difference. With computers, you're always playing a game of "find the bottleneck".
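
(That point can be put in one line of arithmetic: in a simplified model where CPU and GPU work on frames in parallel, the slower stage sets the pace, so speeding up the stage that isn't the bottleneck buys nothing. A toy sketch with made-up per-frame timings:)

Code:
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Toy model: CPU and GPU work overlap, so the slower stage sets the framerate."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 40.0   # assumed CPU cost per frame (simulation, AI, draw calls) -> 25 fps cap
print(fps(cpu_ms, gpu_ms_per_frame=30.0))    # 25.0 -> the GPU is not the limit
print(fps(cpu_ms, gpu_ms_per_frame=0.001))   # still 25.0 with an "infinitely fast" card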

1911user
Posts: 109
Joined: Sat Mar 13, 2004 12:08 pm
Location: Oklahoma

Post by 1911user » Wed Apr 14, 2004 10:11 pm

Since the framerates did vary by at least 40% among the cards, I'm not convinced they are CPU limited. I can believe the CPUs are heavily loaded and that might somehow affect video card performance. Or the game could just be using enough advanced DirectX 9.0 effects to heavily load even the newest, best video cards. It's just that at 1024x768 and higher, the framerates were below 28 FPS without AA/AF and below 21 FPS with 4x/8x AA/AF enabled.

apocalypse80
Posts: 90
Joined: Mon Feb 09, 2004 4:16 pm
Location: Greece

Post by apocalypse80 » Wed Apr 14, 2004 10:55 pm

Did you think the reference cooler was a monster?

Just take a look at this Leadtek 6800U

[image: Leadtek 6800 Ultra cooler]

HUGE and copper, can anyone say SLK900U?

Cyberpukish
Posts: 59
Joined: Sun Jan 18, 2004 5:35 am

Post by Cyberpukish » Thu Apr 15, 2004 12:05 am

That heatsink is very big and tall! Instead of expanding horizontally, it seems they went for height instead. The fins are really packed together. Hmm, I wonder if the Zalman heatsink could cool it sufficiently without a fan... I doubt it. They've made it a sort of closed-system cooler with the plastic shroud. I suspect the noise will be higher than with normal coolers since the air velocity is higher. What do you think?

Sardaan
Posts: 42
Joined: Mon Apr 12, 2004 6:12 pm

Post by Sardaan » Thu Apr 15, 2004 6:40 am

This card gives me an even better excuse to water cool my next system. I can't wait to see ATI's announcement in a few weeks.

DGK
Posts: 70
Joined: Sun May 18, 2003 3:59 pm

Post by DGK » Thu Apr 15, 2004 6:51 pm

I think I will be keeping my 9700 Pro until it gets to the low end of the performance scale. I thought the 9700 Pro was a bit much, and the only reason I finally got it was that I got a VGA Silencer for it, which helped keep the noise/heat down. You know video cards are starting to use too much power when they use more electricity than your fridge :lol:

grandpa_boris
Posts: 255
Joined: Thu Jun 05, 2003 9:45 am
Location: CA

Post by grandpa_boris » Fri Apr 16, 2004 10:29 am

apocalypse80 wrote:Did you think the reference cooler was a monster?

Just take a look at this Leadtek 6800U
OK, I don't get it. The 6800 cards are already eating up two slots. Why not vent them to the outside? Why pump all that hot air back into the case, where it will only cause us grief? What I'm noticing most on the Leadtek card is that the fan is smaller than the one on the reference card. Smaller fan == higher RPMs == a lot more noise.

Makes me want to start looking into massive watercooling solutions...
