
New nVidia GeForce 6800 -- DUAL power connectors

Posted: Mon Apr 12, 2004 7:39 am
by wumpus
Carrying this over from another thread-- it deserves its own topic. nVidia is about to release its newest video card, the GeForce 6800 "ultra":

http://www.theinquirer.net/?article=15269

http://www.nvidiot.com/nv40.jpg

Did anyone else notice this card has TWO molex power connectors on the end? There is talk of nVidia "requiring" a 480w power supply, which I think is crap, but I could see this pushing us over 300w in real-world gaming situations pretty easily.
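
For what it's worth, here's the back-of-the-envelope math I'm doing (every per-component wattage below is my guess or a typical published figure, NOT a measurement):

    # rough full-load power budget for a high-end rig with this card
    # all figures are guesses / typical published numbers, not measurements
    components_w = {
        "P4 3.2 CPU": 90,
        "GeForce 6800 Ultra (rumored)": 110,
        "motherboard + chipset": 30,
        "1GB DDR": 20,
        "two hard drives": 25,
        "optical drive + fans + misc": 25,
    }
    print(sum(components_w.values()), "W DC at full load")  # 300 W -- zero headroom on a 300w PSU

That's Python, but you get the idea: it doesn't take a 480w PSU, it takes a PSU that can honestly deliver ~300w.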

Posted: Mon Apr 12, 2004 3:22 pm
by DG
I don't know about the 2 molex, but I know that's a BIG cooler!!! It's gonna be a very hot board, I think; at least the memory chips will be... Very hard to silence it. :(

Posted: Mon Apr 12, 2004 5:04 pm
by PretzelB
The dual power connectors concern me. There have been many debates about how much power today's systems actually need, but this new video card appears to change things a bit. I'd hate to get one of the ARM systems with a 300W PSU, or one of the PSUs recommended on this site, and not be able to upgrade when the new video cards come out.

Posted: Mon Apr 12, 2004 6:57 pm
by Rusty075
The stat that jumped out at me: transistor count. The 6800 is going to have 220 million transistors. By comparison, a P4 Northwood has 54 million, and the Prescotts have 125 million.

But I don't understand the 2 molex thing. Most people are just going to plug them both into plugs on the same string. Even if they don't, in (nearly) every PSU out there all the molexes come off the same line anyway. Are they going to require a revision to the ATX spec and call for a dedicated VGA line?

And ATI has announced that its X800 XT will be available with 512MB of GDDR3.

Posted: Mon Apr 12, 2004 11:27 pm
by nutball
Rusty075 wrote:But I don't understand the 2 molex thing. Most people are just going to plug them both into plugs on the same string. Even if they don't, in (nearly) every PSU out there all the molexes come off the same line anyway. Are they going to require a revision to the ATX spec and call for a dedicated VGA line?
At the rate the power consumption of these damned graphics cards is increasing, I'd be more worried about having to have my house rewired and my own electricity sub-station! :)

It certainly calls the focus of the BTX spec (cooling power-greedy CPUs) into question, IMO.

Posted: Tue Apr 13, 2004 9:00 am
by 1398342003
This kind of thing is bad, especially since the case air is already heated by the CPU (as per the BTX spec).

Posted: Tue Apr 13, 2004 1:05 pm
by Rusty075
The PCI-X spec also paves the way for higher-wattage VGAs. In the AGP spec they can only pull 25 watts from the slot before they have to go to accessory power plugs... with PCI-X the limit is 60 watts.
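
Rough numbers on what that difference buys you (the per-plug capacity is my assumption -- call it ~5A on the 12v pin plus ~5A on the 5v pin, so roughly 80W per molex):

    import math

    def aux_plugs_needed(card_watts, slot_watts, watts_per_plug=80):
        # watts_per_plug is an assumption: ~5A at 12v + ~5A at 5v per molex
        extra = max(0, card_watts - slot_watts)
        return math.ceil(extra / watts_per_plug)

    print(aux_plugs_needed(110, 25))  # hypothetical 110W card on AGP: 2 plugs
    print(aux_plugs_needed(110, 60))  # same card with 60W from the slot: 1 plug

So the same hypothetical 110W card needs two plugs on AGP but only one on the new slot.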

Posted: Tue Apr 13, 2004 2:45 pm
by 1398342003
^^^ :(

But those will only have 3 molex connectors.

Posted: Wed Apr 14, 2004 3:53 am
by Jan Kivar
Rusty075 wrote:The PCI-X spec also paves the way for higher-wattage VGAs. In the AGP spec they can only pull 25 watts from the slot before they have to go to accessory power plugs... with PCI-X the limit is 60 watts.
PCI-E, not PCI-X... :D 8)

Cheers,

Jan

Posted: Wed Apr 14, 2004 4:48 am
by Fabool
http://www.warp2search.net/modules.php? ... &sid=17433

Links to reviews.
I don't think there's any chance of cooling this thing passively.

Interesting to see what ATi has up its sleeve.

Re: New nVidia GeForce 6800 -- DUAL power connectors

Posted: Wed Apr 14, 2004 5:40 am
by sneaker
wumpus wrote:Carrying this over from another thread-- it deserves its own topic.
Yeah, in the Hot & Noisy VGA forum ;)

The most interesting thing to come out of this card is the set of power consumption measurements performed by Tom's Hardware:

http://www.tomshardware.com/graphic/200 ... 00-19.html

For example, a Radeon 9800XT demands 33W more from the mains than a 9600XT when doing next to nothing in Windows. Power consumption tests for various PC components (CPUs and video cards in particular) in various system states are something we need to see more of.
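
One caveat when reading those numbers: Tom's measures at the wall socket, so PSU efficiency inflates everything. A rough conversion (the 70% efficiency figure is my assumption for a typical PSU):

    def dc_watts(mains_watts, efficiency=0.70):
        # wall-socket (AC) reading -> actual DC load; 70% efficiency is a guess
        return mains_watts * efficiency

    print(round(dc_watts(33)), "W")  # the 33W idle gap is really ~23W of extra DC draw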

Posted: Wed Apr 14, 2004 5:50 am
by Gekkani
I don't understand Tom's Hardware... According to their test, a P4 3.2 & GeForce 6800 system demands 288W at full load.

But at the end of the article we can read:


"To those of you ready to camp out in front of your store to get your hands on one of these cards - be warned! The first barrier on your way to graphics ecstasy is the power requirement calling for a PSU with at least 480Watts."

:?:
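
My only guess at how they square those two numbers -- and the derating factors here are pure speculation on my part:

    # trying to reconcile 288W measured at the wall vs. a 480W PSU recommendation
    measured_wall_w = 288  # P4 3.2 + GeForce 6800 at full load, per THG
    headroom = 1.2         # margin for load spikes / capacitor aging (my guess)
    noname_derate = 0.72   # fraction of its label a cheap PSU really delivers (my guess)

    print(round(measured_wall_w * headroom / noname_derate), "W")  # 480W

If they assume a no-name PSU that only delivers ~72% of its label, plus some headroom, 480W falls right out. Still seems like overkill for a quality unit.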

New photos

Posted: Wed Apr 14, 2004 6:48 am
by Cyberpukish
http://www.theinquirer.net/?article=15325
More pictures are out! That heatsink looks big.

Posted: Wed Apr 14, 2004 6:56 am
by Cyberpukish
Hmmm, looking at the screenshots, I must say the quality of the nVidia rendering is really poor. Check out the pictures where they compare the 6800 Ultra, 5950 Ultra, and the 9800XT. After looking at all the screenshots and straining my eyes (not to mention all the clicking back and forward on the browser buttons), ATi still renders and filters better than the nVidia card. What are your opinions?

Posted: Wed Apr 14, 2004 8:07 am
by wumpus
I don't agree, and even if I did, I doubt that kind of "back and forth 5 times until I convince myself there's a difference between these two images" difference would ever be discernible in actual gameplay.

I must admit the performance is very impressive. The card literally is 2x as fast as the current 9800XT in games that use pixel shaders extensively -- Halo, for example. And it's at least 1.5x faster in a lot of other games, which is not exactly chopped liver either.

I sense an upgrade on my horizon. At least the power profile won't be that much worse than my current 9800 pro (slightly overclocked).


Posted: Wed Apr 14, 2004 8:18 am
by Cyberpukish
I agree that when playing a game we tend not to notice the quality; however, looking at the Far Cry screenshots, you can see obvious differences. I didn't even have to try. I hope it's the drivers' fault, not the card's, as suggested by Tom's Hardware. I may be thinking of getting the 9800 Pro instead of the 5900XT, since the quality difference is quite drastic... For my next upgrade I will wait for ATi's solution to come out and fight Nvidia's, and hopefully a price war will follow ;))

Posted: Wed Apr 14, 2004 8:29 am
by nutball
Cyberpukish wrote:I agree that when playing a game we tend not to notice the quality; however, looking at the Far Cry screenshots, you can see obvious differences. I didn't even have to try. I hope it's the drivers' fault, not the card's, as suggested by Tom's Hardware. I may be thinking of getting the 9800 Pro instead of the 5900XT, since the quality difference is quite drastic... For my next upgrade I will wait for ATi's solution to come out and fight Nvidia's, and hopefully a price war will follow ;))
FarCry is clearly the odd-one-out in all of the tests I've seen. There must be something screwy going on ... somewhere.

With the other ones -- well, to be honest I've never really seen the point of comparing AF/AA settings in fast-action FPS games using static screenshots (unless you want to sit and admire your beautifully rendered giblets, that is). The differences are more evident in-game for slower-paced games, and no static screenshot really tells the story, IMO.

Posted: Wed Apr 14, 2004 9:14 am
by frosty
I think this will go over well with hardcore gamers as usual; they won't worry about heat/noise. Not me. I am content to play at lower res with no sound -- no sound helps framerates and lets me hear, in real time, if someone is breaking into my home to kill me.

Fabool, me too, waiting for the ATI comeback, I love the video card wars!

Posted: Wed Apr 14, 2004 5:24 pm
by 1911user
I just read the 6800 review at Anandtech. The fact that it requires 2 molex connectors says a lot. Using just 1 power connector would have exceeded the ATX spec :shock: My 9700 Pro almost seems wimpy by comparison, but I'm not paying $500 for a new video card and installing a 500W PSU. The bar has certainly been raised for video card performance. If they truly price the slightly cut-down version at $300 as planned, 9800 prices are coming down this summer.

I'd like to try it with Far Cry, which I'm currently playing. Rig #1, with the 9700 Pro at 9800 Pro speeds, only rates a "medium" using the autosetup utility. Most of the graphics options also autoset at medium, but the game looks very good anyway. With 512MB of RAM, the game still stutters occasionally. Another 512MB stick arrives tomorrow, then the real fun should begin. 1GB had better be enough for games released this year. If you like intelligent shooters, this is a must-have game.

Posted: Wed Apr 14, 2004 5:55 pm
by wumpus
nutball wrote:FarCry is clearly the odd-one-out in all of the tests I've seen. There must be something screwy going on ... somewhere.
Oh, THAT'S what you meant. Yeah. The GeForce 5xxx definitely had problems with this game due to its weak pixel shaders/precision; the 6xxx series, however, has shader/precision power out the wazoo. So I'm guessing that in this case the devs have defaulted to some reduced-precision mode on all nVidia FX series hardware. Makes sense, since the game was released a few weeks ago, prior to the GeForce 6xxx launch.

I feel very confident this can be resolved with a patch.

Posted: Wed Apr 14, 2004 6:16 pm
by apocalypse80
Must......resist......the......urge......to.......upgrade..........

For all that power, it might as well have 12 connectors :D .
Nvidia really made a great card, no more FX ****.
If it wasn't for those IQ screenshots, I'd already be gathering money.
I've played Far Cry, both on my 9800 Pro and on a friend's 5950U.
The difference in image quality was not small at all; it looked incredibly better on the ATI (not to mention the fps difference: the 9800 Pro got 1.5x the 5950U's fps).
However, the NV40 is almost on par with ATI in IQ.
And with all that power, they might as well take away the "optimisations".

No way will this be cooled passively (except perhaps with a Prescott HS...), but I think a waterblock would do the trick :wink:.
As for the power consumption, I bet they were thinking about no-name PSUs when they came up with it (I'd rather have a good-brand 350W than a no-name 600W).

Posted: Wed Apr 14, 2004 8:38 pm
by Cyberpukish
1911user wrote:I just read the 6800 review at Anandtech. The fact that it requires 2 molex connectors says a lot. Using just 1 power connector would have exceeded the ATX spec :shock: My 9700 Pro almost seems wimpy by comparison, but I'm not paying $500 for a new video card and installing a 500W PSU. The bar has certainly been raised for video card performance. If they truly price the slightly cut-down version at $300 as planned, 9800 prices are coming down this summer.

I'd like to try it with Far Cry, which I'm currently playing. Rig #1, with the 9700 Pro at 9800 Pro speeds, only rates a "medium" using the autosetup utility. Most of the graphics options also autoset at medium, but the game looks very good anyway. With 512MB of RAM, the game still stutters occasionally. Another 512MB stick arrives tomorrow, then the real fun should begin. 1GB had better be enough for games released this year. If you like intelligent shooters, this is a must-have game.
You should check this out: http://www.xbitlabs.com/articles/video/ ... 40_38.html Even the mighty 6800 Ultra couldn't handle this game... Are sim games both CPU-intensive and GPU-intensive?

Posted: Wed Apr 14, 2004 9:15 pm
by 1911user
Cyberpukish wrote:
You should check this out: http://www.xbitlabs.com/articles/video/ ... 40_38.html Even the mighty 6800 Ultra couldn't handle this game... Are sim games both CPU-intensive and GPU-intensive?
Wow, you have to drop to 800x600 to get a decent framerate with the current top video cards :shock: And I thought Far Cry was a system killer.

Posted: Wed Apr 14, 2004 9:19 pm
by wumpus
Well, if a game is 100% CPU-limited, you could put in an infinitely fast video card and it really wouldn't make any difference. With computers, you're always playing a game of "find the bottleneck".
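
In code terms (the millisecond numbers are invented, just to illustrate the point):

    def fps(cpu_ms_per_frame, gpu_ms_per_frame):
        # frame rate is set by whichever side finishes last each frame
        return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

    print(fps(cpu_ms_per_frame=40, gpu_ms_per_frame=20))  # 25.0 fps, CPU-bound
    print(fps(cpu_ms_per_frame=40, gpu_ms_per_frame=5))   # still 25.0 fps -- the faster GPU bought nothing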

Posted: Wed Apr 14, 2004 10:11 pm
by 1911user
Since the framerates did vary at least 40% among the cards, I'm not convinced they are CPU-limited. I can believe the CPUs are heavily loaded, and that might somehow affect the video card performance. Or the game could just be using enough advanced DirectX 9.0 effects to heavily load even the newest, best video cards. It's just that at 1024x768 and higher, the framerates were below 28 FPS without AA/AF and below 21 FPS with 4x/8x AA/AF enabled.

Posted: Wed Apr 14, 2004 10:55 pm
by apocalypse80
Did you think the reference cooler was a monster?

Just take a look at this Leadtek 6800U

[image: Leadtek 6800 Ultra cooler]

HUGE and copper -- can anyone say SLK900U?

Posted: Thu Apr 15, 2004 12:05 am
by Cyberpukish
That heatsink is very big and tall! Instead of expanding horizontally, it seems they went for height instead. The fins are really packed together. Hmm, I wonder if the Zalman heatsink could cool it sufficiently without a fan... I doubt it. They made it a sort of closed-system cooler with the plastic enclosure. I suspect the noise will be higher than normal coolers since the air velocity is higher. What do you think?

Posted: Thu Apr 15, 2004 6:40 am
by Sardaan
This card gives me an even better excuse to water cool my next system. I can't wait to see ATI's announcement in a few weeks.

Posted: Thu Apr 15, 2004 6:51 pm
by DGK
I think I will be keeping my 9700 Pro until it gets to the low end of the performance scale. I thought the 9700 Pro was a bit much, and the only reason I finally got it was that I got a VGA Silencer for it, which helped keep the noise/heat down. You know vid cards are starting to use too much power when they use more electricity than your fridge :lol:

Posted: Fri Apr 16, 2004 10:29 am
by grandpa_boris
apocalypse80 wrote:Did you think the reference cooler was a monster?

Just take a look at this Leadtek 6800U
OK, i don't get it. the 6800 cards are already eating up 2 slots. why not vent them to the outside? why pump all that hot air back into the case, where it will only cause us grief? what i'm noticing most on the leadtek card is that its fan is smaller than the reference card's. smaller fan == higher RPMs == a lot more noise.
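
rough fan-law math on that (the cube exponent is the standard fan affinity law for geometrically similar fans; the baseline numbers are invented):

    def rpm_for_same_cfm(base_rpm, base_dia_mm, new_dia_mm):
        # airflow scales with rpm * diameter^3 for similar fan designs,
        # so matching CFM with a smaller fan means spinning it much faster
        return base_rpm * (base_dia_mm / new_dia_mm) ** 3

    print(round(rpm_for_same_cfm(2500, 70, 60)), "rpm")  # ~3970 rpm to match a 70mm at 2500

and noise climbs steeply with rpm, so that's a lot more whine.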

makes me want to start looking into massive watercooling solutions...