Fanless heatsink for nVidia 6800 series?

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

Inexplicable
Posts: 226
Joined: Sat Sep 06, 2003 5:59 am
Location: Finland

Post by Inexplicable » Thu Jul 22, 2004 10:36 am

JazzJackRabbit wrote:BTW when is the Aerocool review coming out?
Indeed. I just went and ordered myself one, so it had better be good. Otherwise I'm going to blame Rusty and his tardy review. 8)

shathal
Posts: 1083
Joined: Wed Apr 14, 2004 11:36 am
Location: Reading, UK

Post by shathal » Thu Jul 22, 2004 12:35 pm

Haven't seen an "out of the box" quiet variant of the NVIDIA cards, though.

For the X800 you've obviously got HIS IceQ II's - and Sapphire intend to do a "Toxic" range which is also supposed to be quiet.

I should be getting my X800XT PE IceQ II in a few days, so I can confirm (or revoke) that statement soon hopefully... :)

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ
Contact:

Post by Rusty075 » Thu Jul 22, 2004 12:48 pm

There's nothing wrong with tweaked drivers... but I would hesitate to make too much of the Doom3 benchmarks just yet: it's an unreleased game, with a new engine that Nvidia has been involved in developing from the very beginning. Of course both it and their drivers are going to be optimized for each other. ATI hasn't even had access to the engine's code yet; who knows, their next driver could completely change the performance landscape.

Single benchmarks are not the way to decide which card to buy, unless that is the only game you plan on playing. :lol:


Right now the 6800GT is probably the best bang for the buck in terms of performance, but the jury is still out on what the quiet cooling options will be for it.



As for the Aerocool...

Shoulda asked me first... it causes all cards to burst into flames spontaneously upon installation.


(just kidding, mostly)


It's in the backlog of reviews to be edited and then queued for publishing. (I'm not actually the writer of the review in question, I'm only the editor for this one....but the backlog is all my fault :roll: )

shathal
Posts: 1083
Joined: Wed Apr 14, 2004 11:36 am
Location: Reading, UK

Post by shathal » Thu Jul 22, 2004 1:06 pm

Oh, there are reviews on the IceQ II.

Here: (that was the first one I ran across a few weeks ago)
http://www.hardavenue.com/reviews/hisx8001.shtml


Here:
http://www.guru3d.com/article/Videocards/141/1/

THIS is an interesting one:
http://www.3dgameman.com/vr/his/x800pro ... eview.html

It's a video review. While highly patronizing and mostly useless, it does have ONE really nifty bit: you get to HEAR the fan on the card. And I know now that I'll not hear it over my system :).

So - should be a DEFINITE killer :D.

shathal
Posts: 1083
Joined: Wed Apr 14, 2004 11:36 am
Location: Reading, UK

Post by shathal » Thu Jul 22, 2004 1:08 pm

There's a review Here:
http://www.hardavenue.com/reviews/hisx8001.shtml

<First review I found a few weeks ago>

And another one here:
http://www.guru3d.com/article/Videocards/141/1/

There's a video review here:
http://www.3dgameman.com/vr/his/x800pro ... ew_02.html

While it's quite a crap review (and quite patronising), it does have ONE cool bit in it. You get to HEAR the fan (it being a video review)... and I know already that this card will NOT be loud in my system ;).

Definite SPCR material, by the look of things. :D

shathal
Posts: 1083
Joined: Wed Apr 14, 2004 11:36 am
Location: Reading, UK

Post by shathal » Thu Jul 22, 2004 1:09 pm

(Sorry for the double-post - SPCR barfed up on me when I posted first time, and I didn't think the post went through...)

Inexplicable
Posts: 226
Joined: Sat Sep 06, 2003 5:59 am
Location: Finland

Post by Inexplicable » Thu Jul 22, 2004 1:30 pm

Rusty075 wrote:As for the Aerocool...

Shoulda asked me first... it causes all cards to burst into flames spontaneously upon installation.


(just kidding, mostly)
Mostly!?! :?

These guys at http://twinsbyte.de at least seem to like it (babelfish translation).

azmo
Posts: 16
Joined: Mon Jun 07, 2004 1:21 am

Post by azmo » Sun Jul 25, 2004 5:25 am

JazzJackRabbit wrote: ... And you should consider that 6800 series is already becoming widely available while ATI still struggles to put any of its new cards on the market. ...
I'm finding the opposite is true here. I called around and found the 6800s backordered until at least the beginning of August, some until the middle of August, whereas the X800 Pro has been available for at least a month, and the X800 XT for half that.

Straker
Posts: 657
Joined: Fri Jul 23, 2004 11:10 pm
Location: AB, Canada
Contact:

Post by Straker » Sun Jul 25, 2004 6:36 am

azmo wrote:
JazzJackRabbit wrote: ... And you should consider that 6800 series is already becoming widely available while ATI still struggles to put any of its new cards on the market. ...
I'm finding the opposite is true here. I called around and found the 6800s backordered until at least the beginning of August, some until the middle of August, whereas the X800 Pro has been available for at least a month, and the X800 XT for half that.
Yes, JR had it completely backwards, not sure if he really meant what he said or what. :?

availability on the X800XT PE (remember, the "normal" XT will be PCI-E only... but that's just semantics and $50) is still sorta iffy, but the Pro has been readily available for two months now; considering how long it took to find the 6800U in anything more than homeopathic doses, I don't even wanna guess on the 6800GT.
Bottom line is you will get a few more fps with a 6800U (and possibly GT) than with an X800 Pro, as long as you use ghetto IQ settings on both, but no one does that any more. With these cards' fillrates, all that not using 8x aniso (at a minimum, unless you're using an FX53 or P4EE) is doing is letting them stand around mocking your CPU's lack of sexual prowess while waiting for something to do. Aside from that, nVidia has closed the image quality gap admirably, but IMHO their AF/FSAA could still use some work in terms of both quality and performance hit.

Also consider that the X800 XT PE actually being available is finally starting to push Pro prices down where they *should* be; no idea how long this will take with the 6800. If I didn't pay CAN$600 last month for an x800 pro I'd probably find it a lot more interesting how their prices are based purely on supply and demand, and are also like the only consumer product selling above MSRP. :P

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

Post by wumpus » Sun Jul 25, 2004 10:45 am

as long as you use ghetto IQ settings on both, but noone does that any more. With these cards' fillrates, all that not using 8x aniso
I don't agree with this. I haven't yet purchased any card from the next-gen stable (X800, 6800, etc.) but my 9800 Pro struggles to keep up at 1600x1200 in a lot of games -- e.g., Warcraft 3 -- with AA or aniso enabled. The lowered FPS is very noticeable, particularly during periods of heavy special effects (translucency, etc.).

Personally, with cards that powerful, I'd rather run 1920x1440 or 2048x1536, both resolutions that many 21" monitors support. I find that more real pixels always looks better than the "synthetic" additional resolution of antialiasing. And anisotropic filtering, while nice @ 2x, rapidly loses any visible benefit beyond that level.

So I run with "ghetto IQ" settings, but at much higher resolution. I call it "better IQ" :D

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Sun Jul 25, 2004 1:39 pm

Sure, many 21", and even some 19", monitors support those kinds of resolutions, but not at very good refresh rates. And a lot of monitors out there are garbage that max out at 1280x1024 @ 60 Hz, though I don't mean to defend people foolish enough to get those. And then with LCD monitors you can't even change the resolution. And I'm sure lots of people expect to be able to use the maximum resolution their monitor supports with 8x AA and 16x AF, or however high those things I can't pronounce go.

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

Post by wumpus » Sun Jul 25, 2004 1:42 pm

Actually, with the 9800 Pro, I get unacceptable slowdown @ 1600x1200 in warcraft 3 even with AA and aniso completely disabled-- that is, it would dip into the low 20's at times. I had to back down to 1280x960.

My hope is with a 6800 or x800, I could run 1600x1200 or 1920x1440 at a nearly constant 50+fps. That is just not possible right now with the 9800 pro/xt...

cliche
Posts: 150
Joined: Sun Apr 20, 2003 9:30 pm
Location: uk

Post by cliche » Sun Jul 25, 2004 1:57 pm

good points there mathias and wumpus.
Will ponder them on my next card choice (I have a 21" trinitron btw)

Straker
Posts: 657
Joined: Fri Jul 23, 2004 11:10 pm
Location: AB, Canada
Contact:

Post by Straker » Sun Jul 25, 2004 4:16 pm

wumpus wrote:but my 9800 Pro struggles to keep up at 1600x1200 in a lot of games-- fex, Warcraft 3-- with AA or aniso enabled.
"these cards" = X800/6800 :P
Sure, it's only one generation different, but an X800XT PE is more than double a 9800 Pro (and before someone says it, not just in price). Yes, more than 2 or 4x aniso is sometimes a waste (but still looks great and even 8x carries VERY little penalty), same with FSAA. The only point I was making was that really, the only way the 6800 is "better" is if you consistently play anything without AF/AA (and possibly other iq settings), and that I doubt the fill rate of these cards will be seriously challenged by many combinations of games/CPUs before they've already been superseded. Not sure about nVidia but IIRC ATI is planning on releasing a new core in ~6 months already.
And yes, of course I completely agree that extra real res is better than FSAA. For the rest of us with monitors worth less than the whole PC, there's AA. :( Also remember that AA is a lot more efficient than simply increasing res; even going from 1024x768 to 1600x1200 means filling nearly two and a half times as many pixels. I seriously doubt that aniso is making or breaking WC3 on what's still a really good card; WC3 ran great for me when I had a GF4 4400, and 8x aniso was the one thing I never turned off (except in Deus Ex 2, shudder). Most 3D RTSes seem to bog down every now and then, and almost all seem to usually be CPU limited, but at 1600x1200 it's hard to say.
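The fill-cost arithmetic above is easy to sanity-check; a quick sketch in Python (the resolutions are just the ones discussed in this thread, nothing else is assumed):

```python
# Pixel counts for the resolutions discussed in this thread.
resolutions = {
    "1024x768": 1024 * 768,
    "1280x960": 1280 * 960,
    "1600x1200": 1600 * 1200,
    "1920x1440": 1920 * 1440,
    "2048x1536": 2048 * 1536,
}

# How much more fill work 1600x1200 costs compared to 1024x768:
ratio = resolutions["1600x1200"] / resolutions["1024x768"]
print(f"1600x1200 pushes {ratio:.2f}x the pixels of 1024x768")  # 2.44x
```

So a jump from 1024x768 to 1600x1200 is closer to 2.5x the pixels than 2x, which is why a big resolution bump tends to hurt frame rates more than turning on AF does.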

sorry for length, just woke up and it's hard being concise; long and short of it is that (especially with an ATI card) even max AA/AF will be easier on your card than a noteworthy res increase. more res is always awesome, but not only will going without aniso not make enough of a difference to let anyone increase res, it'll look like crap (relatively speaking... enough so to defeat the purpose of increasing res at least).

also sorry for the hijack, but i'm a lot more familiar with video than silence. :)

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

Post by wumpus » Sun Jul 25, 2004 8:14 pm

Well, as for FPS in Warcraft 3 here's a recent review on a P4 3.2ghz:

http://www.gamepc.com/labs/view_content ... 700&page=7

As you can see, barely 50fps @ 1024x768 with no AA/aniso... so you can imagine what happens when you jack it up to 1600x1200.
Bottom line is you will get a few more fps with a 6800U (and possibly GT) than with an X800 Pro, as long as you use ghetto IQ settings on both, but no one does that any more. With these cards' fillrates, all that not using 8x aniso (at a minimum, unless you're using an FX53 or P4EE) is doing is letting them stand around mocking your CPU's lack of [power] while waiting for something to do
There's still a lot of need for faster cards even with vanilla settings. Even more so with DX9 (shader 2.0) games like Far Cry which are incredibly GPU dependent, much more so than WC3.
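As a rough sketch of that point: if a game really were purely GPU-limited, fps would scale inversely with pixel count. This is an idealized model, not a measurement; the 50 fps baseline is the gamepc.com figure quoted above:

```python
def gpu_limited_fps(fps_at_base, base_res, target_res):
    """Estimate fps at a new resolution under the (idealized)
    assumption that the game is purely fill-rate limited, so
    fps scales inversely with the number of pixels rendered."""
    base_pixels = base_res[0] * base_res[1]
    target_pixels = target_res[0] * target_res[1]
    return fps_at_base * base_pixels / target_pixels

# ~50 fps at 1024x768, projected up to 1600x1200:
est = gpu_limited_fps(50, (1024, 768), (1600, 1200))
print(f"estimated fps at 1600x1200: {est:.1f}")  # ~20.5
```

Interestingly, that lands right in the "low 20's" range reported for WC3 at 1600x1200 earlier in the thread, though real games mix CPU and GPU limits, so it's only a rough estimate.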

shathal
Posts: 1083
Joined: Wed Apr 14, 2004 11:36 am
Location: Reading, UK

Post by shathal » Mon Jul 26, 2004 10:28 am

You mean shader 3.0 games like Far Cry?

That being NVIDIA's "hype bunny" over ATI, what with the X800 still being on Shader 2.0 (and NVIDIA being able to do 3.0).

wumpus
Patron of SPCR
Posts: 946
Joined: Sat Sep 06, 2003 9:57 pm
Location: Berkeley, CA, USA
Contact:

Post by wumpus » Mon Jul 26, 2004 12:27 pm

Far Cry is 98% Shader 2.0 even with the latest patch; there's an article on xbit-labs on that, I believe. But yeah, the new DX9/shader games are incredibly GPU dependent, much like 3DMark03 is.

JazzJackRabbit
Posts: 1386
Joined: Fri Jun 18, 2004 6:53 pm

Post by JazzJackRabbit » Mon Jul 26, 2004 5:17 pm

Rusty075
There's nothing wrong with tweaked drivers... but I would hesitate to make too much of the Doom3 benchmarks just yet: It's an unreleased game
AFAIK the game has already gone gold, so there will be no major changes, if any, to the game engine.

game, with a new engine that Nvidia has been involved in the development of from the very beginning. Of course both it and their drivers are going to be optimized for each other. Ati hasn't even had access to the engine's code yet, who knows, their next driver could completely change the performance landscape
True, in theory. In practice, though, I don't think so. 12 vs. 16 pipelines isn't a fair fight; no matter how much you optimize the drivers, the X800 Pro will always be at a disadvantage. Besides, the 6800GT is still faster in the majority of games...

Single benchmarks are not the way to decide which card to buy, unless that is the only game you plan on playing.
I would be careful if I were you. Just think of how many games use the Q2/Q3 engine, and I strongly suspect there will be lots of games in the future that'll use the Doom3 engine, which means the 6800 series will have a huge advantage in those games.

Straker & azmo
I meant what I said. The 6800 Ultra is extremely rare, but frankly it's the top of the line card - expensive, too hot, and requires two power connectors - all of that is a major turn-off for me. As for the 6800GT/standard, I can find them almost anywhere, from my local merchants Fry's/CompUSA to various online stores. Of course the price bites most of the time, but they are available. In the meantime the X800 is still hard to come by; I'm not saying you can't buy them, but they are still very rare. Come on, even the ATI webstore doesn't have them in stock, and do you remember all the hype when Gateway and CDW got 200ish X800XTs - and how all of those cards were sold in a matter of hours? No, I still say ATI has major supply issues.

shathal
Posts: 1083
Joined: Wed Apr 14, 2004 11:36 am
Location: Reading, UK

Post by shathal » Tue Jul 27, 2004 3:12 am

BOTH of them are culprits. ATI *AND* NVIDIA.

It used to be AMD and Intel who were the primary culprits in paper launches. Seems that this has now migrated over to the GFX business...

Hmpf. :(

Compddd
Posts: 276
Joined: Tue Oct 21, 2003 6:06 pm
Location: California

Post by Compddd » Tue Jul 27, 2004 9:16 am

Does anyone else find this disturbing about the new Arctic Cooling Silencers?

2. How loud are they going to be? Will they have a silent mode just like the Arctic Silencer Rev 2/3?

2. No, there will be just one mode, since a lot of VGA cards already have thermal control onboard. It will be 2 to 3 times quieter than the stock cooler.

WHAT'S THE FRIGGIN POINT? The whole thing about the Silencers is being able to force them to run in silent mode. This 6800GT is already friggin loud because its thermal control chip is garbage and runs the fan at high speed even in an AC-chilled room while surfing the net in 2D mode. So if I put an NV4 Arctic Silencer on it and it uses the onboard nvidia control chip, it's gonna run in high mode still. Blech, this news disgusts me.

shathal
Posts: 1083
Joined: Wed Apr 14, 2004 11:36 am
Location: Reading, UK

Post by shathal » Tue Jul 27, 2004 1:09 pm

Well - I suppose the manufacturers see it like this.

"Better a bit louder, than a bit of fried hardware" (and thus, RMA)?

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto
Contact:

Post by mathias » Tue Jul 27, 2004 8:05 pm

Compddd wrote:Does anyone else find this disturbing about the new Arctic Cooling Silencers?

2. How loud are they going to be? Will they have a silent mode just like the Arctic Silencer Rev 2/3?

2. No, there will be just one mode, since a lot of VGA cards already have thermal control onboard. It will be 2 to 3 times quieter than the stock cooler.

WHAT'S THE FRIGGIN POINT? The whole thing about the Silencers is being able to force them to run in silent mode. This 6800GT is already friggin loud because its thermal control chip is garbage and runs the fan at high speed even in an AC-chilled room while surfing the net in 2D mode. So if I put an NV4 Arctic Silencer on it and it uses the onboard nvidia control chip, it's gonna run in high mode still. Blech, this news disgusts me.
Do you think you'll get one and add a resistor?

Post Reply