ATI's answer to NVidia 8800: R600

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

spookmineer
Patron of SPCR
Posts: 749
Joined: Sat Nov 11, 2006 6:02 pm

ATI's answer to NVidia 8800: R600

Post by spookmineer » Fri Feb 09, 2007 6:26 pm

ATi R600XTX/XT/XL Series Unveiled
The new-generation Radeon XTX is a true monster: system builders will get the 31.5 cm long card, with a power consumption of no less than 270 W.
The retail cards measure 24.1 cm and "only" need 240 W.
It's not cool, man, not cool... :shock:

Look at where the PCI-E connector is relative to the card's length:

Image

It's huge. Too huge. As is power consumption...

BillyBuerger
Patron of SPCR
Posts: 855
Joined: Fri Dec 27, 2002 1:49 pm
Location: Somerset, WI - USA
Contact:

Post by BillyBuerger » Fri Feb 09, 2007 7:03 pm

Oh wow, a full length expansion card. Can't say I've seen one of those in years. I can remember a PC a friend had back in about '93 with a sound card that was full length. Otherwise I saw a dual channel SCSI card from an old server that was in the trash pile at work around '00. Considering most cases put the hard drives in the location where full length expansion cards would go, it wouldn't fit in many cases.

I guess they just couldn't let nVidia beat them. Even on something as little as who has the biggest card.

merlin
Friend of SPCR
Posts: 717
Joined: Mon Jun 13, 2005 6:48 am
Location: San Francisco, CA

Re: ATI's answer to NVidia 8800: R600

Post by merlin » Fri Feb 09, 2007 8:15 pm

Hope you guys realize that this is the OEM-only version. The retail version is only 9.5" long. Not that I'd want it anyway... 230 watts = not good for silence...

Shadowknight
Posts: 1283
Joined: Thu Aug 07, 2003 2:43 pm
Location: Charlotte, NC, USA

Post by Shadowknight » Fri Feb 09, 2007 10:23 pm

I remember the occasional bitching about cases that can't take full length video cards, even though no one has made them in years. Lots of people who buy Antec cases are going to be pissed when they find out this can't fit...

JaRoD
Posts: 53
Joined: Sat Dec 16, 2006 11:18 am

Post by JaRoD » Sat Feb 10, 2007 4:41 am

I read somewhere that the fan uses 24 W of power; no way in hell it could make less noise than a vacuum cleaner :shock:

BillyBuerger
Patron of SPCR
Posts: 855
Joined: Fri Dec 27, 2002 1:49 pm
Location: Somerset, WI - USA
Contact:

Post by BillyBuerger » Sat Feb 10, 2007 8:36 am

I read the article after posting. So yes, that's the OEM version. In which case the system builder would obviously be using a case that supports full length cards. Still, it's interesting to see that.

Although the fan on that thing will be way up towards the front of the case. Meaning it'll get fresh cool air without needing a stupid side vent on the case. So that's kind of a nice idea. But I agree, way too much of a power hog.

merlin
Friend of SPCR
Posts: 717
Joined: Mon Jun 13, 2005 6:48 am
Location: San Francisco, CA

Post by merlin » Sun Feb 11, 2007 12:15 pm

BillyBuerger wrote:I read the article after posting. So yes, that's the OEM version. In which case the system builder would obviously be using a case that supports full length cards. Still, it's interesting to see that.

Although the fan on that thing will be way up towards the front of the case. Meaning it'll get fresh cool air without needing a stupid side vent on the case. So that's kind of a nice idea. But I agree, way too much of a power hog.
Personally I think I'll take an interest once the DirectX 10 Video cards hit the 65nm node and use 100 Watts or less. That's far more possible to keep quiet...especially with a 7900/8800 Style Cooler.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Fri Feb 16, 2007 8:10 am

65 nm isn't great for these chips, as yields are crappy for GPUs. CPUs are relatively stable and simple compared to these ever-changing, complex GPUs. 80 nm is the same as 90 nm without changing much tech, I hear. Of course 65 nm would be better, but only if they put other technology into it. Look at 90 nm Prescotts. Ew. Blech. 90 nm didn't help them a drop. GPU companies use the larger process when they make their expensive versions; you'll see the smaller process on the cheaper GPUs.

240 W is the draw of these, supposedly. SUPPOSEDLY. I'd say they come up with these numbers with crappy PSUs in mind. I am into higher-end graphics cards for gaming and have never found any of them to actually use the wattage they claim. I would guess 180 W on an efficient PSU with adequate amperage going to the card. Still kinda nuts though :)


I'm waiting for the All-In-Wonder version of the card. My X1900 AIW uses considerably less power than the X1900 XT: it has half the RAM and is slightly underclocked. The same chips and RAM are used, just running a little slower. The R600 is a 1 GB card; take 512 MB off it and underclock it by about 10%, and that will help it quite a bit.

Can't wait for it :) I guess I am going to have to get rid of this Phantom 350 and go buy the 620 W Corsair PSU that MikeC reviewed. :(

Sandip
Posts: 9
Joined: Fri Jan 12, 2007 7:01 pm

Post by Sandip » Fri Feb 16, 2007 8:14 pm

The future's going back to the 70's :shock: :arrow: 8)

Calculators gotta be big and noisy!!

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Sat Feb 17, 2007 4:15 pm

I have a calculator that requires a power adaptor. It has a green LED screen that glows brightly, and a black and silver keypad with a flippy switch for functions.

It rox.

Bobfantastic
Posts: 193
Joined: Sat Jul 08, 2006 10:32 am
Location: Folding in Aberdeen

Post by Bobfantastic » Sun Feb 18, 2007 4:23 am

Just saw this http://www.dailytech.com/article.aspx?newsid=6138...
X2900 XTX retail card comes as a two-slot, 9.5" design with a vapor chamber cooler
Correct me if I'm wrong, but doesn't this mean there's basically a big heatpipe with an open end directly touching the core? i.e. an end to aftermarket coolers, short of spraying chemicals all over the place?

If you can't take off the heatsink, then the only thing you can do to quieten it is change the fans, and there's only so much you can do without increasing the cooling area.

Not a good sign :(

Happy Hopping
Posts: 254
Joined: Wed Jun 08, 2005 4:38 am

Post by Happy Hopping » Sun Feb 18, 2007 5:03 am

Since nVidia is releasing the 8900 and 8900GX2, this R600 is meaningless to them.

s_xero
Posts: 154
Joined: Sun Sep 10, 2006 2:56 pm

Post by s_xero » Sun Feb 18, 2007 11:26 am

I'm getting the feeling that AMD (and ATI) is/are losing control over power requirements.

How the hell could the 7xx0 and 8xx0 series be so much less power-hungry than ATI's offerings??

I can't even see why the hell you would want a card that is only tenths of a frame faster than one which requires only 2/3 of the power (true for the 7900 series; it's an estimate though).

For the power the ATI stuff requires, you could overclock nVidia's competing card like hell, I guess.

Still hard for me to accept that AMD is losing the competition against Intel and nVidia now though...

On the other hand, and a bit off-topic: Intel fucks their CPUs up by not implementing a good 64-bit structure = very weak if you ask me...

spookmineer
Patron of SPCR
Posts: 749
Joined: Sat Nov 11, 2006 6:02 pm

Post by spookmineer » Sun Feb 18, 2007 4:17 pm

s_xero wrote:I'm getting the feeling that AMD (and ATI) is/are losing control over power-requirements.
AMD is still on track with their CPUs.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Mon Feb 19, 2007 6:52 pm

Yeah, ATI is getting this way...

BUT!!

Consider that the best PSU out now, silence-wise, is a 620 W Corsair... you might as well go for the glory, I guess.

R600 will shoot nVidia's cards in the nuts; the 8900 bump is just a bump. People always said my X1900 was comparable to this or that, but when the effects start pumping and the AA and AF are cranked, my card never drops a frame. nVidia never bothers to make a real All-In-Wonder type of card. If it did, I would consider them, but eh, they never do.

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Tue Feb 20, 2007 1:50 am

~El~Jefe~ wrote:r600 will shoot nvidia's cards in the nuts. the 8900 bump is a bump.
At the rate AMD are going getting the bloody thing out of the door, R600 will be competing with G90, not G80.

As for 8900, you know what, I think that NVIDIA probably have a better idea than you or I do what they need to bring to the table to deal with R600. So 8900 "just a bump" might say something about R600. There's something weird about R600, something doesn't feel right.

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Wed Feb 21, 2007 7:23 am

nutball wrote:At the rate AMD are going getting the bloody thing out of the door, R600 will be competing with G90, not G80.
Anyway, as I was saying...
Our sources confirm that AMD's next generation R600 GPU has been further delayed to Q2 2007 and, sadly, they didn't mention April 1st as the likely ETA. In fact, our understanding is that we're likely looking at late April as a minimum here. Other reports on the net, as of the time of writing, also claim they have official statements from AMD confirming the new Q2 release goal.
http://www.beyond3d.com/forum/showthread.php?t=38927

More interesting are the comments about the cards lower in the range, though; it sounds like they might still be on track.

Ryan Norton
Patron of SPCR
Posts: 169
Joined: Wed Sep 21, 2005 5:13 pm
Location: South FL

Post by Ryan Norton » Thu Feb 22, 2007 7:14 am

I'm very interested in the "vapor chamber" cooler on the retail, i.e. 9.5", new ATI card. It's nice to see some real technological progress in stock VGA card cooling, albeit only because the GPUs have gotten so insanely power-hungry...

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Fri Feb 23, 2007 6:23 pm

nutball wrote:
~El~Jefe~ wrote:r600 will shoot nvidia's cards in the nuts. the 8900 bump is a bump.
At the rate AMD are going getting the bloody thing out of the door, R600 will be competing with G90, not G80.

As for 8900, you know what, I think that NVIDIA probably have a better idea than you or I do what they need to bring to the table to deal with R600. So 8900 "just a bump" might say something about R600. There's something weird about R600, something doesn't feel right.
You'd think so, right? It has been a while with no benches or real stats on what the card will do. I am betting on the ATI card though, as the 7xxx series had terrible AA/AF abilities and any X1800 or X1900 card seemed to crush them. The 8800 is a great card of course, but nVidia never makes a real All-In-Wonder contender, so I am for ATI in this respect only...

I am not for anything over 150 watts of power. What the hell???!!! I too wonder about the R600. I think however it will be the new standard; apparently ATI doesn't care about showing it yet. I have a 620 W Corsair PSU waiting in its box for the new GPU. Sad times.


I haven't heard about the All-In-Wonder card yet. I know it's on its way. That will be a complicated card, with all that lame M$ DRM in Vista and Vista's terrible sound support; probably a nightmare card to make with any future-proof certainty. The 8900 is a bump, nothing particularly new about it, but obviously it will be faster. This will push the 320 MB 8800 GTS down into a near-normal price range, something that is needed.

spookmineer
Patron of SPCR
Posts: 749
Joined: Sat Nov 11, 2006 6:02 pm

Post by spookmineer » Fri Feb 23, 2007 7:33 pm

~El~Jefe~ wrote:
nutball wrote:
~El~Jefe~ wrote:r600 will shoot nvidia's cards in the nuts. the 8900 bump is a bump.
At the rate AMD are going getting the bloody thing out of the door, R600 will be competing with G90, not G80.

As for 8900, you know what, I think that NVIDIA probably have a better idea than you or I do what they need to bring to the table to deal with R600. So 8900 "just a bump" might say something about R600. There's something weird about R600, something doesn't feel right.
You'd think so, right? It has been a while with no benches or real stats on what the card will do. I am betting on the ATI card though, as the 7xxx series had terrible AA/AF abilities and any X1800 or X1900 card seemed to crush them. The 8800 is a great card of course, but nVidia never makes a real All-In-Wonder contender, so I am for ATI in this respect only...

I am not for anything over 150 watts of power. What the hell???!!! I too wonder about the R600. I think however it will be the new standard; apparently ATI doesn't care about showing it yet. I have a 620 W Corsair PSU waiting in its box for the new GPU. Sad times.


I haven't heard about the All-In-Wonder card yet. I know it's on its way. That will be a complicated card, with all that lame M$ DRM in Vista and Vista's terrible sound support; probably a nightmare card to make with any future-proof certainty. The 8900 is a bump, nothing particularly new about it, but obviously it will be faster. This will push the 320 MB 8800 GTS down into a near-normal price range, something that is needed.
We, here on the sidelines, can only guess at what marketing strategies come in handy. And we don't know what the R600 (from ATI's point of view) or the 8900 (from NVidia's point of view) is capable of.

Maybe the R600 is awesome (it will be), and ATI chooses to wait until NVidia releases the 8900 before they let the R600 go.
Maybe the R600 has problems (yields, benches, whatever) and they are trying to solve them in the little time left (improbable).
Maybe the R600 is awesome but they still have problems with production capacity, and they choose to wait until it's widely available.
And maybe there is yet another reason to hold back, from ATI's standpoint. In the meantime though, everyone wanting a DX10 card is going to buy NVidia; ATI knows this, and it would be foolish to hold back while people are buying new cards for their Vista.

I don't have a crystal ball, but these people have studied marketing and business practices and plan their strategy accordingly. I am not putting any money on either card, because I'd always lose; both manufacturers have more info than we do.
Also, results accomplished in the past are no guarantee for the future: if NVidia had worse results than ATI in the AA/AF department, that doesn't mean it's still the case.

Unless this new hardware has some benefits in the near future, I'm not willing to spend money on a power-hungry piece of hardware. But then again, I've always had NVidia cards... :?
620 W will be more than enough to power a system including the R600, though.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Mon Feb 26, 2007 9:52 am

I would assume it has to do with yields, not marketing. I could be wrong though. I know these two collude with one another on fixing prices, so really it's set up so both win and we lose our money.

$200 should buy an 8800 GTX, but it doesn't.

$125 should buy a 320 MB GTS, but it doesn't either. Yay.

qviri
Posts: 2465
Joined: Tue May 24, 2005 8:22 pm
Location: Berlin
Contact:

Post by qviri » Mon Feb 26, 2007 10:12 am

~El~Jefe~ wrote:240watts is the draw of these supposedly. SUPPOSEDLY.
hai guys, remember that 8800 GTX that was supposed to draw 300 watts?

klankymen
Patron of SPCR
Posts: 1069
Joined: Thu Aug 04, 2005 3:31 pm
Location: Munich, Bavaria, Europe

Post by klankymen » Mon Feb 26, 2007 10:44 am

Well, let's start off by considering the power connectors.

2x PCIe 6-pin + PCIe x16 slot = 225 W max
PCIe 6-pin + PCIe 8-pin + PCIe x16 slot = 250 W max

Anyway, the 240 W TDP figure was for the A15 revision.

Apparently, with the delay, AMD/ATI will wind up releasing the A18 revision, with a TDP of a mere 180 W.

And we all know how exaggerated TDPs can be (for reference, the 8800 GTX has a TDP of 165 W, IIRC). And 1.25 A more or less ain't gonna make or break a heatsink.
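For what it's worth, the connector arithmetic above can be tallied in a quick sketch. The 75 W slot and 75 W 6-pin figures are PCI Express spec values; the 8-pin value is set to 100 W here only so that it matches the 250 W total quoted above (the finalized PCIe 2.0 spec allows 150 W from an 8-pin connector):

```python
# Maximum board power = power through the x16 slot plus each auxiliary connector.
SLOT_X16 = 75    # W delivered through the PCIe x16 slot (spec value)
PCIE_6PIN = 75   # W per 6-pin auxiliary connector (spec value)
PCIE_8PIN = 100  # W, assumed here to match the 250 W total above;
                 # the finalized PCIe 2.0 spec allows 150 W

def board_budget(*aux):
    """Sum the slot allowance plus any auxiliary connectors, in watts."""
    return SLOT_X16 + sum(aux)

print(board_budget(PCIE_6PIN, PCIE_6PIN))  # 225 -> 2x 6-pin + slot
print(board_budget(PCIE_6PIN, PCIE_8PIN))  # 250 -> 6-pin + 8-pin + slot
```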

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Mon Feb 26, 2007 12:06 pm

The 8800 GTX has a max TDP of 177 W and measured about 145 W.
I would expect the R600, with a 240 W max TDP, to measure 180-200 W.
If the respun R600 will run at higher frequencies, I can't see the TDP being lowered to 180 W.
Where did the info on a 180 W TDP come from?
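As a rough sanity check on that 180-200 W guess: if the R600's measured draw sits at about the same fraction of its max TDP as the 8800 GTX's did (using the figures quoted above, which are forum numbers rather than official measurements), a simple ratio gives:

```python
# 8800 GTX: 177 W max TDP, ~145 W measured (figures quoted above).
gtx_tdp, gtx_measured = 177.0, 145.0
ratio = gtx_measured / gtx_tdp  # ~0.82 of TDP actually drawn

r600_tdp = 240.0
estimate = r600_tdp * ratio
print(round(estimate))  # 197 -> inside the 180-200 W range guessed above
```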

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Mon Feb 26, 2007 12:23 pm

Even if nVidia seems better on paper now, we have to wait and see real R600 test results. Without tests, everything is speculation at best... And let's remember how nVidia screwed up their Vista drivers; it's still causing them trouble.

Dell, for instance, is now selling their gaming rigs with XP instead of Vista and recommends Windows XP for gamers. Even the best hardware doesn't bring success if it's not compatible with the software you're running, starting with the OS.

oscar3d
Posts: 202
Joined: Sat Dec 10, 2005 11:35 am
Location: California

Post by oscar3d » Tue Feb 27, 2007 2:20 pm

I'm reading this:

http://www.theinquirer.net/default.aspx?article=37607

It seems the cards are going to be so big that they will be impossible to fit inside a case.

Now ATI/AMD is planning to create external versions of the R600. I guess it's a box with its own PSU + fans + card, connected to an external PCI-E extension bracket on the back.

I guess it's not too bad, but it makes me wonder; this looks like buying a console. It's the same thing: you plug it into your TV and voila! Now you buy a big-ass box and plug it into your PC.

Well, as long as the damn thing is silent. But still, this is madness.

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Tue Feb 27, 2007 3:37 pm

I was a bit worried, but only the OEM cards are that big, which means when they are placed in pre-built cases by the manufacturers there shouldn't be many issues. The retail version is shorter, although it can still cause problems lining up in a case if the HDD bays are in the way.

Of course this is rarely a problem with higher quality cases that are longer, or that have removable HDD trays so you can mount your HDDs in the 5.25" bays, but with many cheaper or shorter cases it will be some sort of issue.

The 8800-series cards are not tiny either. But the R600 series will get smaller versions released soon after the first cards are officially released; the "light" budget-weight R600s should not be that much bigger than current VGA cards.

But I will place my faith in ATI/AMD for now. I have no reason to do otherwise until we've seen the first test results of the R600 and all the cards they'll be releasing in the first 2 or 3 months after introducing it.

PopCorn
Posts: 346
Joined: Tue Dec 12, 2006 5:09 pm
Location: U.S.A. Massuchusetts...... *Folding For SPCR*
Contact:

Post by PopCorn » Tue Feb 27, 2007 4:11 pm

I don't know, but with that kind of power consumption and that OEM size, it feels to me like the whole thing was a bit rushed... I prefer AMD CPUs, but that's it.

jazkat
Posts: 1
Joined: Thu Mar 01, 2007 8:13 pm

Post by jazkat » Thu Mar 01, 2007 8:36 pm

Happy Hopping wrote:Since Nvidia is rel. 8900 and 8900GX2, this R600 is meaningless to them

Pffft, good one, brains. The R600 will trounce the 8900 series too; do your homework. The R600 will have 64 x 4 = 256 unified shaders, the 8900 only 128. I wouldn't be surprised if the R600 thrashes the 8950X2 either.
The R600 is 512-bit. You've got to remember AMD is in the game now, and the transition is part of why the R600 is late; the vapour chamber coolers aren't ready either, and they are trying to get the power usage down. Mmmm, I'm just thinking of that 179 GB/s memory bandwidth.
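As an aside, that 179 GB/s figure is arithmetically consistent with a 512-bit bus if you assume an effective memory data rate of about 2.8 GT/s GDDR4. The data rate here is my assumption, chosen to match the quoted bandwidth, not a confirmed spec:

```python
# Memory bandwidth = bus width in bytes x effective data rate.
bus_bits = 512
data_rate_mts = 2800  # MT/s, hypothetical GDDR4 effective rate (assumption)
bandwidth_gbs = (bus_bits // 8) * data_rate_mts / 1000  # GB/s
print(bandwidth_gbs)  # 179.2
```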

Also, on the Intel thing: people are rattling on about how crap AMD is now because of Intel's new chips, but that wasn't the case when AMD released 64-bit, was it? It looks like AMD is going to trounce the new Intel chips again, hahaha, with the 128-bit Barcelona using CSI.

When DAAMIT get the ball rolling, nVidia don't stand a chance, sorry. Plus, when I buy my R600 I know it will work properly with Vista, as DAAMIT have been working alongside Microsoft, so there will be no blue screen of death and pixel corruption like you get with the 8800.
You won't be seeing the big card at retail, so don't worry; I think those are for the Apple Mac Pro and the PC manufacturers.

When DAAMIT start selling their CPU + graphics card package, you'll have the ultimate gaming rig, simple as that!

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Thu Mar 01, 2007 11:45 pm

eh, well I guess.

Still, you can't berate the 8800 GTS 320 MB version for lacking.

It currently roasts every usable screen resolution. And it runs a lot cooler than a GTX or a hyper-clocked GTS.

I hate to say it, but if ATI's card actually hits 200 watts, I don't think it will be worth it this time around. The 8800 GTS 320 MB will be the mainstay for users for a solid year. Prices will quickly go below 300 dollars, and that will satisfy the vast majority of gamers who are also looking for a total system wattage of under roughly 300 watts at load. Anything above that is mad complicated to cool; 400+ watts would be nuts :(

According to theinquirer.net, the R600 is ATI-only, no AMD intervention at all. The next R700 will be a revamped version with AMD written all over it. Let's hope it gets some cool-and-quiet technology of some kind, or a better process with less need for such insane amperage draws!

Post Reply