First "OFFICIAL" X1900 XT (not XTX) 512MB review
First "OFFICIAL" X1900 XT (not XTX) 512MB review
Here: http://www.hardwarezone.com/articles/vi ... =1808&pg=1
Can you say disappointing? Doesn't seem that much better than the X1800 XT 512MB.
Of course someone will always jump in and say "Where are the real benchmarks? We want HL2/BF2/CoD2, etc" / "They didn't turn on AA/AF high enough! That's where it really shines" ... OK, so what?
Last edited by rpsgc on Mon Jan 23, 2006 11:25 am, edited 1 time in total.
All modern high-end graphics cards aren't that impressive until you enable AA and AF and go to higher resolutions. You don't buy a fancy graphics card so you can play computer games at framerates above 100fps; you buy it to enjoy really nice graphics (at playable framerates). So IMHO, if they haven't tested the card with AA and AF, they haven't tested the card at all...
Re: First "OFFICIAL" X1900 XT (not XTX) 512MB review
OK - I should have read the article before reacting, but I was just a bit shocked at your slightly misleading post:
Apart from that, I'm disappointed that it didn't beat the 7800GTX512 in all disciplines; from a technical viewpoint, ATI's "dynamic" architecture and its large number of pixel shaders make it superior to NVIDIA's "brute-force" approach. Pity the performance just isn't that spectacular... rpsgc wrote: Of course someone will always jump in and say "They didn't turn on AA/AF! That's where it really shines" ... OK, so what?
Last edited by cAPSLOCK on Mon Jan 23, 2006 11:01 am, edited 1 time in total.
Re: First "OFFICIAL" X1900 XT (not XTX) 512MB review
-double post-
Re: First "OFFICIAL" X1900 XT (not XTX) 512MB review
Fixed now. cAPSLOCK wrote: OK - I should have read the article before reacting, but I was just a bit shocked at your slightly misleading post: rpsgc wrote: Of course someone will always jump in and say "They didn't turn on AA/AF high enough! That's where it really shines" ... OK, so what?
Note that results were obtained using an Athlon64 3500+ (bottleneck; considerable gains have been shown through use of a dual-core proc), 1GB DDR (bottleneck) and Cat. 5.13 drivers, which do not officially support the x1900 series.
I look forward to testing with a 2.6GHz DC, 2GB DDR and Cat. 6.2 drivers to see the potential of this card.
Now if they'd only replaced that awful cooler...
If it's a bottleneck for the X1900 then it's also a bottleneck for the other cards. warriorpoet wrote: Note that results were obtained using an Athlon64 3500+ (bottleneck; considerable gains have been shown through use of a dual-core proc), 1GB DDR (bottleneck)
Right... warriorpoet wrote: and Cat. 5.13 drivers which do not officially support the x1900 series.
Although the official ATI Catalyst 5.13 drivers have no support for the Radeon X1900 series, PowerColor basically shipped us a driver set that's based on the Catalyst 5.13, but with support for the newcomer via new ID tags and configuration switches.
How do you know the bottleneck has the same effect on all cards? rpsgc wrote: If it's a bottleneck for the X1900 then it's also a bottleneck for the other cards. warriorpoet wrote: Note that results were obtained using an Athlon64 3500+ (bottleneck; considerable gains have been shown through use of a dual-core proc), 1GB DDR (bottleneck)
Listen to warriorpoet, we want to see the limits of the graphics cards. A CPU that's a bottleneck will affect all the cards, but the question is how much. We have seen reviews with such bottlenecks, sometimes even with results showing the difference in FPS is next to nothing between the latest card and a two-year-old one. It's a bad thing, you want to avoid it; nobody wants those bottlenecks when trying to make a decent review.
Oh and about the drivers, you know that the first ones are not always the best. I would never make any conclusions about the card as long as they're using some old tweaked drivers. The X1900 is quite different from the X1800; the X1800 and the 7800 have been around for a while, so they have drivers that work better with them. Just wait a couple of weeks and you'll see.
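To put the bottleneck argument in concrete terms, here is a toy model (my own sketch, with invented illustrative numbers, not figures from the review): if the GPU can't get frames faster than the CPU can prepare them, FPS is roughly capped by the slower of the two, so a slow CPU hides the gap between a fast card and a faster one.

    # Toy model: FPS is limited by whichever of CPU or GPU takes longer per frame.
    # All timings below are invented for illustration only.

    def fps(cpu_ms_per_frame, gpu_ms_per_frame):
        """Approximate FPS when each frame waits on both CPU prep and GPU render."""
        return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

    cpu_slow, cpu_fast = 14.0, 7.0   # hypothetical ms of CPU work per frame
    gpu_old, gpu_new = 16.0, 10.0    # hypothetical ms per frame for two cards

    # With a slow CPU, the faster card barely shows its advantage:
    print(fps(cpu_slow, gpu_old), fps(cpu_slow, gpu_new))  # ~62.5 vs ~71.4 FPS
    # With a fast CPU, the same two cards separate clearly:
    print(fps(cpu_fast, gpu_old), fps(cpu_fast, gpu_new))  # ~62.5 vs 100.0 FPS

Raising resolution and enabling AA/AF pushes the GPU time per frame up, which is exactly why reviews need those settings to show the real difference between cards.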
http://www.xbitlabs.com/articles/cpu/di ... es2_8.html
Everything we said in our previous article called Contemporary CPUs and New Games: No Way to Delusions! was absolutely right. It is true: you don’t need a high-end processor for real gaming with realistic settings and high image quality. The gaming performance will still be limited by the graphics card. The recommended system requirements mentioned by all the game developers are absolutely correct. Do not be surprised that the game developers mention Pentium 4 3GHz+ and Athlon 64 2GHz+ processors as the minimum suitable CPUs for comfortable gameplay, even though today we can get 3.8GHz Intel CPUs and 2.8GHz AMD CPUs easily. It is true that faster CPUs than those mentioned in the minimum system requirements do not really stimulate any significant fps rate increase. So, the slower processor models from the Pentium 4 and Athlon 64 processor families can cope easily with the latest generation 3D shooters. So, if you already have one of those CPUs in your home system, then you shouldn’t worry about upgrading them for your gaming needs.
I think they know what they are talking about.
Last edited by rpsgc on Mon Jan 23, 2006 2:17 pm, edited 1 time in total.
Yup, just more of a bottleneck for a more capable card, including the GTX 512; I have no preference. rpsgc wrote: If it's a bottleneck for the X1900 then it's also a bottleneck for the other cards
Reference also the other part of that statement, the part about significant gains being shown with dual-core processors. Preliminary testing shows a difference of 730 3DMarks in '06, hardly a paltry sum, even for a synthetic.
source: here for '06
My point exactly. The Cat. 5.13 drivers were not originally designed for the x1900 series. I do not expect to see as significant a gain from the Cat 6.1/2 drivers as the '05 OpenGL fix, but I'm sure there will be minor gains. rpsgc wrote: Right... warriorpoet wrote: and Cat. 5.13 drivers which do not officially support the x1900 series.
Although the official ATI Catalyst 5.13 drivers have no support for the Radeon X1900 series, PowerColor basically shipped us a driver set that's based on the Catalyst 5.13, but with support for the newcomer via new ID tags and configuration switches.
That said, there is no better deal at this price point. Remember, the GTX 512 is (when available) $100-$200 more. The fact that the much cheaper and more available x1900 matches it in any category is remarkable.
Now, let's see what the boys in green have to say...
Last edited by warriorpoet on Mon Jan 23, 2006 4:54 pm, edited 1 time in total.
edit: post edited for snappishness, apologies to rpsgc
Also, a 30% FPS increase in any situation for a refresh product is beyond impressive. Unfortunately, so is 175 watts peak power draw...
Last edited by warriorpoet on Mon Jan 23, 2006 2:39 pm, edited 2 times in total.
I posted that to Mats; I didn't see your post then. warriorpoet wrote: Uh huh, that's exactly what I said above. rpsgc wrote: Driver improvements help, but they don't do magic.
Here's some research material for you: pay special attention to #1
Also, a 30% FPS increase in any situation for a refresh product is beyond impressive. Unfortunately, so is 175 watts peak power draw...
ATI did promise that a single X1900XT (or XTX, whatever) would be better than two 7800GTX 512 in SLI. Somehow I doubt that...
More reading material here.
My favorite part: a stock Opteron 170 (2.0GHz, 2x 1MB cache) equals the score of an FX-60 (2.6GHz, 2x 1MB cache)!
The biggest CPU-bound differences are observed with changes in chip architecture (i.e. # of cores, amt. of cache, Intel/AMD, etc.) due to driver efficiencies with different chips and newer cards. Although a few extra marks can be bought with higher CPU clocks, it's less efficient than upping the GPU/GDDR speeds. rpsgc wrote: AFAIK the CPU never played a major role in the result (3DMarks).
edit: also, 3DMark06 is far more CPU dependent than '05
Last edited by warriorpoet on Mon Jan 23, 2006 3:11 pm, edited 1 time in total.
Guys, I think I'm staying away from this battle. It's possible that I have a lot of smart links and comments to give you and to keep us all amazed with for a while....
...but when I realized that this is actually posted in the "Cool & Quiet VGA" forum, I felt like something was wrong here. I think you know what I mean. Please be kind to each other.
So, um, how do I keep my new card "cool and quiet"? Any ideas? Mats wrote: Guys, I think I'm staying away from this battle. It's possible that I have a lot of smart links and comments to give you and to keep us all amazed with for a while....
...but when I realized that this is actually posted in the "Cool & Quiet VGA" forum, I felt like something was wrong here. I think you know what I mean. Please be kind to each other.
I think the two items of greatest interest to the die-hard hardcore silencing fraternity are these:
1. the x1900xt (not xtx, which will probably be higher) consumes 175w under peak load
2. the stock cooler is LOUD, see x1800xt
To many, this immediately precludes a purchase; I, however, sense a challenge and the opportunity to review more VGA cooling equipment!
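For a rough sense of what 175 W means for case cooling, here is a back-of-envelope sketch of my own (standard air-heating arithmetic, not anything from the review): air carries away about 0.57 W per CFM per degree C of temperature rise, so you can estimate how warm the case air gets for a given airflow.

    # Back-of-envelope: temperature rise of case air absorbing a 175 W card.
    # Air at ~25 C: density ~1.2 kg/m^3, specific heat ~1005 J/(kg*K),
    # and 1 CFM = 0.000472 m^3/s, so ~0.57 W per CFM per degree C.

    def exhaust_delta_t(power_w, airflow_cfm):
        """Steady-state air temperature rise (C) for a heat load and airflow."""
        rho, cp, cfm_to_m3s = 1.2, 1005.0, 0.000472
        return power_w / (rho * cp * cfm_to_m3s * airflow_cfm)

    for cfm in (25, 50, 100):
        print(f"{cfm:>4} CFM -> air warms by ~{exhaust_delta_t(175, cfm):.1f} C")
    # 25 CFM (one slow, quiet fan) -> ~12 C; 50 CFM -> ~6 C; 100 CFM -> ~3 C

In other words, a single quiet low-rpm exhaust fan leaves the case air noticeably warmer, which makes the heatsink's job on the card itself that much harder.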
Anyone here use a Thermalright V-1 Ultra on an x1800 yet?
BTW, a truce has been declared privately to keep our "street cred" intact and minimize collateral.
Back on topic! HAHA! warriorpoet wrote: So, um, how do I keep my new card "cool and quiet"? Any ideas?
I think the two items of greatest interest to the die-hard hardcore silencing fraternity are these:
1. the x1900xt (not xtx, which will probably be higher) consumes 175w under peak load
2. the stock cooler is LOUD, see x1800xt
1. - You gotta be kidding? 175 W for the card alone? (Edit: Just saw it in the review now) I've seen around 200 W AC draw on 7800 systems, so I'm guessing it's for the whole system, but you can say that I'm living in the pink VIA C3 world or something if you like...
2. - What's new. It's (almost) always like this. I'm waiting for the VF900, looks good.
I would, but I don't think it'll fit: http://kotisivu.mtv3.fi/igor/kone_024.jpg warriorpoet wrote: Anyone here use a Thermalright V-1 Ultra on an x1800 yet?
Waiting for the VF900 (too)
I think I will wait for the "better" hardware sites (anandtech, xbit, etc...) to be released from their NDA, and pass judgement after I see their reviews of the x1900 XTX, which I'm sure will mop the floor with the 7800GTX512, which btw has as bad availability as ATI's Phantom Editions in the past. Availability is a bad term anyway, as there isn't any available.
If we are to believe ATI on this, x1900 cards should be available from day 1 of the release, like the 7800GTX and GT were.
Here you have links to 7 reviews, including HotHardware's own in the text above.
Nobody is interested in non-AA/AF figures nowadays...
My criteria for "good" reviews:
- at least X1900 XTX, XT vs 7800 GTX / GTX 512
- 1280x1024 and 1600x1200
- 4x AA and 16x AF
- Transparency Antialiasing enabled
- nvidia: high-quality in driver panel enabled, all optimizations disabled
- ATI: A.I. low, HQ-AF disabled
- power consumption
- noise
- overclockability
- same driver for every card
As a graphics card the x1900 looks very good. Too bad it consumes more power than anything seen before, and has a cooler that at its slowest is noisier than the GTX 512 at its highest. Cheaper than the GTX 512, though.
With a good cooler this would be a monster card. But I suspect it will be very hard to cool this card silently.
I was expecting about 125W, considering that the clocks didn't rise, just the die area. Maybe ATI had to raise the core voltage?
Even at 125W it would be difficult to cool quietly with existing aftermarket coolers. I think the best solution would be an 'expanded' AC Silencer, that uses one more PCI slot. With 50% more heatsink area and 50% more pusher fan area it could run at 1,500 rpm (2D) - 3,000 rpm (3D) and cool a 175W beast quietly (in 2D at least).
I do expect x1900xt performance to increase in OpenGL once Ati improves the drivers.
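To put a number on how hard this is, here's a rough target calculation of my own (the temperature limits are assumptions for illustration, not ATI specs): the cooler's thermal resistance has to be at most (max core temp - in-case air temp) / power.

    # Rough thermal-resistance target for a quiet X1900 cooler.
    # Temperature limits below are assumptions, not ATI specifications.

    def required_resistance(t_core_max_c, t_air_in_c, power_w):
        """Max heatsink thermal resistance (C/W) that still holds the core temp."""
        return (t_core_max_c - t_air_in_c) / power_w

    # Say the core must stay under ~90 C with ~45 C air inside a quiet case:
    print(required_resistance(90, 45, 175))  # ~0.26 C/W if the sink sees all 175 W
    print(required_resistance(90, 45, 100))  # ~0.45 C/W if the core is ~100 W of it

Hitting ~0.26 C/W at low fan speeds is tougher than what typical quiet CPU heatsinks manage, which is why an oversized two-slot design like the 'expanded' Silencer above seems like the right direction.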
Uhm... I guess someone should create a new topic to discuss the new "proper" reviews, eh?
And the 7900 is supposed to consume less and run cooler than the 7800 while offering much more performance (maybe more than the X1900).
EDIT: OUCH! and DOUBLE OUCH!
But then again, it simply depends on your point of view, don't you agree?
Anyways, buying a card because of the specific game you play is dumb
Oh and bye bye X1800...
Yep, this is exactly what I meant by waiting for the "better" hardware sites before passing judgement. rpsgc wrote: Uhm... I guess someone should create a new topic to discuss the new "proper" reviews, eh?
And the 7900 is supposed to consume less and run cooler than the 7800 while offering much more performance (maybe more than the X1900).
EDIT: OUCH! and DOUBLE OUCH!
But then again, it simply depends on your point of view, don't you agree?
Anyways, buying a card because of the specific game you play is dumb
Oh and bye bye X1800...
Not that I will ever buy a gfx card at these prices. I am even satisfied with the IGP I am currently using. Maybe, if Vista needs it to run more smoothly...