What is the fastest onboard video, and how fast is it?
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
-
- Patron of SPCR
- Posts: 1069
- Joined: Thu Aug 04, 2005 3:31 pm
- Location: Munich, Bavaria, Europe
Hello,
Not much to explain here - I haven't looked at onboard video since I bought my motherboard. What's the fastest these days?
Is it the Intel X3500, or the GeForce 8200, or does AMD have something good going?
And how fast is it, in comparison to, say, the GeForce 8000 series? Is there anything onboard that can match an 8600GS?
Re: What is the fastest onboard video, and how fast is it?
klankymen wrote: Is there anything onboard that can match an 8600GS?
Highly doubt that.
AMD has some sort of onboard X1250 deal going on, dunno how fast that is or what chipsets it's in (690G?).
-
- Posts: 1406
- Joined: Tue Feb 13, 2007 7:28 pm
- Location: USA
Is this for an Intel or AMD CPU? Obviously, you're only going to be able to get the X3500 for Intel. You can barely get the AMD RS690 chipset for Intel, and I think the forthcoming RS780 will be completely exclusive to AMD CPUs.
Which is faster depends somewhat on application: 3D gaming or video decode? AMD 690 beats nVidia 7050 on many games but the reverse is true for video decode. Intel generally brings up the rear, but the X3500 seems at least in the same class as 690 and 7050. Of course, the nVidia 8200 and AMD RS780 are both almost out and will likely kick X3500 completely to the curb. The newest nVidia IGP will also support hybrid SLI/Crossfire, which gives a good upgrade path if your CPU/MB combo outlasts your current video needs. Something better from Intel will likely come down the pipe soon, but it seems like nVidia/AMD have the same lead on Intel in IGP that Intel has on nVidia in CPU.
Well, considering the only thing onboard video is good for is decoding...
Any add-on card will be faster in games. Even a slow 8400GS would be faster than the fastest onboard video as far as 3D framerate is concerned. You could maybe run 5-10 year old software with decent framerates on onboard video. Things like StarCraft would probably run smoothly.
Aris wrote: Well, considering the only thing onboard video is good for is decoding...
Any add-on card will be faster in games. Even a slow 8400GS would be faster than the fastest onboard video as far as 3D framerate is concerned. You could maybe run 5-10 year old software with decent framerates on onboard video. Things like StarCraft would probably run smoothly.
I'll disagree with that... I played all of Half-Life 2 with an E6600 processor and GMA950 graphics - frame rates were fine, as far as I was concerned. (I have a 3850 now, and playing Crysis, the "enjoyability" isn't THAT much better, to be honest!)
How old is HL2? (Wikipedia: just over 3 years)
-Dan
Dan wrote: I played all of Half-Life 2 with an E6600 processor and GMA950 graphics - frame rates were fine, as far as I was concerned.
HL2 is a "light" game compared to today's games, or even to Doom 3 and F.E.A.R. from the same period (2004), with results varying depending on screen resolution and detail level.
Try running any game from the "post-Doom 3 era" on an onboard chip and you will most likely fail to enjoy it - not to say, fail to pass the first level.
I failed to run Stalker - a year-old game - on my old 6800GS AGP on a P4.
Bottom line, as said in a recent SPCR article and elsewhere: onboard chips are not for gamers.
If you want to play, my advice is get an 8800GT or higher (prepare to spend $300 US or more), or get an Xbox 360.
-
- *Lifetime Patron*
- Posts: 2000
- Joined: Tue May 15, 2007 1:39 am
- Location: Finland
I stumbled upon a nice table for mobile graphics when I was considering the choice of graphics for my new laptop. You can find it here. If you click the link it'll be organised by 3DMark06 score.
It's not desktop, but I found it a good guideline, a pointer as to what integrated graphics can do. In my experience an add-on card will beat the snot out of integrated any day of the week. A good CPU and abundant RAM can compensate for shortcomings (hell, they DO compensate - the chip works the CPU, AFAIK), but it'll never be as good.
I'm not getting into a debate about what's heavy and what's light, but the Crysis experience seems to depend a lot on the CPU, leaving only the pretty details and combat FPS to the GPU. My desktop ran it on Low, but it was slowed down (not as much as Supreme Commander, another CPU/memory-intensive one). CoD4 on the laptop was not only ugly but had permanent-ish bullet time; the only thing that kept the FPS from cutting out was the Intel dual-core. I have the Mobility Radeon 2300 HD and a T5250 in my laptop with 2GB of 667 RAM.
Hello,
Thanks for the list, Das_Saunamies. I had seen that a while ago; it's pretty useful for placing the X3100, but unfortunately no news on the new GeForce 8200. (Actually, I just noticed it's not out yet.)
To clarify the original post: I am talking about gaming speed, and not any sort of video tasks.
And as for the comments in the vein of "anything worse than an 8800GTS is useless for gaming" - that's not the type of gaming I'm looking to do (no offence to those who like that type of gaming, it's just not something I want to blow money on). I don't feel my Counter-Strike e-penis is bigger when I get 3000 frames per second, and I don't care if I have to interpolate a smaller resolution or turn off details in order to play at a decent frame rate - I just want a decent frame rate once I turn down the details.
And as for the discussion of which games take more or fewer resources: this is of course a valid topic, as they do vary, but not exactly what I was looking for.
I play games for the fun, not for the graphics. I'd rather play Half-Life 2 on a laptop than some kind of Far Cry 2-graphic'ed game with crappy gameplay (not that FC2 would necessarily be so, but there are plenty of games that invest heavily in eye candy yet are just plain boring).
Half-Life 2 is actually a good benchmark; the last games I played were HL2 and Pro Evolution Soccer. Both ran totally fine on my GF6600GT, although I upgraded to an 8600GTS due to fitting problems in my new case, and because I got a 1920x1200 monitor. BTW, the card runs Crysis fine, for the record.
However, it does pump several dozen watts, and until nVidia gets their hybrid SLI thingy working (and especially working on XP), I was wondering if it would be reasonable to downgrade to onboard (since Penryn has got me thinking about upgrading, even though I swore to wait for Nehalem...).
EDITED: to more clearly specify processor.
I have a 690V motherboard, a Gigabyte GA-MA69VM-S2 to be exact, which includes an ATI Radeon Xpress 1200. It's paired with the cheapest dual-core I could find when I purchased it, an AMD Athlon 64 X2 3600+, and 2x 1GB of DDR2 667 RAM.
Yesterday I set myself up with a 10-day free trial account for World of Warcraft. Even though it's a "trial", this is the full game.
Part of me thought, "No way - Blizzard is going to tell me I need a 'real' video card to play this."
No.
Currently, if I play the game at 1280x1024 in 24-bit color, 1x sampling, with all settings turned down, the game averages 20 fps in Goldshire (an open area with 15 to 20 people in view, some of them dueling) to 60 fps in smaller, indoor areas.
Keeping the same 1280x1024, 24-bit, 1x sampling, but this time with absolutely every setting turned on and at its highest, the frame rate in Goldshire, running around, remains between 10 and 12 fps. The game is gorgeous at these settings, and the full-screen glow effect makes higher levels of anti-aliasing hard to justify. As soon as I go inside a building, it shoots up toward 40 fps.
I prefer playing the game at 1280x1024 because that's my Windows desktop resolution, and it makes alt-tabbing back and forth much snappier than if the resolution had to be refreshed.
Keep in mind WoW is not a first-person shooter, so one doesn't need ultra-low pings and ultra-high frames per second to play relatively well. Feel free to experiment with different settings to achieve your preferred balance between frame rate and video quality depending on your game.
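For a sense of what those fps figures mean in latency terms, a frame rate converts to a per-frame time budget of 1000/fps milliseconds. A quick sketch of the arithmetic (illustrative only; the fps values just echo the numbers in this post):

```python
# Convert a frame rate to the time each frame spends on screen.
# Illustrative arithmetic only, not from any game or driver API.
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 40, 20, 12, 10):
    print(f"{fps:>2} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

At 10 fps each frame sits on screen for a full 100 ms, which is why that rate feels choppy in a shooter yet stays tolerable in an MMO like WoW.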
ame wrote: I failed to run Stalker - a year old game - on my old 6800GS AGP on a P4.
m^2 wrote: Something's wrong with your setup. CPU?
With lowest details, Stalker works smoothly on an Intel 945G and a Pentium D @3GHz.
No problems with Q3 and UT2k4 either.
Lowest settings??? C'mon, do you recommend playing games at the lowest settings? Just buy a decent add-in card or don't play.
Onboard video is fine for watching movies, surfing, Photoshop and such; it does not have any place in games yet, and it won't for a long time ahead.
m^2 wrote: No problems with Q3 and UT 2k4 either
When it just came out, I played UT2k4 on a 5200FX with decent frame rates (30s). It don't mean sh**.
Today's games are demanding, and 10-15 fps is unacceptable even by 1994 standards, a.k.a. Doom (1).
m^2 wrote: Something's wrong with your setup. CPU?
With lowest details, Stalker works smoothly on Intel 945G and Pentium D @3GHz.
No problems with Q3 and UT 2k4 either
...yeah, that old setup: a P4 2.4 with 1GB, and the 6800GS with 128MB (and a noisy one, too)...
Redzo wrote: Lowest settings??? C'mon, do you recommend playing games at the lowest settings? Just buy a decent add-in card or don't play.
Yes, I do. For me, games are almost nothing about the look. Actually, I don't like the fact that they are pushing game graphics so quickly, because I prefer playing to waiting for levels to load. This is one of the reasons I don't like any 3D shooter newer than Q3 (though I've played only about 5 newer titles).
m^2 wrote: Yes, I do. For me, games are almost nothing about the look.
Yeah, I kinda agree with you there - gameplay is a much higher priority than graphics - there's a reason Monopoly is a classic.
ame wrote: Today's games are demanding, and 10-15 fps is unacceptable even by 1994 standards, a.k.a. Doom (1).
That may be true for first-person shooters, but playing World of Warcraft at as little as 10 frames per second absolutely does not take away from my enjoyment or my ability to play.
klankymen wrote: However, it does pump several dozen watts, and until nVidia gets their hybrid SLI thingy working (and especially working on XP), I was wondering if it would be reasonable to downgrade to onboard (since Penryn has got me thinking about upgrading, even though I swore to wait for Nehalem...).
IGPs really suck for 3D performance, but they can be somewhat playable in games at the lowest resolutions and settings. My 690G IGP will run the Crysis demo at the lowest settings, but it can barely manage 10fps. I haven't tried my MCP73 (GeForce 7100 IGP) Intel motherboard yet (other than noticing it refuses to run my E2140 any faster than 2.3GHz), but considering it only has single-channel DDR2, I don't expect it to perform better.
Current 3D performance has pretty much come down to shader power. At best, current IGPs have 4 shader pipelines. The 8600 = 32 pipes + a fast clock; the 2600 = 120 pipes + a slower clock. Both are decent low-end gaming cards.
For Crysis (and perhaps most future games) a $60 2600XT is your best buy - better power efficiency than an 8600GTS. Also, an overclocked 8600GT will match an 8600GTS in most newer games. The 8600 cards have a fast 16X FSAA mode for older games, while AA really kills performance on the low-end ATI cards. The HD 3650 looks to be a complete disappointment too - as in, no more efficient than a 2600XT at idle and generally slower.
Something old and cheap (well under $50) like a 12-pixel-pipeline GeForce 7600 or Radeon X800GTO will way outperform any current or future IGP. For HL2 and other Source-based games (TF2), either of those old cards is great.
I doubt there will ever be an IGP faster than a Radeon 9700 Pro (even if the RS780 has 40 pipelines or whatever.)
Edit: Okay, some early benchmarks show the RS780 to be faster than a Radeon 9700 Pro (3DMark05=2495).
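The pipes-versus-clock trade-off above can be made concrete with a first-order throughput product, units × clock. A minimal sketch, with these assumptions: the unit counts follow the post (IGP ≈ 4, 8600 = 32, 2600 = 120), the clock figures are my own round illustrative numbers (they are not in the thread), and real architectural differences between the shader designs are ignored:

```python
# First-order shader-throughput heuristic: units * clock.
# Clock speeds below are assumed round numbers for illustration only;
# the score is only meaningful as a ratio between cards, not in absolute terms.
def relative_throughput(shader_units: int, clock_mhz: float) -> float:
    """Crude throughput score: shader units times clock in MHz."""
    return shader_units * clock_mhz

cards = [
    ("typical IGP",   4,  500.0),   # assumed clock
    ("GeForce 8600", 32, 1450.0),   # assumed shader clock
    ("Radeon 2600", 120,  800.0),   # assumed core clock
]

baseline = relative_throughput(cards[0][1], cards[0][2])
for name, units, clock in cards:
    ratio = relative_throughput(units, clock) / baseline
    print(f"{name:>12}: {ratio:5.1f}x the IGP baseline")
```

Even this crude model shows both low-end cards sitting well over an order of magnitude above an IGP, which matches the "any add-on card beats integrated" sentiment earlier in the thread; in practice memory bandwidth matters too, which is one reason single-channel IGP boards fall further behind.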
-
- Posts: 26
- Joined: Mon Oct 02, 2006 12:37 am
LAThierry wrote: That may be true for first-person shooters, but playing World of Warcraft at as little as 10 frames per second absolutely does not take away from my enjoyment or my ability to play.
MMORPGs can actually be one of the most demanding applications for the video card. Keyword being "raids".