Nvidia cards more prone to frame tearing than ATI?
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
I never really noticed it before when I had my ATI card (X1300 Pro) in Windows (WinXP SP2), but since I switched brands to an Nvidia-chipped video card, I think Nvidia cards are more prone to screen tearing than ATI cards.
I've heard a lot about ATI being better for 2D than Nvidia -- is this what people were talking about? Or is the thing I'm seeing due to the drivers, and will it be fixed in the future? Should I stop being ultra-anxious about it or something?
-
- *Lifetime Patron*
- Posts: 2000
- Joined: Tue May 15, 2007 1:39 am
- Location: Finland
Enabling VSync limits your frame rate to match your refresh rate, and the better image quality will certainly make up for any loss in FPS. So do it, if you are bothered by tearing.
And the long-touted claim by ATI fanboys has been the better image quality. There was a minor yet noticeable difference between an early Radeon and an early GeForce, but I fail to see any today.
And what you have sounds like a fresh buyer's anxiety rather than a serious problem. Tearing on desktop? Come ooon.
Tearing on the desktop: when my wallpaper is a bit dark and I close a full-screen browser with light colors, sometimes I can see "tearing lines" in the fraction of a second it takes to close. Maybe that tearing occurs because my monitor is 8ms.
I heard there's some kind of hidden VSync option in Windows XP itself.
I also heard that triple-buffering helps when you enable VSync in a game.
But I've heard that tearing can also occur on CRT monitors -- is that true?
Yeah, it is. I used to get it on my CRTs. All it really is is your graphics card sending frames to your monitor at a rate lower or higher than the refresh rate it runs at. E.g. my TFT runs at 60Hz: send it more than 60 frames per second and tearing occurs, because the frames aren't synced with the monitor's refresh.
Vsync sends the frames to the monitor at the start of its refresh cycle so that you see no visible tearing.
On older graphics cards it caused a noticeable drop in performance, but modern ones seem happy enough.
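To make that concrete, here's a toy sketch of scanout in Python (illustrative numbers only, nothing hardware-specific): the monitor draws rows top to bottom over one refresh interval, and if the card swaps buffers mid-scan, the rows above the swap point come from the old frame and the rows below from the new one -- that boundary is the tear line.

```python
REFRESH_HZ = 60
ROWS = 1024                        # visible scanlines per refresh
FRAME_TIME = 1 / REFRESH_HZ        # one refresh interval (~16.7 ms)

def scanout(swap_time):
    """Which frame (0 = old, 1 = new) each row shows during one refresh,
    if the card swaps buffers swap_time seconds into the scan."""
    shown = []
    for row in range(ROWS):
        t = (row / ROWS) * FRAME_TIME   # moment this row is scanned out
        shown.append(1 if t >= swap_time else 0)
    return shown

# Swap mid-scan: top half shows the old frame, bottom half the new one.
torn = scanout(swap_time=FRAME_TIME / 2)
print(torn.index(1))        # first row of the new frame: the tear line

# VSync delays the swap to the start of the next refresh (swap_time = 0
# for that refresh), so every row comes from one frame and nothing tears.
print(len(set(scanout(swap_time=0.0))))   # 1
```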
ayjay put it just right. The display can only show so many frames being drawn, and if the draws happen at a rate not in sync, you get shown images that are not in sync.
And Windows XP defaults to 60Hz as a refresh rate, and it was persistent in maintaining it. Don't know if an update fixed the "lock" on the refresh rate, but there's certainly no forced VSync methinks.
Triple-buffering just adds headroom for the drawing process, so it may or may not make up for slower draws. I've found it useful, as it often provides more fluid graphics even at a lower pace, but it's not completely necessary, especially today. I leave it on whenever I can.
I've had tearing mostly in FPS games, but VSync makes it all go away.
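The headroom point can be sketched with a simplified model (made-up numbers, and real drivers are messier than this): with VSync and plain double buffering, a frame that takes even slightly longer than one refresh forces the card to wait for the next vblank, so the displayed rate drops to half; a third buffer lets rendering continue in the meantime.

```python
import math

REFRESH = 1 / 60   # one refresh interval at 60Hz (~16.7 ms)

def effective_interval(render_time, buffers):
    """Average time between displayed frames under vsync (toy model)."""
    if buffers == 2:
        # Double buffering: after finishing a frame the GPU must wait for
        # the next vblank before it can start the next one, so an 18 ms
        # frame ends up on a 2-refresh (30 fps) cadence.
        return math.ceil(render_time / REFRESH) * REFRESH
    # Triple buffering: rendering continues into the spare buffer, so the
    # displayed rate tracks the render time (capped at the refresh rate).
    return max(render_time, REFRESH)

slow_frame = 0.018   # 18 ms per frame (~55 fps if unsynced)
print(1 / effective_interval(slow_frame, buffers=2))   # ~30 fps
print(1 / effective_interval(slow_frame, buffers=3))   # ~55 fps
```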
If you are into gaming, consider this:
http://en.wikipedia.org/wiki/Input_lag
With an LCD, there is a delay between the information about a pixel being sent to the LCD from the computer and that pixel actually being drawn.
That's because the LCD does buffering and preprocessing and whatnot to the image.
This delay can be as high as 3 frames on some 60Hz LCDs.
Now, triple buffering in the graphics card will also add one frame of delay compared to double buffering.
All this adds up to you seeing an image much later than you could.
You might get good and constant fps, but the image you see is a bit behind all the time.
This is especially bad in apps that need precise timing of user input.
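Summing the delays in that post (using its own example figures, not measurements of any particular monitor):

```python
REFRESH = 1 / 60                  # 60Hz panel: one frame is ~16.7 ms

display_lag = 3 * REFRESH         # an LCD that buffers 3 frames internally
triple_buffer_lag = 1 * REFRESH   # one extra frame vs double buffering

total_lag = display_lag + triple_buffer_lag
print(round(total_lag * 1000, 1))   # 66.7 -> roughly 67 ms behind
```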
-
- Posts: 524
- Joined: Sun Oct 22, 2006 7:39 pm
- Location: Denver, Colorado USA
- Contact:
"Now, triple buffering in the graphics card will also add one frame of delay compared to double buffering."
This is the only part I'm pretty sure I disagree with. The buffering happens in GPU memory; while it might technically take more time, that time is often much less than 1/60th of a second. It doesn't put the displayed image a full frame behind.
When v-sync is enabled, the only difference is how the card handles the information being sent to the display, and it only sends pixels, lines, and frames that are contiguous in the frame buffer (GPU memory). To make sure that there is a whole, complete frame, the GPU tries to generate three frames ahead of time (instead of two).
This actually improves the performance of most games when v-sync is enabled by, in a way, rendering an additional frame ahead. Technically, the frame is delayed, but it's usually only behind by 1/180-1/120th of a second, an amount of time not really perceivable by a person.
This "old" third frame, as well as the "old" second frame, are discarded if for the next frame cycle, the card has the next "on time" frame ready.
I tried FEAR with and without VSync.
with: lower fps (30-40fps?), but a lot less tearing
without: frequent tearing, but a higher, constant framerate of around 50-60fps, I would say
running FEAR at 1024x768 at maximum details (without FSAA, though, since running the game at 1024x768 does a kind of "faux anti-aliasing")
OK, the thing is... it seems like I notice the tearing more than before. Is that because I'm getting used to LCD monitors? I even see some tearing in Windows when I go from a white-colored webpage to a black-colored one. Does that mean I have very good eyes?
One of the things included in the CoolBits registry tweak was the ability to choose how many frames got pre-rendered. I think I set that to 0 and got an improvement in most games, but later had to raise it to 1 for UT and other FPS games.
And now that I've actually had time to think about it, all that latency jazz wasn't worth freaking out about. It's nice to know, but doesn't seem like it'll affect me on the ground level.
--
And to get back on topic, I'd say you're just sensitive to it, either now or forever, or there's something wrong with your hardware. VSync on, if it bothers you, and the rest of the settings will need to follow suit.
No idea about your display's internal reporting mechanics. If it's just reporting what it can do, fine, but if it thinks it's receiving 75Hz while in fact it's not, sounds like a culprit to me. Mine reports 59.9Hz for vertical frequency and 63kHz(:shock:) as the horizontal frequency. That's what I assume the H. and V. mean anyway.
At this point I think you have a "false" 75Hz display; they use some software to create the extra FPS from a lesser actual refresh rate. I only vaguely remember this deal from an article I read a while back on the mechanics of LCDs and how manufacturers claim things that are unrealistic (2ms G-t-G, for instance, at one point was not technically possible, yet some models magically had that response time), and how this related to so-called 120Hz LCDs.
I'll try to dig it up. In the meantime, invite a friend over with a display; see if it continues to happen.
OK, I unchecked the "hide all unsupported refresh rates" option in Windows and switched to 75Hz, but later changed back to 60Hz (Second Life doesn't like full-screen at a 75Hz refresh rate in 1280x1024, strangely...)
right now it's at 64.0kHz, 60Hz PP
when I was playing FEAR at 1024x768, it was at 48.2kHz, 60Hz NN.
and yes, I still get tearing unless I activate VSync in the game, and the performance difference is noticeable.
I've done some not-so-casually competitive gaming. LCDs are the display of choice on account of the better contrast and brightness. I can't stress enough that it's not physically possible to see much higher than 30Hz, although your brain can combine images at a higher frequency into composite images of a lower frequency, which is why a full 60FPS looks smoother than 30FPS; if games had any realistic motion blur whatsoever, the human eye would not be able to discern the difference between 30FPS and a higher frequency. If you're a genetically fresh specimen, pumped full of Red Bull, then that little extra rate you get from a CRT might just be worth it in an intense FPS. For those living in the casual gaming world, doubtfully so.
Given the right circumstances, a particularly trained eye can see the light reflecting off a bullet in flight, because of the way your eye interprets motion blur.
When you really get down to it, video games have, in a way, been designed from the ground up to make their relatively few artifacts noticeable.
If you can't simulate everything, do try and divert attention to the things you can. Or that's how I see it. Also, games focus on their given theme or objective, so the artifacts are usually case-relevant and for that reason alone get some sort of attention boost, both from makers and players.
I think a case in point would be the reviews of the Max Payne engine when it first came out. Great awe was had at how MaxFX could create individual scorch marks in spent shell casings, but when you played the game, the shells were pretty much the only interactive background.
But to be more precise, games are object(ive) driven, so it's no wonder those objects get so much attention, even to the degree of isolation (from their surroundings).
And on the "natural frequency" and motion blur, I agree. Films have been shot at 24 FPS or so for all eternity, and they seem realistic to their audiences from the start, meaning that the natural rate of observation can't be far off that mark. But most graphics-intensive games do seem sluggish or jerky at below 40. Why is that?
I find this to be because of the other aspects involved: the controls stop responding well, the game client isn't as fast to display objects or predict events as the higher performance ones and the whole experience suffers. While the smoothness may be the brain's composite of scattered images, the entire experience is a more complex composite of all these little factors. Abundant data allows for natural redundancy, which translates to a smoother experience. If I can do 30 things in a second, and I miss one, I notice it. If I can do 60 and miss one again, it's only half as big a deal.
That's my theory anyway. I don't know how motion blurring, or realistic visuals with sharper and blurred areas in the field of view, would change our experience of frame rates, but I do know that both add to a game's immersion, drawing attention away from the technical details and from the fact that you're in a simulation and not a real environment. They at least make FPS gaming a more thought-out affair -- à la Call of Cthulhu, FEAR, Battlefield 2142 -- instead of an adrenaline-fueled instagib zoomfest.
Sorry for the extra-long rant, I rewrote it twice, but you just have to say it all to explain your viewpoints.
I tried Counter-Strike today (the normal one, not the Source one) and didn't notice any tearing.
Maybe frequent framerate drops can cause frame tearing?
I noticed tearing in Colin McRae: DiRT and in Second Life.
Second Life has framerate drops of varying severity depending on what's on-screen, and framerate drops in Colin McRae: DiRT are frequent (even reviews like GameSpot's say this, and I'm pretty sure they use top-notch computers to review their games).
update: I've been checking the kHz reading and the letters beside it; here are the results (with the 1280x1024 60Hz setting):
1- 63.8 kHz PP
2- 64.0 kHz PN
3- 64.0 kHz PP
what do "PP" and "PN" mean?
Is it normal to sometimes see "segments" of a window closing in Windows when you have an 8ms LCD monitor? For example, when I click the X button, sometimes I can see 2 or 3 "segments" of that window in the tiny fraction of a second while it closes (maybe it's normal since it's an LCD, which is normally slower than a CRT).
I'm using Windows XP SP2 with the latest patches, by the way, plus the latest Nvidia drivers (May 31, 2007) and the only official driver for my Samsung SyncMaster 730b (from somewhere in 2005, I believe).
I also get some tearing in videos sometimes, mostly in fast action.
So does that mean the tearing is partly due to fast action and black-white color changes?
Does that mean I've grown too demanding for my "slow" 8ms LCD and should switch to a faster one? (yeah right, like I have the cash to cough up for a new monitor right now) I really like the fact that LCDs have more brightness and are easier on the eyes. I think gamers tend to be too demanding about monitor speed.
And tell me... is there a hidden option in Windows XP to enable VSync or something while you're in Windows?
-
- Friend of SPCR
- Posts: 2887
- Joined: Mon Feb 28, 2005 4:21 pm
- Location: New York City zzzz
- Contact:
ick.
I dunno. I mean, my VSync is at 85Hz because I'm using a pro 22-inch CRT.
I keep wanting to get an LCD, but the cost of a 24-inch along with the timing and refresh issues makes me wonder. However, it would use a LOT less wattage and emit less radiation.
I'm a fan of low levels of radiation, though. I think, however, that this form is not a good one.