Nvidia cards more prone to frame tearing than ATI?

Got a shopping cart of parts that you want opinions on? Get advice from members on your planned or existing system (or upgrade).

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Nvidia cards more prone to frame tearing than ATI?

Post by RaptorZX3 » Tue Jul 03, 2007 2:58 am

i never really noticed it before with my ATI card (X1300Pro) in Windows (WinXP SP2), but since i switched brands to an Nvidia-chipped video card, i think Nvidia cards are more prone to screen tearing than ATI cards.

i've heard a lot about ATI being better for 2D than Nvidia, is this what people were talking about? or maybe what i'm seeing is a driver issue that will be fixed in the future? should i stop being ultra-anxious about this or something? :shock:

Redzo
Posts: 464
Joined: Thu Jan 26, 2006 1:51 am
Location: Sweden, Stockholm

Post by Redzo » Tue Jul 03, 2007 3:17 am

Activate VSync

ayjay
Posts: 48
Joined: Mon Apr 23, 2007 11:01 am

Post by ayjay » Tue Jul 03, 2007 3:20 am

Yup that's pretty much it ;) Also you'll notice tearing more on a tft, so if you've switched screens recently that doesn't help. Not sure if it actually happens more, but I certainly notice it more myself.

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Tue Jul 03, 2007 3:24 am

The better image quality you get with VSync enabled will certainly make up for any loss in FPS -- it caps your framerate to match your refresh rate. So turn it on if tearing bothers you.

And better image quality has been the long-touted claim of ATI fanboys. There was a minor yet noticeable difference between an early Radeon and an early GeForce, but I fail to see any today.

And what you have sounds like a fresh buyer's anxiety rather than a serious problem. Tearing on desktop? Come ooon. :lol:

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Tue Jul 03, 2007 3:37 am

tearing on the desktop: when my wallpaper is a bit dark and i close a full-screen browser with light colors, sometimes in that fraction of a second while it closes i can see "tearing lines". Must be tearing occurring because my monitor is 8ms.

i heard there's some kind of hidden Vsync option in Windows XP itself.

i also heard that triple-buffering helps when you enable Vsync in a game.

but i heard that tearing can also occur on CRT monitors, is that true?

ayjay
Posts: 48
Joined: Mon Apr 23, 2007 11:01 am

Post by ayjay » Tue Jul 03, 2007 3:45 am

yeah it is. I used to get it on my CRTs. All it really is, is your graphics card sending more or fewer frames to your monitor than the refresh rate it runs at. eg my tft runs at 60Hz... send more than 60 frames per second and tearing occurs, as the frames aren't synced with the monitor's refresh.
Vsync sends each frame to the monitor at the start of its refresh cycle, so you see no visible tearing.
On older graphics cards it caused a noticeable drop in performance, but modern ones seem happy enough.
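
To put that timing mismatch into concrete terms, here's a rough sketch in Python (the 90fps/60Hz numbers are made up purely for illustration, not measurements). Any buffer swap that lands mid-scanout leaves the top of the screen showing one frame and the bottom showing another -- a tear line:

```python
from fractions import Fraction  # exact arithmetic, avoids float rounding

REFRESH_HZ = 60   # monitor redraws 60 times per second
GPU_FPS = 90      # card finishes frames faster than that

refresh_period = Fraction(1, REFRESH_HZ)
frame_period = Fraction(1, GPU_FPS)

# Count buffer flips over one second that land mid-scanout
# (i.e. not exactly on a refresh boundary) -> each is a visible tear.
tears = sum(
    1 for i in range(GPU_FPS)
    if (i * frame_period) % refresh_period != 0
)
print(f"{tears} of {GPU_FPS} buffer flips landed mid-refresh")
```

With these toy numbers, two out of every three flips land mid-scanout; VSync would instead hold each flip until the next refresh boundary.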

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Tue Jul 03, 2007 3:55 am

ayjay put it just right. The display can only show so many frames, and if the card draws them at a rate that's out of sync with the refresh, you get shown images that are out of sync.

And Windows XP defaults to 60Hz as the refresh rate, and it's persistent about maintaining it. Don't know if an update fixed the "lock" on the refresh rate, but there's certainly no forced VSync, methinks.

Triple-buffering just adds headroom for the drawing process, so it may or may not make up for slower draws. I've found it useful, as it often provides more fluid graphics even at a lower pace, but it's not strictly necessary, especially today. I leave it on whenever I can.

I've had tearing mostly in FPS games, but VSync makes it all go away.

lm
Friend of SPCR
Posts: 1251
Joined: Wed Dec 17, 2003 6:14 am
Location: Finland

Post by lm » Tue Jul 03, 2007 5:27 am

If you are into gaming, consider this:

http://en.wikipedia.org/wiki/Input_lag

With an LCD, there is a delay between the information about a pixel being sent to the LCD from the computer, and that pixel actually being drawn.

It's because the LCD does buffering and preprocessing and whatnot to the image.

This delay can be as high as 3 frames on some 60Hz LCDs.

Now, triple buffering in the graphics card will also add one frame of delay compared to double buffering.

All this adds up to you seeing an image much later than you could.

You might get good and constant fps, but the image you see is a bit behind all the time.

This is especially bad in apps that need precise timing of user input.
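
Putting rough numbers on that (the 3-frame display lag is the worst case mentioned above, not a measurement):

```python
# Back-of-the-envelope input lag at 60Hz, using illustrative figures.
REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ      # ~16.7 ms per refresh

display_lag_ms = 3 * frame_ms     # internal LCD buffering, worst case
triple_buffer_ms = 1 * frame_ms   # one extra buffered frame vs double

total_ms = display_lag_ms + triple_buffer_ms
print(f"worst case: {total_ms:.1f} ms behind the game state")
```

Four frames at 60Hz is roughly 67 ms -- small on paper, but enough to matter in games where input timing is tight.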

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Tue Jul 03, 2007 5:47 am

Holy hell, that's news to me! :shock:

Thanks for bringing that up. If all I can get is 60 FPS, I'd rather get the FULL amount and on time, thank you. Now that I think about it, no wonder it's fluid if it's generated a few hundred cycles behind in time. :D

rei
Posts: 967
Joined: Wed Dec 08, 2004 11:36 am

Post by rei » Tue Jul 03, 2007 12:11 pm

ATI drivers also lack aspect-ratio preservation when stretching a non-native image on LCDs through the DVI connector. This, frankly, sucks.

Max Slowik
Posts: 524
Joined: Sun Oct 22, 2006 7:39 pm
Location: Denver, Colorado USA
Contact:

Post by Max Slowik » Tue Jul 03, 2007 12:34 pm

lm wrote:Now, triple buffering in the graphics card will also add one frame of delay compared to double buffering.
This is the only part I'm pretty sure I disagree with. The buffering happens in GPU memory; while it might technically take more time, that time is often much less than 1/60th of a second. It doesn't put the displayed image one frame behind.

When v-sync is enabled, the only difference is how the card handles the information being sent to the display, and it only sends pixels, lines, and frames that are contiguous in the frame buffer (GPU memory). To make sure that there is a whole, complete frame, the GPU tries to generate three frames ahead of time (instead of two).

This actually improves the performance of most games when v-sync is enabled by, in a way, rendering an additional frame ahead. Technically, the frame is delayed, but it's usually only behind by 1/180-1/120th of a second, an amount of time not really perceivable by a person.

Both the "old" third frame and the "old" second frame are discarded if, come the next frame cycle, the card has an "on time" frame ready.
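
To put rough numbers on the headroom a spare buffer gives, here's a toy sketch (the 20 ms render time is made up; real driver scheduling is far more involved):

```python
import math

# Toy comparison of double vs triple buffering under VSync.
# Assumes a 60Hz display and a GPU needing 20 ms per frame --
# slightly more than one ~16.7 ms refresh interval.
REFRESH_HZ = 60
refresh_ms = 1000 / REFRESH_HZ
render_ms = 20.0

# Double buffering + VSync: after finishing a frame the card must wait
# for the next refresh boundary, so each frame occupies a whole number
# of refresh slots -- here 2, halving the framerate.
slots_per_frame = math.ceil(render_ms / refresh_ms)
double_fps = REFRESH_HZ / slots_per_frame

# Triple buffering: the spare back buffer lets the card start the next
# frame immediately, so throughput is limited only by render time.
triple_fps = 1000 / render_ms

print(f"double: {double_fps:.0f} fps, triple: {triple_fps:.0f} fps")
```

So a card that just misses one refresh drops from 60 to 30 fps with plain double buffering, while triple buffering lets it keep most of its throughput.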

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Tue Jul 03, 2007 1:44 pm

i tried Fear with and without Vsync.

with: lower fps (30-40fps?), but a lot less tearing

without: frequent tearing, but higher, constant framerate around 50-60fps i would say

i'm running Fear at 1024x768 with maximum details (without FSAA though, since running the game at 1024x768 already produces a kind of "faux anti-aliasing")


ok, the thing is... it seems like i notice the tearing more than before. Is that because i'm getting used to LCD monitors? i even see some tearing in Windows when i go from a white-colored webpage to a black-colored webpage. Does that mean i have very good eyes?

Max Slowik
Posts: 524
Joined: Sun Oct 22, 2006 7:39 pm
Location: Denver, Colorado USA
Contact:

Post by Max Slowik » Tue Jul 03, 2007 5:02 pm

What kind of monitor, or did I miss that? It sounds like you're noticing display artifacts over video card ones. Try running it at 75Hz, as eyes operate around 30Hz, and multiples of 30Hz are associated with optical illusions.

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Tue Jul 03, 2007 6:08 pm

i can't set it at 75Hz, the only one i can choose is 60Hz

it's an 8ms monitor, Samsung Syncmaster 730b

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Tue Jul 03, 2007 8:38 pm

ok that's strange....

in my Nvidia control panel and in my Windows control panel it says it's set at 60Hz, but when i check my monitor's OSD, it says 75Hz

any idea what could cause this?

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Tue Jul 03, 2007 8:41 pm

One of the things included in the CoolBits registry tweak was the ability to choose how many frames got pre-rendered. I think I set that to 0 and got an improvement in most games, but later had to raise it to 1 for UT and other FPS games.

And now that I've actually had time to think about it, all that latency jazz wasn't worth freaking out about. It's nice to know, but doesn't seem like it'll affect me on the ground level.
--
And to get back on topic, I'd say you're just sensitive to it, either now or forever, or there's something wrong with your hardware. VSync on, if it bothers you, and the rest of the settings will need to follow suit.

No idea about your display's internal reporting mechanics. If it's just reporting what it can do, fine, but if it thinks it's receiving 75Hz while in fact it's not, sounds like a culprit to me. Mine reports 59.9Hz for vertical frequency and 63kHz(:shock:) as the horizontal frequency. That's what I assume the H. and V. mean anyway.

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Tue Jul 03, 2007 9:09 pm

it says:
Digital
80.0kHz 75Hz PP
1280x1024

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Tue Jul 03, 2007 9:22 pm

Try and turn that down, since the kHz are higher as well. Consult your manual to find out why there's a difference, and if there should be.

I assume the PP stands for post-processing. Maybe that has something to do with it.

Max Slowik
Posts: 524
Joined: Sun Oct 22, 2006 7:39 pm
Location: Denver, Colorado USA
Contact:

Post by Max Slowik » Tue Jul 03, 2007 9:26 pm

At this point I think you have a "false" 75Hz display; they use some software to create the extra FPS from a lesser actual refresh rate. I only vaguely remember this deal from an article I read a while back on the mechanics of LCDs and how manufacturers claim things that are unrealistic (2ms G-t-G, for instance, at one point was not technically possible, yet some models magically had that response time), and how this related to so-called 120Hz LCDs.

I'll try to dig it up. In the meantime, invite a friend over with a display; see if it continues to happen.

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Tue Jul 03, 2007 10:10 pm

i don't know anyone else who could bring his LCD monitor here

all i have here is 2 crappy CRTs and my LCD monitor.

i contacted Samsung Canada, asking them if it's normal or if it's a defect or something.

cos i don't think it's normal to see 75Hz when it's set at 60Hz

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Wed Jul 04, 2007 7:28 pm

ok i unchecked the option "hide all unsupported refresh rates" in Windows and switched to 75Hz, but later changed back to 60Hz (Second Life doesn't like full-screen at a 75Hz refresh rate in 1280x1024, strange...)

right now it's at 64.0kHz, 60Hz PP

when i was playing Fear, it was at 1024x768, 48.2kHz 60Hz NN.

and yes i still get tearing unless i activate Vsync in the game, and the performance difference is noticeable.

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Thu Jul 05, 2007 2:02 am

Guess you'll have to get used to either one, unless you plan on going back to CRTs.

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Thu Jul 05, 2007 2:28 am

brand-new, good quality CRTs are really hard to find now.

2 big stores around are Best Buy and Future Shop, and they don't carry CRTs anymore.

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Thu Jul 05, 2007 2:32 am

I honestly think a CRT is the best choice for gaming. Call me nuts.

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Thu Jul 05, 2007 2:46 am

If you're a genetically fresh specimen, pumped full of Red Bull, then that little extra rate you get from a CRT might just be worth it in an intense FPS. For those living in the casual gaming world, doubtfully so. :D

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Thu Jul 05, 2007 2:47 am

Das_Saunamies wrote:If you're a genetically fresh specimen, pumped full of Red Bull, then that little extra rate you get from a CRT might just be worth it in an intense FPS. For those living in the casual gaming world, doubtfully so. :D
RedBull, Battery, Dr Pepper, Coca Cola, you name it ^^

Max Slowik
Posts: 524
Joined: Sun Oct 22, 2006 7:39 pm
Location: Denver, Colorado USA
Contact:

Post by Max Slowik » Thu Jul 05, 2007 12:22 pm

Das_Saunamies wrote:If you're a genetically fresh specimen, pumped full of Red Bull, then that little extra rate you get from a CRT might just be worth it in an intense FPS. For those living in the casual gaming world, doubtfully so. :D
I've done some not-so-casually competitive gaming. LCDs are the display of choice on account of the better contrast and brightness. I can't stress enough that it's not physically possible to see much higher than 30Hz, although your brain can combine images at a higher frequency into composite images of a lower frequency, which is why a full 60FPS looks smoother than 30FPS; if games had any realistic motion blur whatsoever, the human eye would not be able to discern the difference between 30FPS and a higher framerate.

Given the right circumstances, a particularly trained eye can see the light reflecting off a bullet in flight, because of the way your eye interprets motion blur.

When you really get down to it, video games have, in a way, been designed from the ground up to make their relatively few artifacts noticeable.

Das_Saunamies
*Lifetime Patron*
Posts: 2000
Joined: Tue May 15, 2007 1:39 am
Location: Finland

Post by Das_Saunamies » Thu Jul 05, 2007 1:27 pm

If you can't simulate everything, do try and divert attention to the things you can. Or that's how I see it. Also, games focus on their given theme or objective, so the artifacts are usually case-relevant and for that reason alone get some sort of attention boost, both from makers and players.

I think a case in point would be the reviews of the Max Payne engine when it first came out. Great awe was had at how MaxFX could create individual scorch marks in spent shell casings, but when you played the game, the shells were pretty much the only interactive background. :lol:

But to be more precise, games are object(ive) driven, so no wonder they get so much attention, even to the degree of isolation(from surroundings).

And on the "natural frequency" and motion blur, I agree. Films have been shot at 24 FPS or so for all eternity, and they seem realistic to their audiences from the start, meaning that the natural rate of observation can't be far off that mark. But most graphics-intensive games do seem sluggish or jerky at below 40. Why is that?

I find this to be because of the other aspects involved: the controls stop responding well, the game client isn't as fast to display objects or predict events as the higher performance ones and the whole experience suffers. While the smoothness may be the brain's composite of scattered images, the entire experience is a more complex composite of all these little factors. Abundant data allows for natural redundancy, which translates to a smoother experience. If I can do 30 things in a second, and I miss one, I notice it. If I can do 60 and miss one again, it's only half as big a deal.

That's my theory anyway. I don't know how motion blurring or realistic visuals with sharper and blurred areas in the field of view would change our experience of the rates, but I do know that both add to a game's immersion, drawing attention away from the technical details, and from the fact that you're in a simulation and not a real environment. They at least do make FPS gaming a more thought-out affair -- à la Call of Cthulhu, FEAR, Battlefield 2142 -- instead of an adrenaline-fueled instagib zoomfest.

Sorry for the extra-long rant, I rewrote it twice, but you just have to say it all to explain your viewpoints. :oops:

RaptorZX3
Posts: 867
Joined: Sat Feb 11, 2006 11:57 pm
Location: Montreal, Quebec (Canada)

Post by RaptorZX3 » Thu Jul 05, 2007 3:04 pm

i tried Counter-Strike today (the normal one, not the Source one) and i didn't notice any tearing.

maybe frequent framerate drops can be the cause of frame tearing?

i noticed tearing in Colin McRae Dirt and in Second Life.

Second Life has more-or-less severe framerate drops, depending on what you see on-screen. And framerate drops in Colin McRae Dirt are frequent (even some reviews, like Gamespot's, say this, and i'm pretty sure they use top-notch computers to review their games)



update: i've been checking the kHz and the letters beside it, here are the results (with 1280x1024 60Hz setting)

1- 63.8 kHz PP
2- 64.0 kHz PN
3- 64.0 kHz PP

what does "PP" and "PN" mean?

is it normal, sometimes, to see "segments" of a window closing in Windows when you have an 8ms LCD monitor? (for example, when you click the X button, sometimes you can see 2 or 3 "segments" of that window in that tiny fraction of a second while it closes; maybe it's normal since an LCD is normally slower than a CRT)

using Windows XP SP2 by the way, with latest patches. Using latest Nvidia drivers (May 31, 2007), and the only official driver for my Samsung Syncmaster 730b (somewhere in 2005 i believe).

i also get some tearing in videos sometimes, fast action mostly.

so does that mean the tearing is in part due to fast-action and black-white color change?

does that mean i've grown too demanding for my "slow" 8ms LCD and should switch to a faster one? (yeah right, like i have the cash to cough up for a new monitor right now :cry: ) i really like the fact that LCDs have more brightness and are easier on the eyes. I think gamers tend to be too demanding about monitor speed.

and tell me....is there a hidden option in Windows XP to enable Vsync or something while you're in Windows?

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Fri Jul 06, 2007 8:51 am

ick.

I dunno. I mean my vsync is at 85Hz because i'm using a pro 22-inch CRT.

I keep wanting to get an LCD, but the cost of a 24-inch along with the timing and refresh issues makes me wonder. However, it would use a LOT less wattage and give off less radiation.

i'm a fan of low levels of radiation though. I think, however, that this form is not a good one.

Post Reply