Vsync on or Vsync off?? How do you roll?



Total: 139 votes (39 blank votes)

  • Vsync on... (53 %)
  • Vsync off :O (47 %)
April 7, 2008 11:13:04 PM

It's one of those epic questions, Vsync on, or Vsync off.

Vsync on:
-Perfectly aligned frames, no tearing.
-Less likely to crash.
-Locked at your monitor's refresh rate. Usually capped at 60 FPS (mine goes to 75 FPS, though... somehow)
-FPS can bog down more than with vsync off.

Vsync off:
-Virtually unlimited FPS.
-More likely to crash, but the difference is generally pretty low. (It depends on the card and the game)
-FPS does not bog down.

I'm actually interested to see what the results are. I personally keep vsync on, as I like the frames to be smooth as silk. As far as I know the human eye can't see more than 40 FPS, but I'm posting this to see other people's point of view. Maybe there's something I'm missing.
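
If you want to test this for yourself rather than take a side, toggling vsync is usually a one-line request to the driver. Here is a minimal sketch, assuming Python with pygame 2.x (the window size and fill color are arbitrary, and the vsync flag is only a request that the driver or control panel can still override); it just opens a window and shows the frame rate you actually get:

    import pygame

    pygame.init()
    # vsync=1 asks the driver for vertical sync; it requires the SCALED or
    # OPENGL flag and is a request, not a guarantee.
    screen = pygame.display.set_mode((800, 600), pygame.SCALED, vsync=1)
    clock = pygame.time.Clock()

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        screen.fill((30, 30, 30))
        pygame.display.flip()   # with vsync honoured, flip() waits for the next refresh
        clock.tick()            # no FPS cap here, just measure what we actually get
        pygame.display.set_caption(f"{clock.get_fps():.0f} FPS")

    pygame.quit()

With vsync honoured the caption should hover around the monitor's refresh rate; change vsync=1 to vsync=0 and it will climb as high as the machine allows, tearing and all.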

April 7, 2008 11:20:32 PM

I leave it off. It really serves no purpose. When the frame rate is low enough that you see tearing with vsync off, turning it on will drop the frame rate further, which makes no sense. When the frame rate is high, you won't see tearing anyway, so there's no point in restricting it. :p 
April 7, 2008 11:35:16 PM

doomsdaydave11 said:
It's one of those epic questions, Vsync on, or Vsync off.
Maybe there's something I'm missing.



You are missing the fact that what you ask is a false dilemma.

Let your immediate needs determine if you need V-sync on or off, based on the advantages and disadvantages.

It takes mere seconds and zero effort to turn V-sync on or off and judge the results accordingly.

April 7, 2008 11:37:59 PM

As far as I know, the eye can notice differences from 0 up to about 120 FPS (I know I can tell the difference between 60 FPS and 85 FPS). This is an interesting read: http://www.100fps.com/how_many_frames_can_humans_see.ht...

My personal favorite is vsync on, because of all the positives stated in the top post, plus I think 60 FPS is a lot more playable than tearing. In fact, I consider tearing barely playable, it's that horrible to look at.

doomsdaydave11 said:
Locked at your monitor's refresh rate. Usually capped at 60 FPS (mine goes to 75 FPS, though... somehow)

Maybe your monitor's refresh rate is 75 Hz. I used to run at 85 Hz on my old CRT, so I had 85 FPS.

dagger said:
I leave it off. It really serves no purpose. When the frame rate is low enough that you see tearing with vsync off, turning it on will drop the frame rate further, which makes no sense. When the frame rate is high, you won't see tearing anyway, so there's no point in restricting it. :p 

Bollocks.

EDIT: I think this is an interesting topic and can't wait to see the end results for the poll. :lol: 
April 7, 2008 11:43:14 PM

bf2gameplaya said:
You are missing the fact that what you ask is a false dilemma.

Let your immediate needs determine if you need V-sync on or off, based on the advantages and disadvantages.

It takes mere seconds and zero effort to turn V-sync on or off and judge the results accordingly.
That's why I stated "maybe there's something I'm missing". When I turn Vsync off in CSS, it looks like crap. There is horizontal tearing everywhere. When I turn it back on, it looks fine. Sure, the frame rate goes up to 120 on occasion, but the 75 FPS that I normally get instead is fine... I can't tell the difference. Like some people have said here, they can't see the tearing, but I can. In other games I can see the tearing too.

April 7, 2008 11:53:55 PM

In my drivers, it's set to application-managed. Can't say I've ever messed with it one way or another.
April 8, 2008 12:22:02 AM

Vsync brings noticeable improvements for me in COD4 at high res, but I go back and forth.
April 8, 2008 12:36:07 AM

I had a game that would tear horribly if V-Sync was off, so I turned it on. I was lazy, so it never got turned off, so my vote is for on.
April 8, 2008 12:46:38 AM

Sharft6 said:
Maybe your monitor's refresh rate is 75 Hz. I used to run at 85 Hz on my old CRT, so I had 85 FPS.


Yeah, my old CRT did too. I was just assuming that everyone had LCDs these days ;) 
April 8, 2008 1:00:13 AM

Not everyone. I can't stand the buggers, CRT until my flaming eyeballs fall out.
April 8, 2008 1:07:03 AM

But if the monitor can only refresh 60 times per second (or 75 or 85) does it actually make a difference that your video card is outputting 120fps? I'm missing something here.
April 8, 2008 1:09:48 AM

No you're not. You are absolutely right. Sort of blows a big hole in the "I can see the difference between 100FPS and 120FPS" argument. You can really only say that if your monitor can output at 120Hz, which few can.
April 8, 2008 1:12:22 AM

chris312 said:
But if the monitor can only refresh 60 times per second (or 75 or 85) does it actually make a difference that your video card is outputting 120fps? I'm missing something here.


Yeah, that's a good point. The only reason I could notice the difference between 60 and 85 was because my monitor's max refresh rate was 85.
April 8, 2008 1:18:18 AM

Yeah, I have an LCD monitor with a refresh rate of only 60 Hz. I'm no videophile, though, so anything above 30 FPS looks good enough for me.

On-topic, though, I have never had any problems with tearing so I completely ignore Vsync unless I want to see exactly how much more I can turn up the graphics settings or do some kind of benchmark.
April 8, 2008 1:19:00 AM

EXT64 said:
No you're not. You are absolutely right. Sort of blows a big hole in the "I can see the difference between 100FPS and 120FPS" argument. You can really only say that if your monitor can output at 120Hz, which few can.



That's what I always thought... then why do people turn off Vsync if it doesn't really change anything, except adding horizontal tearing, which makes it look worse?
April 8, 2008 1:42:51 AM

Most games on... Source games don't always like it (it seems to cause lag spots near fire, oddly enough), so it's off there :( 

And to some above... high FPS DOES still show the tearing effect.
April 8, 2008 1:50:29 AM

I use vsync in every game I play. The tearing looks awful sometimes...

The arguments on here about your monitor only being capable of 60 FPS anyway are right on. And anyway, whether or not the eye can tell the difference between 50 and 60 frames per second, I'd choose 50 any day if it meant the image wasn't literally being torn to hell.
April 8, 2008 2:32:37 AM

Depends on the game I suppose. I recall turning v-sync off in a space sim and the graphics became distorted. Switched it back on and the graphics were fine again.
April 8, 2008 4:14:17 AM

rayzor said:
I use vsync in every game I play. The tearing looks awful sometimes...

The arguments on here about your monitor only being capable of 60 FPS anyway are right on. And anyway, whether or not the eye can tell the difference between 50 and 60 frames per second, I'd choose 50 any day if it meant the image wasn't literally being torn to hell.


Exactly my thoughts on the matter.
April 8, 2008 4:35:12 AM

I have vsync on.
April 8, 2008 4:50:57 AM

I generally run with VSync app-controlled, BUT if I'm disappointed or not satisfied with my gaming performance, I may try adjusting it (on/off). I'll do this because of what I read from Anand's: "Every time I set up a system, because I want to ensure maximum performance, the first thing I do is force VSYNC off in the driver. I also generally run without having the graphics card scale output for my panel; centered timings allow me to see what resolution is currently running without having to check. But I was in a hurry on Sunday and I must have forgotten to check the driver after I set up an 8800 Ultra SLI for testing Crysis.

Lo and behold, when I looked at the numbers, I saw a huge performance increase. No, it couldn’t be that VSYNC was simply not forced off in the driver, could it? After all, Crysis has a setting for VSYNC and it was explicitly disabled; the driver setting shouldn’t matter.

But it does.

Forcing VSYNC off in the driver can decrease performance by 25% under the DX10 applications we tested. We see a heavier impact in CPU limited situations. Interestingly enough, as we discussed last week, with our high end hardware, Crysis and World in Conflict were heavily CPU and system limited
" Heres the link from the 9800GTX review http://www.anandtech.com/video/showdoc.aspx?i=3275&p=4 So evidently it can make a huge difference, especially in certain situations
April 8, 2008 5:22:53 AM

Always on, even in Crysis! Any tearing is bad news to me
April 8, 2008 5:56:28 AM

I agree, tearing fails more than low FPS. I think it's harder to see what's happening when the picture busts in half vs. the slideshow effect of low FPS. I'd rather have it on regardless.
April 8, 2008 8:18:11 AM

doomsdaydave11 said:
That's what I always thought... then why do people turn off Vsync if it doesn't really change anything, except adding horizontal tearing, which makes it look worse?


It's a little more than just locking the framerate to 60: it also locks it to divisors of 60 (60, 30, 20 and so on). So if you have a card that can dance along at 120 FPS, you have spare horses and can enable V-Sync, and maybe triple buffering as well if you like (this helps ensure the card has a frame ready when the monitor asks for one). But if your hardware is maxing out at around 60 FPS, it's likely that when things get busy in game the frames will drop below that, and with V-Sync on they can only jump between those set numbers, whereas with V-Sync off they are free to land anywhere in between.
Mactronix
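
To make the "divisors of 60" point concrete, here is a rough back-of-the-envelope model, assuming Python, a fixed 60 Hz refresh, double buffering, and a perfectly steady render time (so it is an idealized sketch, not a benchmark of any real card). Each finished frame can only be shown on a refresh tick, so the displayed rate snaps to 60, 30, 20 and so on:

    import math

    REFRESH_HZ = 60.0

    def vsync_fps(raw_fps, refresh_hz=REFRESH_HZ):
        # Seconds the card needs per frame vs. seconds between refreshes.
        render_time = 1.0 / raw_fps
        refresh_period = 1.0 / refresh_hz
        # The frame is shown on the first refresh after it finishes rendering,
        # so each frame occupies a whole number of refresh intervals.
        refreshes_per_frame = math.ceil(render_time / refresh_period)
        return refresh_hz / refreshes_per_frame

    for raw in (120, 75, 61, 59, 45, 31, 29):
        print(f"card could do {raw:3d} FPS -> vsync shows {vsync_fps(raw):.0f} FPS")

Running it shows 120, 75 and 61 FPS all collapsing to 60, while 59 and 45 both collapse to 30, which is exactly the "jump between set numbers" behaviour described above.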
April 8, 2008 10:13:56 AM

I agree with most people, tearing looks horrible in most games, so Vsync has to be on when I play.

I also don't understand why people say tearing doesn't happen at higher frame rates, because that's BS. If Vsync is off I see tearing at any frame rate in all the games I play.
April 8, 2008 11:23:35 AM

I don't bother with it.
Occasionally I do see tearing, but it is very minor and unless you're actively looking for it, I don't think it's particularly noticeable.
April 8, 2008 12:23:06 PM

I see it constantly. I have to have Vsync on. It happens at all times, at all framerates. There's always a differential between the upper and lower halves of the frame. It drives me nuts.
April 8, 2008 12:37:09 PM

I very rarely see it, and since I'm usually too engrossed in FPS games it never bothers me.
Halo 2 on PC, alright, I switched it on for that one, but otherwise it's always off.
April 8, 2008 9:39:04 PM

I'll say it again: your specific and immediate circumstances determine if you need V-sync on or off, based on your gear, your game and whether you are playing the game online or not. You can change it on the fly, or force it in the drivers; you can change back and forth as often as you want, whenever you want. There are no restrictions, just considerations.

What's more, you LCD users who have never had a CRT that can hit refresh rates up to 160 Hz have absolutely no idea what you are missing in terms of a rock-solid, vibrant and color-accurate image.

V-sync was designed with CRTs in mind and its operation is well understood; that *you* do not understand how it works or when it should be applied is your failing, not V-sync's.
April 8, 2008 10:40:03 PM

Hey bf2gameplaya,
What about those of us who play on an LCD with a 120 Hz refresh rate, a Sony Bravia 40" XBR4?
So "us LCD users", eh? And you want to play up the dichotomy between LCD users and CRT users... Well, you can take that 160 Hz CRT screen and shove it (if it will fit).
I think we know what we're missing... especially since you need enough space on your desk and/or enjoy looking at the side of a refrigerator. Keep it, buddy, and send me a .jpeg of your happy face during your happiest hour on it, and we'll try to make you an affiliate/representative for the company you got it from.

Easy on the brown clown brick packed bud your token..

BW11
April 8, 2008 10:55:01 PM

Yes, because the screen updating faster makes your colors more vibrant. Also, to get all 64 bits on your processor working, you have to have at least 768MB VRam.
April 9, 2008 2:21:35 AM

jkflipflop98 said:
Yes, because the screen updating faster makes your colors more vibrant. Also, to get all 64 bits on your processor working, you have to have at least 768MB VRam.


Come again??

Anyway, in response to bf2gameplaya, you are correct. We can all debate the technical issues till we are blue in the face. However, for me (an ignorant LCD user :) ) I've found tearing to be so frequent that I'd rather just leave vsync on as a rule. There honestly seems to be no point in ever disabling it unless you experience some weird performance-killing bug that is really messing with your frame rate.
April 9, 2008 2:33:25 AM

bf2gameplaya is right, CRTs display more colors than 90% of the LCDs on the market... and IMO they generally look much better for games (more than anything due to their excellent response times), but since they are getting harder to find, I'm stuck going the LCD route. I don't miss the heat and power my CRT used, though :) 

Blackwater11, I sure hope Sony is better with their technology now. I had a very expensive Sony screen (the most expensive thing they had at the time, just because I wanted to make sure what I got was good) and both the colors and the response sucked hard. It looked great for TV/DVD, with the darkest blacks I ever did see on an LCD, but it took a lot of power to do that since they used a very dark coating on the screen and needed lots of backlight to make it work; looks are deceiving until you pull out Photoshop and game on it. As a computer screen it sucked, so I took it back to the store. Making the blacks truly black was a great idea, and the screen was shiny and looked very bright, but the color banding and slow response hurt it.

jkflipflop98 said:
Yes, because the screen updating faster makes your colors more vibrant. Also, to get all 64 bits on your processor working, you have to have at least 768MB VRam.

Clear sarcasm. It's the screen that's bright and vibrant, but those attributes do not make for true, accurate color.
April 10, 2008 6:45:28 AM

Either my LCD owns (SyncMaster 2232BW) or my CRT sucked (and it was the best CRT I've had by far). Sure, my LCD has its drawbacks, but the clarity, size, shape and colors are waaay better.

Also, bf2gameplaya, what do you mean vsync was designed with CRTs in mind? I thought it was designed with FPS > refresh rate in mind. If I have your confusing sentences mixed up, then feel free to elaborate further.
April 10, 2008 8:07:35 AM

Why would you not?

60 FPS is fine, and the image quality is way better without tearing.
April 10, 2008 9:47:34 AM

If I'm consistently running more than 60 FPS I have vsync on to avoid tearing, but if I've got low FPS vsync decreases my FPS further, so then I disable it and just live with the tearing. Personally I prefer vsync if I can get my FPS nice and high.
April 10, 2008 1:56:31 PM

Indeed... vsync makes games look good... but for me keeping it on carries a very high price. On my 9600GT, Assassin's Creed runs at 130 FPS without vsync and 25 FPS with vsync... is that normal?
April 10, 2008 1:58:28 PM

Sometimes it'll halve your lowest FPS, so yeah, possibly.
April 10, 2008 3:01:47 PM

Depends on the game. In some games you can barely notice the tearing, and the penalty for turning vsync on is huge. In others the tearing is unbearable, and Vsync on runs just as fast.
April 10, 2008 5:01:32 PM

RE "what the eye can see" i was watching that human body show on the discovery channel, and they in fact said under normal conditions the human eye only interprets 30 fps. On the other hand they were explaining the phenomenon that people in stressful situations say that something that maybe took a few seconds seemed so much longer. Under extreme stress your body will interpret 60fps and beyond, the reason behind this is so you can process visual information faster and decrease the time it takes you to make a decision. They did a test with this, they handed a screen to potential bungee jumpers that were going on their first jump were to hold. It was counting through an array of numbers and it was impossible for them to make out the numbers or what order they were in. So when they jumped (reading the screen on the way down) they were stressed and able to tell the individual numbers and what order they came in. (the screen was like a digital clock repeating the same array of numbers)

As for that link from a previous poster, I am not saying its wrong, but I see no scientific backing for it on there. I would tend to agree with the Discovery channel (assuming they wouldnt want to get called out putting false information on a learning channel)
April 11, 2008 1:51:30 AM

I think interpreting each frame in enough detail to identify a specific number is different from noticing how fast a picture is changing.

Also, I just did a quick test with Source SDK Base - Orange Box, and my lowest FPS was actually higher by 1 with vsync on, but in general vsync definitely took my FPS down by about 5-10 FPS. I'm just curious to know what causes this... anyone??
April 11, 2008 3:06:20 AM

I saw that episode too, tsd16. It was extremely interesting.

I always have vsync on, even though the Nvidia control panel says it might decrease performance.
Screen tearing blows, so vsync is the way to go.

Also, I have never seen an LCD that could display blacks like my old CRT could.
April 11, 2008 3:11:03 AM

Vsync matches the framerate from the card with the possible refresh rates of your monitor. Since it can't speed the card up to match, the only thing it can do is slow it down.
April 11, 2008 5:48:04 AM

Sharft6 said:
Either my LCD owns (SyncMaster 2232BW) or my CRT sucked (and it was the best CRT I've had by far). Sure, my LCD has its drawbacks, but the clarity, size, shape and colors are waaay better.

Also, bf2gameplaya, what do you mean vsync was designed with CRTs in mind? I thought it was designed with FPS > refresh rate in mind. If I have your confusing fother mucking sentences mixed up, then feel free to further express your opinions.


Your CRT sucked. But that's not important. Neither is your appalling lack of education, since I can educate you on this subject:

http://en.wikipedia.org/wiki/Vertical_synchronization

You're welcome.
April 11, 2008 6:19:31 AM

I come to these forums to be educated :D Maybe you could help me with my question, bf2gameplaya? I understand that the card either has a frame ready when the refresh comes or it doesn't. Where does the big FPS drop come from?

EXT64 said:
Vsync matches the framerate from the card with the possible refresh rates of your monitor. Since it can't speed the card up to match, the only thing it can do is slow it down.


Apart from slowing the FPS down slightly, I don't understand how taking it down by 5-10 FPS helps. I'm not afraid of walls of text, so feel free to be a bit more keyboard-happy with your explanations :) 
April 11, 2008 8:17:37 AM

Sharft6 said:
I come to these forums to be educated :D Maybe you could help me with my question, bf2gameplaya? I understand that the card either has a frame ready when the refresh comes or it doesn't. Where does the big FPS drop come from?



Apart from slowing the FPS down slightly, I don't understand how taking it down by 5-10 FPS helps. I'm not afraid of walls of text, so feel free to be a bit more keyboard-happy with your explanations :) 


This is a brief explanation that I posted earlier in the thread; if you still have questions I will be happy to elaborate a bit more.
It's a little more than just locking the framerate to 60: it also locks it to divisors of 60 (60, 30, 20 and so on). So if you have a card that can dance along at 120 FPS, you have spare horses and can enable V-Sync, and maybe triple buffering as well if you like (this helps ensure the card has a frame ready when the monitor asks for one). But if your hardware is maxing out at around 60 FPS, it's likely that when things get busy in game the frames will drop below that, and with V-Sync on they can only jump between those set numbers, whereas with V-Sync off they are free to land anywhere in between.
Mactronix
April 11, 2008 10:50:59 AM

Yeah, I kinda missed the purpose of locking the FPS into divisors.

I'm glad you're prepared to elaborate :D 
April 11, 2008 11:29:00 AM

Sharft6 said:
Yeah, I kinda missed the purpose of locking the FPS into divisors.

I'm glad you're prepared to elaborate :D 


Rather than type it all out from memory and probably miss some things, here is an extract from "TweakGuides".
I have supplied the link to the whole article at the bottom.

There is however a more fundamental problem with enabling VSync, and that is it can significantly reduce your overall framerate, often dropping your FPS to exactly 50% of the refresh rate. This is a difficult concept to explain, but it just has to do with timing. As we know, when VSync is enabled, your graphics card pretty much becomes a slave to your monitor. If at any time your FPS falls just below your refresh rate, each frame starts taking your graphics card longer to draw than the time it takes for your monitor to refresh itself. So every 2nd refresh, your graphics card just misses completing a new whole frame in time. This means that both its primary and secondary frame buffers are filled, it has nowhere to put any new information, so it has to sit idle and wait for the next refresh to come around before it can unload its recently completed frame, and start work on a new one in the newly cleared secondary buffer. This results in exactly half the framerate of the refresh rate whenever your FPS falls below the refresh rate.



As long as your graphics card can always render a frame faster than your monitor can refresh itself, enabling VSync will not reduce your average framerate. All that will happen is that your FPS will be capped to a maximum equivalent to the refresh rate. But since most monitors refresh at 60Hz or above, and in most recent games it is difficult to achieve 60FPS consistently at your desired resolution and settings, enabling VSync usually ends up reducing your FPS. Fortunately, because this problem is pretty much caused by the frame buffers becoming filled up, there is a solution and that's to enable a third frame buffer to allow more headroom. However this is not a straightforward solution, and to read more about this see the Triple Buffering section.

http://www.tweakguides.com/Graphics_9.html

Happy reading
Mactronix :) 
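
As an illustration of the TweakGuides argument, here is an idealized sketch in Python contrasting the two cases (it assumes a fixed 60 Hz refresh and a perfectly steady render time, models triple buffering as simply capping at the refresh rate, and ignores the extra input latency triple buffering can introduce):

    import math

    REFRESH_HZ = 60.0

    def double_buffered(raw_fps):
        # With both buffers full, the card stalls until the next refresh,
        # so each frame takes a whole number of refresh intervals.
        return REFRESH_HZ / math.ceil(REFRESH_HZ / raw_fps)

    def triple_buffered(raw_fps):
        # The spare buffer lets the card keep rendering; output is only
        # capped at the refresh rate.
        return min(raw_fps, REFRESH_HZ)

    print("card FPS | double buffered | triple buffered")
    for raw in (90, 65, 59, 50, 40, 35, 25):
        print(f"{raw:8.0f} | {double_buffered(raw):15.0f} | {triple_buffered(raw):15.0f}")

Dropping from 65 to 59 FPS costs nothing with triple buffering but halves the output to 30 with plain double-buffered V-Sync, which matches the 50% drop the extract describes.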
April 12, 2008 12:39:47 AM

mactronix said:
This is a brief explanation that I posted earlier in the thread; if you still have questions I will be happy to elaborate a bit more.
It's a little more than just locking the framerate to 60: it also locks it to divisors of 60 (60, 30, 20 and so on). So if you have a card that can dance along at 120 FPS, you have spare horses and can enable V-Sync, and maybe triple buffering as well if you like (this helps ensure the card has a frame ready when the monitor asks for one). But if your hardware is maxing out at around 60 FPS, it's likely that when things get busy in game the frames will drop below that, and with V-Sync on they can only jump between those set numbers, whereas with V-Sync off they are free to land anywhere in between.
Mactronix


That isn't true at all. My 8800GT doesn't drop from 60 to 30 ever. It does go down to 55 sometimes when things get really demanding, but that's about it.
April 12, 2008 3:20:24 AM

Many Thanks!

I noticed in WoW that when I have vsync enabled but not triple buffering, my FPS is almost halved in some places. Enabling triple buffering immediately bumped the FPS back up to 50-60.

I'm still not sure where the whole locked-into-divisors-of-60 thing comes from, because the writer dismisses it as a difficult concept to explain. I don't REALLY want to know the answer, otherwise I would do the research myself; I just brought it up thinking somebody might already know what's going on and be able to share the explanation with us.