
Vsync on a 60Hz monitor

September 8, 2012 8:39:36 AM

Hello,

I have a 60Hz monitor and a GPU that can push well over 60 fps in most games. My question is: since my monitor cannot display more than 60 fps, do I need to enable vsync in my games? I understand that there is tearing without vsync, but I've only seen that on 120Hz monitors that can display more than 120 fps. So if my monitor is incapable of displaying more than 60 fps, can my GPU still push frames beyond that and create tearing? I'm really confused about this...


September 8, 2012 8:53:32 AM

Yeah, definitely. Are you not noticing any tearing while gaming?
September 8, 2012 9:07:53 AM

No, not at all. The latest game I played was Syndicate and it was always pegged at 59-60 fps. I'd imagine it would have been 80-85 w/o the 60Hz limitation, but I still didn't notice tearing... Vsync surely wasn't on because the frames weren't dipping.
September 8, 2012 10:10:21 AM

I'm not sure I follow you...

The 60Hz limitation you speak of is Vsync... You will only see tearing if your framerate is above the refresh rate of your monitor.

If you were hovering around 59-60 fps, then Vsync was enabled.
September 8, 2012 11:01:47 AM

auntarie said:
No, not at all. The latest game I played was Syndicate and it was always pegged at 59-60 fps. I'd imagine it would have been 80-85 w/o the 60Hz limitation, but I still didn't notice tearing... Vsync surely wasn't on because the frames weren't dipping.


Why would the frames be dipping with vsync?
September 9, 2012 11:16:17 AM

paddys09 said:
I'm not sure I follow you...

The 60Hz limitation you speak of is Vsync... You will only see tearing if your framerate is above the refresh rate of your monitor.

If you were hovering around 59-60 fps, then Vsync was enabled.


Are you sure? I thought it was locked at 60 fps because 60Hz monitors can't display more...
September 9, 2012 11:20:10 AM

Sunius said:
Why would the frames be dipping with vsync?


It was explained to me that Vsync made the frames dip to about 30 or so whenever they exceeded the refresh rate of the monitor.
September 9, 2012 12:05:30 PM

auntarie said:
It was explained to me that Vsync made the frames dip to about 30 or so whenever they exceeded the refresh rate of the monitor.


No expert, but as long as the card always has a frame ready during the vertical blanking interval (or whatever you call it with modern monitors), you should run at 60 frames/second.

Where I think you run into artificial frame limiting is when your video card can't keep up with the monitor's display rate. In this case, you can have a lower than expected frame rate, because the monitor could be ready to display a frame, but the video card isn't ready, so the monitor will display the existing frame buffer. And the next frame has to wait to be displayed until the next refresh "window." I seem to recall that triple buffering can maybe fix this.
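
No real API here, just a toy Python sketch of the logic I have in mind (fixed render time, 60Hz refresh, every name made up):

def displayed_fps(render_ms, refresh_ms=1000 / 60, frames=600):
    # Simulate a double-buffered display loop with vsync on.
    gpu_done_at = 0.0  # when the GPU finishes its current frame
    shown = 0          # distinct frames actually put on screen
    for i in range(frames):
        vblank = i * refresh_ms               # the refresh "window"
        if gpu_done_at <= vblank:             # frame ready in time: swap buffers
            shown += 1
            gpu_done_at = vblank + render_ms  # vsync: GPU starts the next frame now
        # else: the monitor re-displays the existing frame buffer
    return shown / (frames * refresh_ms / 1000)

print(displayed_fps(10))  # card could do 100 fps -> you see 60
print(displayed_fps(20))  # card could do 50 fps  -> you see 30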

There's an in-depth article on this topic I read some time ago.
September 9, 2012 12:12:00 PM

@MichaelJHuman
The way I understood it, tearing happens when the GPU is too fast for the monitor. This means that while the monitor is still displaying, say, frame 1, frames 2 and 3 are already rendered, so as frame 1 finishes and frame 2 gets displayed, frame 3 has already been sent to the monitor, and the monitor starts displaying frame 3 before frame 2 is fully drawn. This means both frames end up on screen at the same time. At least that's sort of how I understood it.
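
If I'm picturing it right, the tear is just the scanout being split between two frames at whatever row the buffer swap happened. A toy Python sketch of that (made-up numbers, not how any real display works internally):

# The monitor sends rows top to bottom; the GPU swaps buffers mid-scanout.
ROWS = 1080
swap_row = 400  # pretend frame 3 became ready while row 400 was being sent

scanout = ["frame 2" if row < swap_row else "frame 3" for row in range(ROWS)]
print(scanout[0], scanout[-1])  # top of the screen: frame 2, bottom: frame 3
# The visible seam at row 400 between the two frames is the tear.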
September 9, 2012 12:14:49 PM

Right, that's what causes tearing. I was trying to explain how vsync could limit fps if the GPU can't keep up with the monitor refresh, not the opposite issue, which causes tearing.
September 9, 2012 12:18:52 PM

Oh, sorry, I didn't understand it that way. Yes, I agree with you; that's the only way I can think of that Vsync would drop framerates. But in that case you wouldn't need Vsync in the first place, so the OP shouldn't have to worry about it?

Best solution

September 9, 2012 12:23:04 PM

auntarie said:
Are you sure? I thought it was locked at 60 fps because 60Hz monitors can't display more...


Yes, I'm sure... If you disable Vsync you'll start getting more than 60 fps, but you'll start experiencing screen tearing.

Vsync can also be enabled in the Nvidia or AMD control panel, and if it is... disabling it in-game won't make a difference.

Vsync may lock you down to 30 fps if your card isn't capable of providing a constant 60 fps, to prevent lag spikes; this is why Nvidia developed Adaptive Vsync.

It's not just a setting that randomly drops your fps to 30 every now and again; why would that exist?
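
If it helps, here's the arithmetic behind that lock as I understand it, in a toy Python sketch (the adaptive function is just my rough model of what Adaptive Vsync does, not Nvidia's actual implementation):

import math

def vsync_fps(gpu_fps, refresh_hz=60):
    # With vsync on, a frame can only appear on a refresh boundary, so the
    # effective frame time rounds UP to a whole number of refresh intervals.
    refresh_ms = 1000 / refresh_hz
    frame_ms = 1000 / gpu_fps
    waited_ms = math.ceil(frame_ms / refresh_ms) * refresh_ms
    return 1000 / waited_ms

def adaptive_vsync_fps(gpu_fps, refresh_hz=60):
    # Rough model: sync only while the GPU keeps up with the refresh rate;
    # below that, switch vsync off and accept possible tearing.
    return vsync_fps(gpu_fps, refresh_hz) if gpu_fps >= refresh_hz else gpu_fps

print(vsync_fps(85))           # -> 60: capped at the refresh rate
print(vsync_fps(50))           # -> 30: locked down to the next step
print(adaptive_vsync_fps(50))  # -> 50: no lock, but tearing is possible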




September 9, 2012 1:12:19 PM

paddys09 said:
Yes, I'm sure... If you disable Vsync you'll start getting more than 60 fps, but you'll start experiencing screen tearing.

Vsync can also be enabled in the Nvidia or AMD control panel, and if it is... disabling it in-game won't make a difference.

Vsync may lock you down to 30 fps if your card isn't capable of providing a constant 60 fps, to prevent lag spikes; this is why Nvidia developed Adaptive Vsync.

It's not just a setting that randomly drops your fps to 30 every now and again; why would that exist?


I see, thanks. By the way, my vsync is application-controlled at the moment; would you recommend setting it to "on" in the control panel?
September 9, 2012 1:24:18 PM

No, you can assign it to applications or just use the built-in vsync in games.

No worries, glad to clear it up for you :)
September 9, 2012 1:35:09 PM

I personally force adaptive vsync in control panel.
September 9, 2012 1:41:35 PM

Sunius said:
I personally force adaptive vsync in control panel.


I used to force it in the control panel too, but I noticed in Minecraft that I get no tearing at all even though I'm getting 300+ fps at some points.

Anyone know why that is?
September 9, 2012 1:43:19 PM

It really depends on how the game is coded.

Running 300 fps is pretty useless, unless your room is too cold to be in :D 
September 9, 2012 1:55:05 PM

lol, I know, but I'm still curious why this happens; it appears to happen only in Minecraft.

It definitely does feel a lot more fluid at 150 compared to 60, and it's definitely displaying all the frames, as Fraps and Afterburner seem to agree with it.

Is it to do with how the frames are loaded?
September 9, 2012 2:02:26 PM

As for it being more fluid - I bet that's the placebo effect. Your monitor cannot display more than 60 fps even though the card may generate more than that.
September 9, 2012 2:13:00 PM

Yeah... It's the only occasion where Vsync has confused me, though, and I thought it would be worth posting.

Maybe forcing vsync in Minecraft causes some sort of lag due to the way it's coded? I'm not going to let it bother me because it works, but I still don't understand why; some sort of buffering or frame selection?

I guess I'll never figure it out, so I'll give it the only plausible answer...

Aliens!
September 9, 2012 2:20:11 PM

It might cause input lag, but not real lag for sure.
September 9, 2012 2:34:45 PM

Yeah, it must be, because I'm having the same issue when forcing vsync on my 120Hz monitor instead of the TV. Definitely input lag, as the frames do stay around 60/120.

Ah well, guess I'll never know.
September 9, 2012 3:32:40 PM

What if it's because 300 is a multiple of 60, i.e. every 5th frame is getting displayed? Just one of those random thoughts...
September 9, 2012 3:56:30 PM

bigbasedrum said:
What if it's because 300 is a multiple of 60, i.e. every 5th frame is getting displayed? Just one of those random thoughts...

It doesn't really work quite like that... (well, it does if the game is OpenGL-based and uses triple buffering; DirectX doesn't support it natively).
With triple buffering, the monitor gets the latest full image every time it refreshes. Without triple buffering, when the fps is higher than the refresh rate of the monitor, the card will eventually overwrite the image buffer while the previous image is being sent, causing the image shown on the monitor to consist of two different images with a slight time difference between them: the upper part is from the older image and the bottom is from the newer one, so stuff has moved about, and this shows up as tearing. Triple buffering adds another image buffer whose contents get sent to the monitor; once the transmission is done, that buffer refreshes itself from the other buffer, which is constantly updated by the GPU, eliminating tearing.

Vsync just tells the card to wait for the transmission to complete before it updates the buffer. With some games this can cause slight stuttering if the in-game time between the shown images varies considerably, and also input lag because of the waiting. Now, if the card isn't fast enough to keep up with, say, a 60Hz refresh rate, the old image from the buffer gets sent twice (or even three or more times) in a row, effectively halving the fps.
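
A toy Python sketch of that difference (fixed render time, made-up numbers; not how any real driver is written):

def displayed_fps(render_ms, refresh_ms=1000 / 60, frames=600, triple=False):
    # Double buffering + vsync: the card stalls until the swap, so a slow
    # frame makes the monitor resend the old image and the fps quantizes
    # down. Triple buffering: the card keeps rendering into the spare
    # buffer, and each refresh sends the newest completed frame.
    gpu_done_at = 0.0
    shown = 0
    for i in range(frames):
        vblank = i * refresh_ms
        if gpu_done_at <= vblank:  # a newly completed frame is available
            shown += 1
            if triple:
                while gpu_done_at <= vblank:  # card never waited; the next
                    gpu_done_at += render_ms  # frame is already in flight
            else:
                gpu_done_at = vblank + render_ms  # card waited for the swap
        # else: the old image gets sent again (twice, three times, ...)
    return shown / (frames * refresh_ms / 1000)

print(displayed_fps(20))               # ~30 fps: vsync halves the rate
print(displayed_fps(20, triple=True))  # ~50 fps: no stall, still no tearing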
September 9, 2012 8:02:58 PM

Thanks Kari, I had the feeling it was something along those lines; I wasn't 100% sure about how triple buffering actually worked.

Really cleared that up for me and a few other things I had been thinking about. Cheers :) 
September 10, 2012 11:50:32 AM

Best answer selected by auntarie.
September 10, 2012 2:12:13 PM

This topic has been closed by Mousemonkey