V-Sync NOT switching between 60 and 30 fps?

Deus Gladiorum

Distinguished
Hey guys, I've always been curious about something. Now, I know V-Sync prevents screen tearing by locking your frame rate to your screen's refresh rate.

However, what I've also heard is that it does this by locking your frame rate to integer divisors of your refresh rate whenever you fall below your maximum, i.e. if you have a 60 Hz monitor and you fall below 60 fps, V-Sync will allegedly cap your frame rate to 30 fps; if you fall below that, then 20 fps; below that, 15 fps; and so on.

However, this is only what I've heard happens; I've never actually experienced it. Of all the times I can remember, whenever I've enabled V-Sync it never locks to anything except 60 fps. If I fall below that, I simply fall below it, and that's it. I've tested this on a very capable Nvidia-based gaming PC which I know offers Adaptive V-Sync as an option, but I've also tested it on some seriously underpowered AMD- and Intel-GPU-based systems, which as far as I know have nothing like Adaptive V-Sync. So how is it that when my fps falls under 60, I never lock to 30?
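To put concrete numbers on the behaviour I'm describing, here's a tiny Python snippet (purely illustrative, and assuming the alleged lock-to-divisors behaviour is real): on a 60 Hz screen, the only frame rates you'd ever see would be 60 divided by a whole number.

```python
# Illustrative only: the frame rates V-Sync is said to lock to on a 60 Hz screen.
# Each frame would have to stay up for a whole number of refresh intervals,
# so the only possible average rates would be 60 / n.
refresh_hz = 60
for n in range(1, 5):
    print(f"{n} refresh(es) per frame -> {refresh_hz / n:.0f} fps")
# prints 60, 30, 20 and 15 fps
```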
 

I used to have a Radeon 5850 card, and when V-Sync was enabled it didn't seem to lock to 60 or 30 fps.
I have now upgraded to a GTX 770. With V-Sync enabled, if the frame rate drops below 60 it seems to drop to 30 fps.
I don't know if this is specific to Nvidia cards or whether it's down to a particular game I've only played on the new card.

 

adimeister

Honorable
Really...? I want to hear more about this too. haha Because I have V-Sync ON, and playing Batman: Arkham Origins it runs at 55 to 60 FPS. I always look at the frame rate counter when in a battle. LOL I've seen it go down to 30 fps when I go through doors or out into the open world. I have a weak CPU, which is why it dips to 30 fps when loading new areas. haha
 

Deus Gladiorum

Distinguished
Yea, I have a GTX 770 as well, VincentP, but in no game I play do I lock to 30 fps. And like you, adimeister, when I play Arkham City the exact same thing happens because I have a powerful GPU paired with a weak FX-6300, but again, I fluctuate and never lock. Yet this locking is how it's reported to behave.

I know this is something that's alleged to happen, since the whole purpose behind Nvidia's Adaptive V-Sync is to get rid of it, but as I said, it doesn't lock on GPUs from AMD or Intel either.
http://www.geforce.com/hardware/technology/adaptive-vsync/technology

It's also something I've read elsewhere on sites other than Nvidia, though I can't find it at the moment.
 
I have heard this too but have never seen it happen on any of my 3 rigs. Most of the time my games run at a full 60 fps V-Sync'ed, and on the few occasions where the frame rate has dropped a bit below that, it sits somewhere in the 50-60 fps range but never drops to 30 or so. And I have a combination of AMD/ATI and Intel/Nvidia based rigs.
 

Deus Gladiorum

Distinguished
Just an update, guys, for anyone who might be interested:
It's 7 am in my time zone and I just got my hands on a copy of ACIV. I boot it up and realize that I'm jumping between 30 fps and 60 fps with absolutely no middle ground between the two. So I'm guessing it's pretty safe to say this is completely game dependent. I'm about to turn on Adaptive V-Sync, but in case you guys are curious, one game that will assuredly lock your frame rate to divisors of your monitor's refresh rate is ACIV. It's quite interesting.
 

Deus Gladiorum

Distinguished
Yea, I turned on Adaptive V-Sync, but there are so many dips below 60 fps with max settings (aside from PhysX, which I keep completely off because it drops my fps to sub-20) that there are constant periods of screen tearing. It's making me crave G-Sync badly :\ Ah, the spectrum of human wants is limitless, isn't it? Ah well, the game looks gorgeous anyhow.
 

michaelmk86

Distinguished
V-Sync ON at 60 fps (or more)

16.7ms, 16.7ms, 16.7ms, 16.7ms, 16.7ms, 16.7ms, 16.7ms, and so on
meaning that in every second (1000ms) 60 frames are displayed on the screen.

V-Sync ON at 52 fps

16.7ms, 16.7ms, 16.7ms, 16.7ms, 16.7ms, 33.3ms, 16.7ms, and so on
meaning that in every second (1000ms) 52 frames are displayed on the screen, with a handful of them held for two refresh intervals.
But the frames within that second no longer have equal time intervals between them, resulting in the infamous stuttering. In other words, 52 fps (with V-Sync) will not be perceived as being as smooth as 52 fps (without V-Sync), despite the fact that in both cases 52 individual frames are displayed on the screen.


More examples:

V-Sync ON and 45 fps
33.3ms, 16.7ms, 16.7ms, 33.3ms, 16.7ms, 16.7ms, 33.3ms, and so on (averaging 22.2ms per frame)

V-Sync OFF and 45 fps
22.2ms, 22.2ms, 22.2ms, 22.2ms, 22.2ms, 22.2ms, 22.2ms, and so on
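If it helps, here's a rough little Python sketch of where those mixed intervals come from. It's a simplified model I made up for illustration (it assumes the GPU keeps rendering while a finished frame waits for the next refresh, i.e. the triple-buffered case explained further down the thread), with made-up frame times, not anything taken from a real driver:

```python
# Toy model: every finished frame has to wait for the next refresh tick of a
# 60 Hz display before it appears on screen. Render times are made-up numbers.
import math

REFRESH = 1000 / 60  # one refresh interval, ~16.7 ms

def displayed_intervals(render_times_ms):
    finish = 0.0
    shown_at = []
    for t in render_times_ms:
        finish += t                                              # GPU renders back to back
        shown_at.append(math.ceil(finish / REFRESH) * REFRESH)   # frame appears at the next refresh
    return [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]

# ~52 fps worth of rendering (19.2 ms per frame): most frames stay on screen for
# one refresh (16.7 ms), but every so often one is held for two (33.3 ms).
print(displayed_intervals([19.2] * 12))
```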
 
^ michaelmk86 is correct.

There is another phenomenon which used to be common, but since the adoption of triple buffering it has become very rare.

Without triple buffering, if your system cannot consistently generate frames within 16.7ms, your FPS will drop to roughly half your refresh rate. Normally, if your frames take 20ms to generate, you'd get 50 FPS and they'd be displayed as soon as they were created, but with V-Sync and a two-buffer system the GPU cannot start rendering a new frame until the previous one has been displayed. That means every frame ends up taking 33.3ms before it is displayed, resulting in 30 FPS.

The near-30-FPS lock when you can't hold 60 FPS is rare today, EXTREMELY rare, but 10 years ago it happened all the time.

Now we get what michaelmk86 described. Triple buffering allows the GPU to start rendering a new frame while waiting to display the most recently completed one. So if the display forces the GPU to wait before showing a frame, it doesn't bring everything to a standstill.
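If anyone wants to see the arithmetic, here's a quick Python sketch of that double-buffered case. It's my own simplified model (one back buffer, and the GPU can't start the next frame until the queued one has been flipped at a refresh), not code from any actual API:

```python
# Simplified double-buffered V-Sync on a 60 Hz display: render a frame, wait
# for the next refresh to flip it, and only then start the next frame.
import math

REFRESH = 1000 / 60  # ~16.7 ms

def double_buffered_fps(render_ms, frames=60):
    t = 0.0
    flips = []
    for _ in range(frames):
        t += render_ms                           # render the frame
        t = math.ceil(t / REFRESH) * REFRESH     # wait for the next refresh to flip it
        flips.append(t)                          # the GPU sat idle until this point
    spans = [b - a for a, b in zip(flips, flips[1:])]
    return 1000 / (sum(spans) / len(spans))

print(round(double_buffered_fps(20)))  # 20 ms frames -> locked to ~30 fps
print(round(double_buffered_fps(15)))  # 15 ms frames -> a steady 60 fps
```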
 
Solution

Deus Gladiorum

Distinguished
FINALLY, AN ANSWER TO THIS QUESTION! Thank you greatly; I've always wondered why this is the case. I'd like to thank you too, michaelmk86: both of your answers were excellent in helping me understand this, it's just that bystander provided that little extra bit which made it more relevant to the rest of my question. Still, thank you both.

However, could you explain exactly how triple buffering lets V-Sync work at these in-between frame rates? The logic of that eludes me.
 


As you know, with V-Sync a frame can only be swapped onto the screen between refreshes. This small window between refreshes is called the vertical blanking interval. With a two-buffer system, the front buffer is what the display is showing, and the back buffer is the one the GPU writes to. With only a front and back buffer, if the finished frame in the back buffer is waiting for the vertical blanking interval (so it won't disrupt the refresh cycle), the GPU has nowhere left to render, so it sits idle and no work towards a new frame is started.

If your frame times are longer than 16.7ms, every frame is forced to wait 33.3ms to be displayed. Because the GPU can't start on the next frame, and most frames take a similar amount of time to create, this turns into a repeating cycle of waiting two refreshes for every frame, which results in 30 FPS.

Triple buffering fixes this by having two back buffers in addition to the front buffer. Now, if one finished frame is waiting for the vertical blanking interval, the GPU can start creating another frame in the second back buffer. This prevents the GPU from sitting idle while waiting for a finished frame to be displayed: the GPU can already be partway through a new frame when the current one goes up, giving it a chance to be finished before the next vertical blanking interval.

Most modern games use triple buffering with v-sync. This results in what michaelmk86 described.
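Here's a toy Python simulation of that difference (my own made-up model and numbers: a 60 Hz display, 20 ms frame times, and the simplifying assumption that frames take longer than one refresh so the extra back buffer never overflows). The only thing that changes between the two runs is whether the GPU has to stall until the previous frame has been flipped:

```python
# Same 60 Hz display, same 20 ms render time; double buffering stalls the GPU
# until the previous frame is flipped, triple buffering lets it keep rendering.
import math

REFRESH = 1000 / 60  # ~16.7 ms

def avg_fps(render_ms, triple_buffered, frames=120):
    render_done = 0.0   # when the GPU finishes each frame
    last_flip = 0.0     # when the previous frame went on screen
    flips = []
    for _ in range(frames):
        if not triple_buffered:
            # double buffering: rendering can't start until the last flip happened
            render_done = max(render_done, last_flip)
        render_done += render_ms
        last_flip = math.ceil(render_done / REFRESH) * REFRESH
        flips.append(last_flip)
    spans = [b - a for a, b in zip(flips, flips[1:])]
    return 1000 / (sum(spans) / len(spans))

print(round(avg_fps(20, triple_buffered=False)))  # -> ~30 fps (locked)
print(round(avg_fps(20, triple_buffered=True)))   # -> ~50 fps (mixed 16.7/33.3 ms intervals)
```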
 

adimeister

Honorable
Oh shoot, too much info for 10 minutes after I just woke up. My brain hurts. hahaha Thanks for the info, guys! :D Learned something new again today. So V-Sync uses a triple buffering system: 1 front buffer and then 2 buffers at the back, and that's for modern games, right? But older ones just use 1 back buffer, resulting in a locked 30 fps. I hope I got all of your points right. :D
 


Pretty much. I'm not certain if all DirectX games have triple buffering with V-Sync by default, or if it is something that devs have to enable, or maybe it started with a particular DirectX version. I do know that back when most games used double buffering, OpenGL was more popular than DirectX. That may be why there is a force-triple-buffering option for OpenGL built into most drivers these days.
 

Deus Gladiorum

Distinguished
That's awesome! Thanks a lot for completely and thoroughly explaining this; I totally get it now. Seriously, bystander, kudos to you, because I had never been able to find information on how and why V-Sync operates the way it does. I feel like my computer knowledge has become a little more well-rounded. As for the DirectX thing, I think it's something the devs have to enable, since Assassin's Creed IV was built on DirectX and it's the first game I've noticed that limits the frame rate to either 30 or 60 fps with V-Sync enabled.

I also see that my Nvidia Control Panel has an option to force triple buffering in games (I believe AMD's CCC might have a similar option), but I'm almost positive this doesn't actually work for DirectX and only applies to OpenGL; at least, that's what I've read previously. Back when I wasn't sure what the connection between triple buffering and V-Sync was, I just assumed triple buffering somehow increased performance in games. Because of that misconception, I looked up ways to enable triple buffering in games I assumed had none, similar to how there are ways to "inject" FXAA or MSAA into certain games. While the information was of little use then, it's relevant now:

Using RivaTuner you can force triple buffering in DirectX games. The details escape me, but a quick Google search should provide the answer to anyone curious. Personally, I'm quite content just living with the screen tearing without V-Sync in Assassin's Creed IV, and I feel too lazy to try to enable triple buffering for it. But for anyone who can't bear the screen tearing and can't bear the drastic jumps between 60 and 30 fps, you might want to give RivaTuner and that Google search a go. I think the game actually benefits from the lack of input lag, so I don't care to enable it, but for anyone else, or for any other game you find that would benefit from this, give it a go.

Thanks again, bystander!
 
Glad to have helped.

Oh, there is one last bit of info I should probably mention about DirectX games. If you can manage to maintain 60 FPS on a 60 Hz monitor, double buffering is preferred due to lower latency. Triple buffering is great when you cannot reach your refresh rate, but double buffering is best when you can. The reason is a DirectX rule that all rendered frames must be displayed. With two back buffers and rendering times fast enough to beat your refresh rate, the two back buffers begin to fill up with 2 complete frames. When there are two complete frames, DirectX forces the older of the 2 to be displayed first, making the most recently created frame wait. This process repeats over and over, assuming you can maintain 60 FPS or more, resulting in an extra 16.7ms of latency.
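To make that latency point concrete, here's one last toy Python model (mine, with made-up frame times; the strict in-order, one-flip-per-refresh presentation rule is an assumption for illustration, not quoted from the DirectX docs). It measures the time from when a frame starts rendering to when it is flipped to the screen:

```python
# 60 Hz display, 14 ms render times (fast enough to hold 60 fps). Every frame
# must be shown, in order, at most one per refresh; with two back buffers the
# GPU races ahead and each new frame ends up queued behind the previous one.
import math

REFRESH = 1000 / 60  # ~16.7 ms

def avg_latency_ms(render_ms, back_buffers, frames=300):
    starts, finishes, flip_idx = [], [], []
    for i in range(frames):
        prev_finish = finishes[i - 1] if i else 0.0
        # a back buffer only frees up once the frame occupying it has been flipped
        buffer_free = flip_idx[i - back_buffers] * REFRESH if i >= back_buffers else 0.0
        start = max(prev_finish, buffer_free)
        finish = start + render_ms
        # in-order presentation, at most one flip per refresh
        idx = max(math.ceil(finish / REFRESH), flip_idx[-1] + 1 if flip_idx else 1)
        starts.append(start); finishes.append(finish); flip_idx.append(idx)
    lat = [idx * REFRESH - s for s, idx in zip(starts, flip_idx)]
    return sum(lat[frames // 2:]) / (frames - frames // 2)   # steady-state average

print(round(avg_latency_ms(14, back_buffers=1), 1))  # double buffering -> ~16.7 ms
print(round(avg_latency_ms(14, back_buffers=2), 1))  # triple buffering -> ~33.3 ms
```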