Vsync vs Microstuttering: Fight!

October 30, 2011 5:07:08 PM

I've been in the market for a new system/video card setup for a while. One of the main things influencing my decision to go single- or multi-GPU (Nvidia/ATI) has been microstutter.

If I'm understanding everything correctly:

1. Vsync makes your graphics card wait until your monitor is ready to refresh before sending in a new frame.
2. Microstuttering is when your graphics card puts out frames at varying/unequal intervals of time (frame A is on your screen for 20 ms, frame B is on your screen for 60 ms, etc.), which gives the impression of choppy gameplay.

So, if that is correct, wouldn't turning Vsync on when your FPS is higher than your monitor's refresh rate completely remove all microstutter? Consider this incredibly scientific example:
http://i.imgur.com/hBMK4.png
It seems like, with Vsync ON and FPS higher than your monitor's refresh rate, all microstuttering should go away completely. If this is correct, then I'd like to mention one more thing that could use some verification or debunking by the community here:

* With FPS > Refresh rate, input lag is non-existent. True/False?
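
To make my reasoning concrete, here's a quick Python sketch of my mental model (a toy simulation I made up, not anything from a real driver). However uneven the raw render times are, every flip waits for a refresh boundary, so the on-screen frame times come out perfectly uniform:

Code:
import math

REFRESH_MS = 1000 / 60                  # 60 Hz monitor: ~16.67 ms per refresh
render_ms = [12, 14, 11, 13, 12, 15]    # uneven GPU frame times, all < 16.67

# With vsync and a single back buffer, the GPU starts the next frame
# right after the previous flip, and each finished frame waits for the
# next refresh boundary before it is shown.
flips = []
t_flip = 0.0
for rt in render_ms:
    finish = t_flip + rt
    t_flip = math.ceil(finish / REFRESH_MS) * REFRESH_MS
    flips.append(t_flip)

intervals = [round(b - a, 2) for a, b in zip(flips, flips[1:])]
print(intervals)                        # -> [16.67, 16.67, ...] all uniform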
October 30, 2011 5:44:36 PM

nukewarm said:
So, if that is correct, wouldn't turning Vsync on when your FPS is higher than your monitor's refresh rate completely remove all microstutter? [...] With FPS > Refresh rate, input lag is non-existent. True/False?


On that last question: v-sync can cause a very small amount of input lag, because the game takes your input, then draws the frame, but then has to wait for the vertical retrace before it's sent to the screen. If the GPU is working much faster than the monitor's refresh, that can result in input lag (it'll never be more than one refresh with a single back buffer). However, v-sync is often used with triple buffering, which can delay you by slightly more than one refresh interval (1/Hz). It's smaller on 120 Hz monitors. I personally don't notice issues if the FPS is greater than the refresh rate; others claim otherwise.

Now here is the part that I wonder about: if non-v-sync setups also use a triple-buffer system, wouldn't they also get input lag? I assume they would, but that depends on the game and whether it uses triple buffering without v-sync.

EDIT: one more thing. Even without v-sync on, there is input lag, because the frame is created and then sent to the screen. In many cases it may not be any different, unless one setup uses triple buffering and the other doesn't, in which case the input lag can differ by as much as a frame.
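
To put rough numbers on those waits (my own arithmetic, not measurements):

Code:
# With a single back buffer the flip waits at most ~one refresh
# interval; triple buffering can queue one more frame, pushing the
# delay slightly past one refresh (the "slightly more than 1/Hz").
for hz in (60, 120):
    print(f"{hz} Hz: worst-case vsync wait ~{1000 / hz:.1f} ms")
# -> ~16.7 ms at 60 Hz but only ~8.3 ms at 120 Hz, which is why the
#    added lag is smaller on 120 Hz monitors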

One thing I've heard people say, and I just don't want to do this: people claim they react better with screen tearing, because you get part of the image that was meant to be displayed and part of the next one. They claim this gives their minds more information to aim with, improving their reactions. It may be true, but I don't like tearing.
October 30, 2011 6:21:18 PM

This is what micro stuttering looks like; see the frame-time plots in the article linked below.


It's a problem that CrossFire and SLI have. The two GPUs need to do load balancing, which isn't easy since they use AFR (alternate frame rendering). Then the data needs to be copied from one GPU to the other over the SLI/CrossFire interface. Frame data are typically transferred between the GPUs via the proprietary SLI and CrossFire interfaces built into high-end video cards. Dual-GPU cards like the Radeon HD 6990 and GeForce GTX 590 include an onboard version of that same interface. (Some low-end multi-GPU solutions do without a custom data channel and transfer frame data via PCIe.)
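
A toy model of why AFR causes this (the numbers and code are mine, just to illustrate the effect): two GPUs alternate frames, but if the second GPU's frame is dispatched only a few ms after the first, the presented frames come out in short/long pairs instead of evenly spaced:

Code:
gpu_time_ms = 30.0          # each GPU takes 30 ms per frame
offset_ms = 5.0             # GPU 1 starts only 5 ms after GPU 0

# Frame i goes to GPU (i % 2); each pair starts a new 30 ms "round".
present = [(i // 2) * gpu_time_ms + (i % 2) * offset_ms + gpu_time_ms
           for i in range(8)]

intervals = [round(b - a, 1) for a, b in zip(present, present[1:])]
print(intervals)            # -> [5.0, 25.0, 5.0, 25.0, ...]
# Average is 66 FPS, but it feels much worse because every other
# frame sits on screen for 25 ms. Even pacing would be 15, 15, 15, ...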


As you can see there, the frame latencies are long and uneven.




Even the 560 Ti shows signs of it in Bulletstorm.

Quote:
Naturally, we contacted the major graphics chip vendors to see what they had to say about the issue. Somewhat to our surprise, representatives from both AMD and Nvidia quickly and forthrightly acknowledged that multi-GPU micro-stuttering is a real problem, is what we measured in our frame-time analysis, and is difficult to address. Both companies said they've been studying this problem for some time, too. That's intriguing, because neither firm saw fit to inform potential customers about the issue when introducing its most recent multi-GPU product, say the Radeon HD 6990 or the GeForce GTX 590.


That is just part of a complete article on micro stuttering.
You can read the full article here:
http://techreport.com/articles.x/21516

October 30, 2011 6:43:44 PM

I read somewhere that VSync can help to prevent "microstutter" by enforcing a fixed, regular frame update rate: regardless of how quickly the next frame is or isn't ready, the framerate will always be 60 or a fixed divisor of 60.

That said, for fairly complex reasons, VSync and multi-GPU don't tend to play too nicely together, resulting in fairly obvious input latency.
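
A quick sketch of that quantization (a simple double-buffered model, my own code, not from any driver): with vsync, a frame can only be shown on a refresh boundary, so the effective framerate snaps to 60 divided by a whole number:

Code:
import math

refresh_ms = 1000 / 60
for render_ms in (10, 17, 25, 40, 55):
    # the flip waits for the next refresh boundary after the frame is done
    shown_ms = math.ceil(render_ms / refresh_ms) * refresh_ms
    print(f"{render_ms:>2} ms/frame -> {1000 / shown_ms:.0f} FPS on screen")
# 10 -> 60, 17 -> 30, 25 -> 30, 40 -> 20, 55 -> 15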
October 30, 2011 6:49:11 PM

Thank you for that great info. What are your thoughts on vsync as it relates to microstuttering?
October 30, 2011 6:52:42 PM

I've used 6950s (unlocked) in CrossFire, and 470s in SLI. I always use v-sync if possible. I have never noticed microstutter, nor any more input lag than with a single card. Usually the input lag is higher with a single card in those cases, as the CrossFire/SLI setup gets higher than 60 FPS, so v-sync doesn't induce low FPS or require triple buffering.
October 30, 2011 6:59:36 PM

nukewarm said:
Thank you for that great info. What are your thoughts on vsync as it relates to microstuttering?

Read the article I linked. V-sync can sometimes help reduce it, as micro stuttering doesn't occur when there's constant low FPS; it occurs when the FPS is high. Especially in two-GPU setups, frame latency, the GPUs struggling to sync with each other, and the overhead of copying the data between them are all contributing factors.

The problem is we have version numbers for all sorts of technologies, like PCIe 2.0, PCIe 3.0, USB 3.0. Those get upgraded and given attention all the time to increase bandwidth, reduce overhead, and so on. But with CrossFire and SLI there's no version number, nada. It looks like the GPU manufacturers decided that it's their solution, it works, so why bother upgrading it or trying to find a better method. They've been banging on with the same method for ages, and like everything else, you're going to start running into problems because nothing is perfect. They knew it, but they chose to keep quiet about it and just carried on. Only when confronted did they start talking about it, after many people had wasted their money on cards with such issues.
October 30, 2011 7:01:55 PM

gnomio said:
Read the article I linked. V-sync can sometimes help reduce it, as micro stuttering doesn't occur when there's constant low FPS; it occurs when the FPS is high. [...]

So, Mr Expert, what dual-card rig do you run?
October 30, 2011 7:07:08 PM

Mousemonkey said:
So, Mr Expert, what dual-card rig do you run?

I've got 2 x 5870s. They've been serving me faithfully since they were released. Seems like ages ago now, but I'm still struggling to get them to run on 3DMark 01.
October 30, 2011 7:09:06 PM

gnomio said:
I've got 2 x 5870s. They've been serving me faithfully since they were released. Seems like ages ago now, but I'm still struggling to get them to run on 3DMark 01.

So is that the experience that you are basing all your expert opinions on?
October 30, 2011 7:20:49 PM

@ Bystander
I really expected to experience microstutter or screen tearing with V-sync enabled in low-FPS games such as BF3, but I really didn't.
Maybe I'm getting stuttering and just don't notice it, but there's no visible screen tearing at all.
As for the triple buffering thing, I always force it in the CCC, but everywhere I read about it, forcing it there only works for OpenGL games, and the majority are Direct3D games; to force it in those you have to follow one of the methods here:
http://www.tweakguides.com/Graphics_10.html

EDIT: Some games, such as DXHR, support triple buffering natively (I noticed it in the video options), but the majority don't.
October 30, 2011 8:11:34 PM

gnomio said:
Read the article I linked. V-sync can sometimes help reduce it, as micro stuttering doesn't occur when there's constant low FPS [...]


Thanks again for all of the info. I actually read that whole article, and unless I missed something, they dedicated one small paragraph to vsync, in which they basically concluded that its precise impact on microstuttering was too complex to predict.

Why do you say microstuttering doesn't occur with constant low FPS? When I read that article, it seemed like the lower the FPS, the larger the variation in frame times (larger microstutter). Not to mention that common sense and user experience say that games on multi-GPU systems become unplayable at low FPS because of stuttering.

I'm seeing a lot of people linking to articles about vsync, and articles about stuttering, but seldom do those sources speak about the effects one system has on the other.

Here is a talking point that I'd love to kick start, if anyone has the knowledge to discuss it:



I've marked where, on your picture, a 120 Hz monitor would refresh over the course of that benchmark. Two notes: that benchmark shows 75 frames per second, and a 60 Hz monitor would only refresh once every ~16 ms (using just the top line). If I turn on Vsync, would the game, in the 60 Hz or 120 Hz case, display the same frame 2+ times in a row?

In the 120 Hz case, it looks like it would for the larger bars. In the 60 Hz case (using only the top line as a reference), it seems like it would show only one frame per refresh, but it would delay the smaller bars quite a bit (keep them on the screen longer).
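
Here's a little Python sketch of that question (the frame times are made up by me to average ~75 FPS with microstutter-style variation, since I can't paste the real benchmark data): at each refresh the monitor shows the newest completed frame, so we can count repeats and skips:

Code:
def frames_shown(frame_times_ms, refresh_hz, n_refreshes=12):
    refresh_ms = 1000 / refresh_hz
    done, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        done.append(t)                    # completion time of each frame
    shown = []
    for k in range(1, n_refreshes + 1):
        tick = k * refresh_ms
        # index of the newest frame finished by this refresh (None if none)
        newest = max((i for i, d in enumerate(done) if d <= tick), default=None)
        shown.append(newest)
    return shown

jittery = [5, 22, 6, 20, 5, 22] * 3       # ~75 FPS average, very uneven
print("120 Hz:", frames_shown(jittery, 120))
print(" 60 Hz:", frames_shown(jittery, 60))
# 120 Hz repeats frames during the long bars (0, 0, 0, 2, 2, 2, ...),
# while 60 Hz mostly shows one new frame per refresh, just delayed.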

What do you guys think?
October 30, 2011 9:44:22 PM

Mousemonkey said:
So is that the experience that you are basing all your expert opinions on?

No. I'm basing it on facts.

Quote:
AMD's David Nalasco identified micro-stuttering as an issue with the rate at which frames are dispatched to GPUs, and he said the problem is not always an easy one to reproduce. Nalasco noted that jitter can come and go as one plays a game, because the relative timings between frames can vary.

Nalasco told us there are several ideas for dealing with the jitter problem. As you probably know, vsync, or vertical refresh synchronization, prevents the GPU from flipping to a different source buffer (in order to show a new frame) while the display is being painted. Instead, frame buffer flips are delayed to happen between screen redraws. Many folks prefer to play games with vsync enabled to prevent the tearing artifacts caused by frame buffer flips during display updates. Nalasco noted that enabling vsync could "probably sometimes help" with micro-stuttering. However, we think the precise impact of vsync on jitter is tough to predict; it adds another layer of timing complexity on top of several other such layers. More intriguing is another possibility Nalasco mentioned: a "smarter" version of vsync that presumably controls frame flips with an eye toward ensuring a user perception of fluid motion. We think that approach has potential, but Nalasco was talking only of a future prospect, not a currently implemented technology. He admitted AMD can't say it has "a water-tight solution yet."

Nalasco did say AMD may be paying more attention to these issues going forward because of its focus on exotic multi-GPU configurations like the Dual Graphics feature attached to the Llano APU. Because such configs involve asymmetry between GPUs, they're potentially even more prone to jitter issues than symmetrical CrossFireX or SLI solutions.

In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly—in other words, the GPU doesn't flip to a new buffer immediately—in order to ensure a more even pace of frames presented for display. The lengths of those delays are adapted depending on the frame rate at any particular time. Petersen told us this frame-metering capability has been present in Nvidia's GPUs since at least the G80 generation, if not earlier. (He offered to find out exactly when it was added, but we haven't heard back yet.)

That's from the mouths of AMD and Nvidia on the issue.
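
For what it's worth, here's a guess at what that frame metering might look like in spirit (the code and numbers are entirely mine, based only on the description quoted above, not on anything Nvidia has published): track the average frame interval and hold back frames that arrive early:

Code:
def meter(arrivals_ms):
    """Delay 'early' frames toward a running average interval."""
    presented = [arrivals_ms[0]]
    avg = None
    for prev, cur in zip(arrivals_ms, arrivals_ms[1:]):
        interval = cur - prev
        # exponential moving average of the raw frame interval
        avg = interval if avg is None else 0.8 * avg + 0.2 * interval
        # never flip sooner than one average interval after the last flip
        presented.append(max(cur, presented[-1] + avg))
    return presented

arrivals = [30, 35, 60, 65, 90, 95, 120]      # AFR-style 5/25 ms jitter
paced = meter(arrivals)
print([round(b - a, 1) for a, b in zip(paced, paced[1:])])
# -> [5.0, 25.0, 8.2, 21.8, 10.2, 19.8]: the short/long gaps narrow
#    toward the ~15 ms average, at the cost of a slight display delay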

October 30, 2011 9:50:01 PM

So you're a quote expert? As opposed to actually knowing anything about it?
October 30, 2011 10:09:33 PM

Mousemonkey said:
So you're a quote expert? As opposed to actually knowing anything about it?

No, it's called providing proof.

The whole PC is built on timings and synchronization; v-sync is just one of them. And to think LCDs aren't really limited by a refresh rate the way a CRT is, yet they still have to act like a CRT on a PC.
October 30, 2011 10:26:47 PM

@ mousemonkey Why do you seem so mad at him? He just gave some information to the OP and to us, nothing wrong with that, right? He also provided proof, and I don't want to be a jerk here, but you seem kind of harsh on Gnomio.
November 5, 2011 9:04:49 AM

acerace said:
@ mousemonkey Why do you seem so mad at him? He just gave some information to the OP and to us, nothing wrong with that, right? [...]

Unlike you, I know who Gnomio is; the name may have changed, but the IP address remains the same.