Micro-Stuttering: Is It Real?
You've probably heard the term micro-stuttering used to describe an artifact experienced by owners of certain multi-GPU configurations. In short, it's caused by the rendering of frames at short but irregular time intervals, resulting in sustained high average FPS, but gameplay that still doesn't feel smooth.
The most common cause of stuttering is turning on v-sync when your hardware can't maintain a stable 60 FPS. The same applies to games that forcibly enable v-sync, the best example of which is Skyrim. This phenomenon has nothing to do with SLI, but it will manifest in SLI-based systems as well. In those cases, the output jumps between 30 and 60 FPS in order to maintain synchronization with the screen refresh, meaning some frames are displayed once, while others appear twice. The result is perceived stuttering. The workarounds are Nvidia's Adaptive V-Sync setting in the graphics control panel, which allows some tearing, and V-Sync (smooth), which prevents tearing but limits the frame rate to 30 FPS. Of course, if you own a newer G-Sync-capable display, enabling that feature will circumvent the problem altogether.
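To make the mechanism concrete, here's a minimal sketch in Python of how double-buffered v-sync on a 60 Hz display quantizes frame delivery. The render times are made up for illustration, and the model deliberately ignores driver-side buffering:

```python
import math

# A simplified model of double-buffered v-sync at 60 Hz: a finished frame
# can only appear at the next refresh boundary, so a frame that misses the
# ~16.7 ms window is held for two refresh intervals.
REFRESH_MS = 1000 / 60  # one refresh interval at 60 Hz, ~16.7 ms

render_times_ms = [14.0, 15.5, 18.2, 14.8, 19.1, 15.0]  # hypothetical values

for t in render_times_ms:
    # Number of refresh intervals the frame occupies on screen.
    intervals = max(1, math.ceil(t / REFRESH_MS))
    effective_fps = 1000 / (intervals * REFRESH_MS)
    print(f"render {t:4.1f} ms -> held {intervals} refresh(es), "
          f"instantaneous {effective_fps:.0f} FPS")
```

Any frame that takes even slightly longer than one refresh interval gets held for two, which is exactly the jumping between 60 and 30 FPS described above.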
Micro-stuttering is a different phenomenon altogether. It is evident even when v-sync is disabled. What causes the issue is variance in so-called frame times. That is, individual frames take different amounts of time to render (and display), which shows up as FPS values that are high (say, above 30-40) paired with gameplay that is not perceived as smooth. The data defining micro-stuttering is thus the variance in frame times for a given test run: the higher the variance, the less smooth the experience. While frame times depend on frame rates overall (100 FPS = 10 milliseconds average frame time), frame time variance expressed in relative terms does not.
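As a quick illustration of the metric, the following sketch compares two hypothetical runs with identical average FPS. The coefficient of variation is used here as one plausible way to express frame time variance in relative terms; our charts may compute it differently:

```python
from statistics import mean, stdev

# Both runs below (hypothetical frame times, in milliseconds) average
# exactly 10 ms per frame -- 100 FPS -- yet one would feel far less smooth.
smooth_run  = [10.0, 10.1, 9.9, 10.0, 10.1, 9.9]   # steady cadence
stutter_run = [5.0, 15.0, 5.1, 14.9, 5.0, 15.0]    # alternating fast/slow

for name, frame_times in (("smooth", smooth_run), ("stutter", stutter_run)):
    avg_ms = mean(frame_times)
    fps = 1000 / avg_ms
    # Coefficient of variation: standard deviation relative to the mean.
    rel_variation = stdev(frame_times) / avg_ms
    print(f"{name}: {fps:.0f} FPS average, "
          f"relative frame time variation {rel_variation:.0%}")
```

Both runs average 100 FPS, but the second alternates between 5 ms and 15 ms frames. It's that swing, not the average, that a player feels as micro-stutter.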
Sometimes micro-stuttering is caused by a game engine's optimization issues, irrespective of multi-GPU configurations. Don Woligroski tested Middle-earth: Shadow of Mordor, for instance, and observed the game's issues upon release; the problems with AMD cards persisted until it was patched. The charts below show pre- and post-patch frame time variance for single cards. Clearly, there was an issue that needed to be fixed.
Any multi-GPU-equipped system faces a challenge in trying to minimize frame time variance while maximizing average frames per second and keeping input lag in check. In the past, older combinations of hardware and software were badly hampered by micro-stuttering, and it wasn't until Nvidia and AMD made an effort to meter the rate at which frames appeared that the situation started to improve.
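The metering concept itself is simple enough to sketch. The pacing rule below is an assumption for illustration only, not Nvidia's or AMD's actual driver logic:

```python
# A simplified sketch of frame metering: instead of presenting each frame
# the instant a GPU finishes it, the driver holds frames briefly so that
# presentation intervals stay even.
def meter_frames(completion_ms, target_interval_ms):
    """Never present a frame sooner than one target interval after the
    previous one, even if the GPU finished it early."""
    presented, last = [], None
    for t in completion_ms:
        show = t if last is None else max(t, last + target_interval_ms)
        presented.append(show)
        last = show
    return presented

# Hypothetical alternate-frame-rendering timeline: the two GPUs finish
# frames in quick bursts (3 ms apart) every 20 ms -- classic micro-stutter.
raw = [0.0, 3.0, 20.0, 23.0, 40.0, 43.0]
print(meter_frames(raw, target_interval_ms=10.0))
# -> [0.0, 10.0, 20.0, 30.0, 40.0, 50.0], an even 10 ms cadence
```

Holding early frames adds a small amount of input lag, which is why the vendors have to balance pacing against responsiveness, the very trade-off described above.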
Middle-earth: Shadow of Mordor's benchmark at 1440p outputs extraordinarily consistent frame times without SLI. Even with the technology enabled, the game behaves very well.
No doubt, this is also attributable to how well Middle-earth and its associated SLI profile are optimized. Again, it took a major game patch before initial issues associated with gameplay smoothness were addressed.
Now we'll increase this title's resolution to 2160p.
This is the first time we see less than ideal performance in SLI. Shadow of Mordor just doesn't feel smooth at 4K, even with SLI, and despite an average frame rate that would suggest otherwise.
In our tests with Elite: Dangerous, frame time variance at 1440p is actually lower in SLI. This is possible because overall frame times are lower in SLI versus single-GPU mode. Frontier's fourth-generation COBRA engine appears to be really well-optimized for operation in SLI.
Unlike Shadow of Mordor, performance in Elite: Dangerous' highest-detail preset at 4K appears just as flawless in SLI as it does with a single GPU. Frame time variance is well below what could be identified as micro-stutter.
Thief also behaves extremely well in SLI at 1440p, at least as far as frame time variance is concerned.
By contrast, Thief struggles at 4K, even with the power of two GeForce GTX 980s behind it. Frame time variance rises to levels where micro-stutter would be noticed.
Over the past two generations of graphics architectures, Nvidia has made a concerted effort to minimize frame time variance in SLI configurations. We didn't encounter any micro-stutter in any of the games we tested at 2560x1440. It's more of an issue at 4K, however, as our two 980s in SLI posted much higher variances in Middle-earth: Shadow of Mordor and Thief.