Performance Versus Smoothness
Quantifying Experience In The Game World
Moving beyond the idea of average performance, or even performance over time, we break apart the smallest unit of measure, the second, to evaluate frame pacing, consistency, and stutter, all of which affect your perception of smoothness. Obviously, we want all of the frames that go into a frames-per-second average to render at even intervals, yielding the most pleasant pacing possible for a given performance level.
But that's not always what we observe. Sometimes high frame rates are accompanied by problematic stuttering. And that's why we have to evaluate performance separately from smoothness, even though they're inherently related.
A Simple Performance Analysis
First, let's compare two cards under DirectX 11 and 12. In the following charts, frame times are on the X axis and percentage points are on the Y axis. Ideally, you want to see the numbers to the left (shorter frame times) represented by the highest percentage possible. Anything above 33 ms drops below the 30 FPS mark. That's just ugly, so the far-right line should be as short as possible.
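A chart like this can be reproduced from raw frame-time logs by bucketing each frame's render time and reporting the share of frames per bucket. The sketch below is a minimal illustration of that idea; the frame times and bin edges are hypothetical, not our actual benchmark data.

```python
import numpy as np

# Hypothetical frame times in milliseconds from a benchmark run
frame_times_ms = np.array([14.2, 15.1, 13.8, 16.4, 35.2, 14.9, 15.5, 33.9, 14.1, 15.0])

# Bucket frame times at common FPS thresholds; each bar is the share of frames
# falling into that range. Anything above 33.3 ms means dipping below 30 FPS.
bins = [0, 16.7, 25.0, 33.3, np.inf]
labels = ["<=16.7 ms (60+ FPS)", "16.7-25 ms", "25-33.3 ms", ">33.3 ms (<30 FPS)"]
counts, _ = np.histogram(frame_times_ms, bins=bins)
percentages = 100.0 * counts / counts.sum()

for label, pct in zip(labels, percentages):
    print(f"{label}: {pct:.1f}%")
```

The more of the distribution that piles up in the leftmost buckets, the better the card looks on this kind of chart.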
Interestingly, we see DirectX 12 shift the Radeon's line to the left on the enthusiast and mainstream PCs, while Nvidia isn't affected as profoundly.
A Simple Smoothness Analysis
To achieve suspension of disbelief and really draw you into the gaming experience, individual frames must be delivered as smoothly as possible with minimal differences between render times. Swings as small as 10 to 20 milliseconds are perceived by our brains as micro-stutter, and these interruptions negatively affect our gaming experience. Of course, some of us are more sensitive to this phenomenon than others.
Again, we're looking at percentage on the Y axis. But instead of raw frame times on the X axis, we're looking at frame-to-frame differences up to 10 ms. Above that threshold, we'll have to zoom out for more detail. For now, though, those spikes at the end of the chart are large enough to tell us we aren't getting as smooth an experience as we'd want on either system, using DirectX 11 or 12. Ideally, we'd want to see the whole line shifted as far left as possible. The RX 480 in our high-end config comes closest to achieving this under DirectX 11. Its behavior under DX 12 isn't as compelling.
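As a simple stand-in for this kind of chart (our actual metric is more involved, as described later), consecutive absolute frame-time differences already show how pacing can be rough even when raw frame times look fine. The data below is hypothetical.

```python
import numpy as np

# Hypothetical frame times (ms); what matters here is pacing, not raw speed
frame_times_ms = np.array([16.0, 16.2, 15.9, 34.0, 16.1, 16.0])

# Absolute difference between each frame's render time and the previous one
deltas = np.abs(np.diff(frame_times_ms))

# Share of frame-to-frame transitions above a 10 ms micro-stutter threshold
stutter_share = 100.0 * np.count_nonzero(deltas > 10.0) / deltas.size
print(f"{stutter_share:.1f}% of transitions exceed 10 ms")
```

Note how a single 34 ms frame in an otherwise steady 16 ms run produces two large deltas, one going into the slow frame and one coming out of it.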
The preceding charts alluded to what gets spelled out more explicitly in these graphics: despite overall higher performance (especially from the mainstream FX-based machine), our DirectX 12 runs suffer from greater frame time differences than DX 11, which appeared smoother, at least according to our benchmark results. It's time to go deeper...
Frame Rate Versus Frame Time Difference
As they say, many roads lead to Rome. First, we interpolate the frame rate curve to match the frame time output, allowing us to compare them directly. But the differences in render times are not a simple subtraction problem between frames. Rather, we perform a more complex calculation that reveals values most likely to affect your level of immersion.
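The interpolation step can be sketched as follows: convert each frame's render time into an instantaneous FPS value, then resample that curve onto a common timeline so it can be overlaid on the frame-time plot. This is a minimal illustration with made-up numbers, not the exact calculation we use.

```python
import numpy as np

# Hypothetical frame times (ms); timestamps are the cumulative render times
frame_times_ms = np.array([20.0, 20.0, 20.0, 50.0, 20.0, 20.0])
timestamps_s = np.cumsum(frame_times_ms) / 1000.0

# Instantaneous FPS per frame, then linearly interpolated onto a uniform
# timeline so the frame rate and frame time curves can be compared directly
fps_per_frame = 1000.0 / frame_times_ms
common_t = np.linspace(timestamps_s[0], timestamps_s[-1], 50)
fps_interp = np.interp(common_t, timestamps_s, fps_per_frame)
```

With both curves sampled on the same axis, a dip in the FPS line can be matched against the frame-time spike that caused it.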
Let us first consider the faster Core i7-based machine:
Talk about an illustrative example. We see very clearly that higher frame rates don't necessarily correspond to a more immersive experience. Particularly on the Radeon, spikes above 100 ms aren't just micro-stutter; those are significant hitches in the action, and you're definitely going to notice them. Nvidia isn't immune either. Its GTX 1060 also suffers more frequent pauses when shifting from DX 11 to 12.
Frame rates on the slower system improve more dramatically under DX 12, particularly with AMD's Radeon RX 480 installed. There's an obvious trade-off in the form of disturbing frame time variance, though.
DirectX 12 only really helps the GeForce GTX 1060 on our lower-end platform. The Nvidia card's peaks and valleys aren't as pronounced, and its frame time variance over time isn't as prone to wild swings.
The Incorruptible "Stuttering Index"
First, we split the run into one-second intervals and calculate the frame rate of each one. This is the basis for our index listings. Each interval gets a value based on its FPS result: anything under 30 FPS is unplayable, while the range between 30 and 60 FPS spans playable, good, and very good. The faster frames are rendered, the lower the index value.
This is still too rough, though. A brief burst of slow frames may show up within a smooth 70 FPS interval. You wouldn't know it from the frame rate, but you'll definitely see it during real-world gaming. Therefore, we explore the render times of individual frames and the differences between respective frames, too.
To prevent misinterpretation, we use an intelligent filter that catches transitions between the cut scenes you often see in built-in benchmarks. If one sequence is not complex and runs faster, while the next is more challenging and slows performance, this artifact is filtered out (frame preview, block-wise comparison); it's not real stuttering, just a scene change.
In this way, a fairly accurate forecast can be made of whether stuttering or dropped frames are visually perceptible to the gamer. If the score for a single frame is higher (meaning worse) than the base FPS value, the whole interval is marked with the higher/worse index value.
With all of those calculations complete, we end up with a subjective, integer-based "Stuttering/Unevenness Index," free from fractional values. This rating ranges from a score of zero (perfect, no interfering influences) through five (the limit of acceptance) up to 10 (real stuttering and dropped frames). The most sensitive enthusiasts will perceive micro-stuttering at a score of three or four.
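The steps above can be sketched in a simplified form. The score thresholds below are hypothetical placeholders (our actual mapping and single-frame scoring are more involved, and the scene-change filter is omitted), but the structure matches the description: score each one-second interval by its FPS, then upgrade the interval to the worse score if any single frame inside it rates worse.

```python
import numpy as np

def fps_score(fps):
    """Map an FPS value to an index score. Thresholds here are hypothetical;
    lower is better, and anything under 30 FPS counts as real stuttering."""
    if fps >= 90:
        return 0
    if fps >= 60:
        return 1
    if fps >= 45:
        return 3
    if fps >= 30:
        return 5
    return 10

def stuttering_index(frame_times_ms):
    """Split a run into complete one-second intervals. Each interval's score
    is the worse of its overall FPS score and its slowest frame's score."""
    t = np.cumsum(frame_times_ms) / 1000.0        # timestamp of each frame (s)
    scores = []
    for sec in range(int(t[-1])):                 # complete seconds only
        frames = frame_times_ms[(t >= sec) & (t < sec + 1)]
        if frames.size == 0:
            continue
        base = fps_score(frames.size)             # frames finished in 1 s ~= FPS
        worst = fps_score(1000.0 / frames.max())  # slowest single frame
        scores.append(max(base, worst))
    return scores
```

A steady 70 FPS interval scores well on its own, but a single 40 ms frame hiding inside it drags the whole interval down to the worst rating, exactly the case the plain frame-rate number would miss.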
In the chart above, a faster CPU helps maintain fairly smooth playback, though the Radeon RX 480 struggles under DirectX 12 with some visible stuttering.
This becomes even clearer with a lower-end CPU. Although the GeForce isn't as fast, it does facilitate a smoother-looking picture.
MORE: Best Graphics Cards
MORE: All Graphics Content