Results: Far Cry 3
Performance-wise, these settings favor Nvidia's cards, which yield identical numbers for our actual and practical frame rates. The Radeon boards see a 3.4 FPS gap between those same two measurements. Correlating with Fraps gives us results that predictably come close to what the Radeon cards are actually rendering.
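The gap between the actual and practical frame rates comes from discarding frames that never meaningfully reach the screen. As a rough sketch, a "practical" rate can be derived by filtering out runt frames before counting; the 21-scanline cutoff below is one commonly cited runt threshold, not necessarily the one used for these results, and the function name is our own.

```python
# Hypothetical sketch: "practical" FPS discards runt frames -- frames that
# occupy only a sliver of the captured output and contribute nothing visible.
RUNT_SCANLINES = 21  # assumed threshold; FCAT-style tools make this tunable

def practical_fps(frame_scanlines, capture_seconds):
    """frame_scanlines: scanlines each frame occupied in the captured video.
    Returns frames per second after dropping runts."""
    visible = [s for s in frame_scanlines if s >= RUNT_SCANLINES]
    return len(visible) / capture_seconds
```

The actual frame rate, by contrast, would count every entry in the list, runts and all, which is why the two numbers diverge on cards that produce runt frames.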
As before, we see spikes from what AMD's cards are actually rendering, which includes the dropped and runt frames.
The spikiness from AMD's Radeon cards can be demonstrated by looking at frame time variance. At the 95th percentile, we can see that frames aren't being delivered as consistently. I did notice the game felt a little laggier, too.
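One simple way to put a number on that inconsistency is to look at how much consecutive frame times change, then read off the 95th percentile of those changes. This is a sketch of that idea, not the exact calculation used in the review; the nearest-rank percentile pick is a simplification.

```python
def frame_time_variance_p95(frame_times_ms):
    """95th-percentile change between consecutive frame times, in ms.
    Larger values mean spikier, less consistent frame delivery."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    deltas.sort()
    idx = int(0.95 * (len(deltas) - 1))  # simple nearest-rank style pick
    return deltas[idx]
```

A card delivering a steady 16.7 ms per frame scores near zero here, while one alternating between short and long frames scores high even if its average FPS looks identical.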
FCAT isn't for end users; it's for review sites. The capture hardware is supplied by hardware manufacturers; Nvidia just makes the analysis scripts, which it gave to us for testing.
The problem I have with the hardware you picked for this review is that, even though raw FPS isn't the main idea behind the review, you're handing every troll on the net a tool to say AMD's hardware or drivers are crap. The idea behind the review is good, though.
But as great as the review is, I feel one thing review sites have dropped the ball on is the lack of v-sync comparisons. A lot of people play with v-sync, and while a 60 Hz monitor is going to limit what you can test, you could get a 120 Hz or 144 Hz monitor and see how the cards behave with v-sync on.
And the toughest question of all is how microstutter can be more accurately quantified. Not counting the runt frames gives a more accurate representation of FPS, but it doesn't quantify the microstutter that may be happening as a result.
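One naive way to attack that question is to measure how often consecutive frames take wildly different amounts of time, since microstutter typically shows up as alternating short and long frames. This is purely an illustrative sketch; the function name, the ratio-based test, and the 1.5x threshold are all assumptions, not an established metric.

```python
def microstutter_index(frame_times_ms, threshold=1.5):
    """Fraction of consecutive frame pairs whose times differ by more than
    `threshold`x -- one crude way to put a number on microstutter.
    Returns 0.0 for perfectly even pacing, 1.0 for fully alternating pacing."""
    pairs = list(zip(frame_times_ms, frame_times_ms[1:]))
    uneven = sum(1 for a, b in pairs if max(a, b) / min(a, b) > threshold)
    return uneven / len(pairs)
```

Two cards could report the same average FPS yet score very differently here, which is exactly the information a plain frame-rate number hides.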
It seems the more info we get, the more questions I have.