Results: Tomb Raider
In perhaps the most dramatic finish of our nine-game suite, the Radeon cards suffer their largest drop in practical performance, sacrificing 16.5 FPS on average.
Comparatively, the GeForce GTX 660 Tis post the same result for both actual and practical frame rate.
Frame rate over time shows us just how many dropped and runt frames must be discarded for us to reach our practical frame rate.
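To make the distinction concrete, here is a minimal sketch of how a practical frame rate could be derived from FCAT-style per-frame capture data. The 21-scanline runt threshold, the field layout, and the function name are assumptions for illustration, not FCAT's exact implementation.

```python
# Hypothetical sketch: derive "practical" FPS by discarding dropped and
# runt frames from a capture. Threshold and data format are assumed.

RUNT_THRESHOLD_SCANLINES = 21  # frames occupying fewer scanlines count as runts

def practical_fps(frames, duration_s):
    """frames: list of (scanlines, dropped) tuples; duration_s: capture length."""
    delivered = [
        (scanlines, dropped)
        for scanlines, dropped in frames
        if not dropped and scanlines >= RUNT_THRESHOLD_SCANLINES
    ]
    return len(delivered) / duration_s

# Example: 600 captured frames over 10 s, of which 40 are runts and 20 dropped.
frames = [(80, False)] * 540 + [(5, False)] * 40 + [(0, True)] * 20
print(practical_fps(frames, 10.0))  # actual FPS is 60; practical FPS is 54.0
```

The gap between the two numbers (60 vs. 54 here) is exactly the kind of drop the chart above illustrates.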
Frame time variance is low on the GeForce cards, whereas the Radeon boards exhibit notably more variance; we noticed a bit of this stutter during gameplay as well.
FCAT isn't for end users; it's for review sites. The capture hardware is supplied by manufacturers, and Nvidia just makes the scripts, which it gave to us for testing.
The problem I have with the hardware you picked for this review is that, even though raw FPS isn't the main idea behind it, you're giving every troll on the net a tool to say AMD hardware or drivers are crap. The idea behind the review is good, though.
But as great as the review is, I feel one thing review sites have dropped the ball on is v-sync comparisons. A lot of people play with v-sync enabled, and while a 60 Hz monitor limits what you can test, you could use a 120 Hz or 144 Hz monitor and see how the cards behave with v-sync on.
And the toughest question of all: how can microstutter be quantified more accurately? Discarding the runt frames gives a more accurate representation of FPS, but it doesn't quantify any microstutter that happens as a result.
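One simple (purely illustrative) way to put a number on microstutter is to measure how much each frame time differs from the one before it: smooth delivery means near-zero frame-to-frame deltas even when the average frame time is high. The function name and metric choice below are assumptions, not an established standard.

```python
import statistics

def frame_time_variability(frame_times_ms):
    """Mean absolute frame-to-frame delta in milliseconds (hypothetical metric)."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return statistics.mean(deltas)

smooth  = [16.7] * 10      # consistent ~60 FPS pacing
stutter = [8.0, 25.4] * 5  # same average frame time, but alternating pacing

print(frame_time_variability(smooth))   # 0.0 -- no stutter
print(frame_time_variability(stutter))  # large delta exposes the microstutter
```

Both traces average the same FPS, but only the second would feel uneven in play, which is exactly why a raw FPS figure misses microstutter.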
It seems the more info we get, the more questions I have.