Lately I have been absolutely intrigued by the new methods of measuring GPU performance.
Question about runt frames. What constitutes a "runt frame," and who actually came up with the definition of what a runt frame is? Is any frame with 21 scan lines or fewer considered a runt frame? If so, is any frame with 22 or more lines considered a full frame? What exactly is a full frame?
How many scan lines do you need in a frame to make observable frame variance nil? Are there any benchmarks out there that set the full-frame cutoff much higher than just 21 scan lines? According to most benchmarks, Nvidia doesn't have runt frames, only AMD does, but how low do the scan lines in Nvidia frames actually go?
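Just to make the cutoff question concrete, here's a rough Python sketch of what I understand the classification to mean. The 21-scan-line threshold comes from the discussion above, but the helper function and the sample frame data are my own guesses for illustration, not how FCAT actually does it:

```python
# Hypothetical sketch of a runt-frame cutoff, NOT the actual FCAT logic.
# Assumption: any frame occupying <= RUNT_THRESHOLD scan lines of the
# captured output is a "runt"; anything above it counts as a full frame.

RUNT_THRESHOLD = 21  # scan lines; the cutoff debated above

def classify_frames(scanline_counts, threshold=RUNT_THRESHOLD):
    """Split per-frame scan-line counts into (full frames, runts)."""
    full = [n for n in scanline_counts if n > threshold]
    runts = [n for n in scanline_counts if n <= threshold]
    return full, runts

# Illustrative capture: at 1080p each refresh has 1080 scan lines,
# divided among however many frames land in that refresh interval.
frames = [540, 530, 10, 700, 370, 5, 1080]
full, runts = classify_frames(frames)
print(f"full frames: {len(full)}, runts: {len(runts)}")
```

The interesting knob is `threshold`: raising it from 21 to, say, 100 would answer my question about whether Nvidia's short frames would start getting flagged too.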
What's the difference between Tom's method and PCPer's method of testing frame variance? If they're the same, then why such a huge difference in results? Tom's review was released on the 5th, and PCPer's on the 30th. If the methods are the same, perhaps drivers/updates (from Crysis 3 and/or Nvidia) were released between the 5th and the 30th that drastically changed the performance of the Titan, 680, & 690? Could a difference in hardware result in benchmarks that are that different?
I think the question of the number of lines needs some more empirical testing, but the PCPer article might have touched on it. I haven't re-read it yet, since I don't care to risk my head exploding. Later in the work week, I may try to go back through it, looking for that.