Lately I have been intrigued by the new methods of measuring GPU performance.
Question about runt frames. What constitutes a "runt frame," and who actually came up with the definition? Is any frame with 21 scan lines or fewer considered a runt frame? If so, is any frame with 22 or more lines considered a full frame? What exactly is a full frame?
How many scan lines does a frame need for observable frame variance to be negligible? Are there any benchmarks out there that set the bar for a full frame much higher than just 21 scan lines? According to most benchmarks, Nvidia doesn't have runt frames, only AMD does, but how low do the scan-line counts in Nvidia frames actually go?
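For what it's worth, the classification itself is simple once you accept a threshold. Here is a minimal sketch, assuming the 21-scan-line cutoff mentioned above (the threshold value and function name are illustrative, not taken from any review site's actual tooling):

```python
# Illustrative sketch: split captured frames into runts and full frames
# based on how many scan lines each frame occupied on screen.
# ASSUMPTION: the 21-scan-line threshold from the discussion above;
# real capture-analysis tools may use a different cutoff.

RUNT_THRESHOLD = 21  # scan lines; frames at or below this count as runts

def classify_frames(scan_line_counts):
    """Return (runts, full_frames) from per-frame scan-line counts."""
    runts = [n for n in scan_line_counts if n <= RUNT_THRESHOLD]
    full = [n for n in scan_line_counts if n > RUNT_THRESHOLD]
    return runts, full

# Example capture: one frame only made it onto 5 scan lines before
# being overwritten, so it counts as a runt.
runts, full = classify_frames([540, 5, 535, 1080, 270])
print(len(runts), len(full))  # → 1 4
```

Changing `RUNT_THRESHOLD` is exactly the question being asked: a tool that calls 21 lines the cutoff and one that calls 100 lines the cutoff would report very different runt counts from the same capture.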
I'm also confused about the differences in testing methods. For example, here on Tom's Hardware the GTX Titan, GTX 680, and GTX 690 are shown to have some pretty terrible frame variance while running Crysis 3 ( http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-6.html ), but on PC Perspective they are shown to have very low frame variance ( http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GeForce-GTX-Titan-GeForce-GTX-690-Radeon-HD-7990-HD-7970-Cross-1 & http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Test-5 ).
What's the difference between Tom's method and PCPer's method of testing frame variance? If they're the same, then why such a huge difference in results? Tom's review was released on the 5th, and PCPer's on the 30th. If the methods are the same, perhaps drivers or updates (from Crysis 3 and/or Nvidia) released between the 5th and the 30th drastically changed the performance of the Titan, 680, and 690? Could a difference in hardware produce benchmarks that different?
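Part of why two sites can disagree is that "frame variance" itself can be computed in different ways. One common approach is to look at the change in frame time between consecutive frames, which punishes stutter even when the average FPS looks fine. A minimal sketch (the metric and names here are my own illustration, not either site's actual methodology):

```python
# Illustrative sketch: two runs with the same average frame time can have
# wildly different frame-to-frame variance.
# ASSUMPTION: "variance" measured as the absolute change in frame time
# between consecutive frames; neither Tom's Hardware nor PC Perspective
# necessarily computes it this way.

def consecutive_deltas(frame_times_ms):
    """Absolute frame-time change between each pair of consecutive frames."""
    return [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

smooth = [16.7, 16.6, 16.8, 16.7]   # steady ~60 FPS
stutter = [8.0, 25.4, 8.1, 25.3]    # same average, alternating fast/slow

print(max(consecutive_deltas(smooth)))   # small, well under 1 ms
print(max(consecutive_deltas(stutter)))  # large, over 17 ms
```

Software-side FRAPS-style timestamps (recorded when the game submits a frame) and capture-based analysis (recording what actually reaches the display) can also legitimately disagree on these numbers, which may account for some of the gap between the two reviews.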