Page 1: Going Beyond Performance Testing
Page 2: High-Level Test Results: Manhattan
Page 3: High-Level Test Results: T-Rex
Page 4: Low-Level Test Results: ALU
Page 5: Low-Level Test Results: Alpha Blending
Page 6: Low-Level Test Results: Driver Overhead
Page 7: Low-Level Test Results: Fill
Page 8: Special Test Results: Render Quality
Page 9: Special Test Results: Battery Life And Performance
Page 10: A Much Needed Benchmark, Just In Time
Special Test Results: Render Quality
It uses the peak signal-to-noise ratio (computed from the mean-square error), and the metric is mB (millibels).
The formula we use is described here: http://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio
At long last, we have a test that will (hopefully) facilitate a direct comparison between graphics performance and output quality. This benchmark measures visual fidelity in a high-end gaming scenario. It compares a single rendered frame against a reference image. Differences are calculated using a Peak Signal to Noise Ratio (PSNR) based on the Mean Square Error (MSE), and reported in millibels.
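The math behind that score is straightforward. Here's a minimal sketch of the standard PSNR formula reported in millibels; the function name and the 8-bit peak value of 255 are illustrative assumptions, not GFXBench's actual code:

```python
import math

def psnr_millibels(mse, max_value=255.0):
    """Peak signal-to-noise ratio in millibels (1 dB = 100 mB).

    mse: mean-square error between the rendered frame and the reference.
    max_value: peak pixel value (255 assumed for 8-bit channels).
    """
    if mse == 0:
        return float("inf")  # frames are identical
    psnr_db = 10.0 * math.log10((max_value ** 2) / mse)
    return psnr_db * 100.0  # convert decibels to millibels
```

Higher scores mean the rendered frame sits closer to the reference; for example, an MSE of 650.25 against an 8-bit reference works out to 20 dB, or 2000 mB.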
As I mentioned on the first page, the idea here is that GPUs can optimize for performance or for quality, and device vendors further tune voltage and clocks to trade frame rates against battery life. You can't improve all three simultaneously, though. So, by quantifying fidelity and battery life right next to frame rates, we get a better sense of how those variables are being balanced.
GFXBench's Standard Precision test shows the same image presented by the device in T-Rex, where high precision is not artificially forced and performance matters most. This is the most common use case for mobile GPUs.
When high precision is needed for a specific task or computation, mobile GPUs typically just can't match desktop-class hardware, which isn't power-constrained. And when they can, it comes with a notable performance hit.
For all of the grief our performance benchmarks give Samsung's Galaxy Note 10.1” 2014 Edition, this test exonerates the tablet's Mali-T628MP6 GPU somewhat by demonstrating a bias favoring quality.
EVGA's Tegra Note 7 doesn't finish far behind in second place, noticeably ahead of the rest of the field. The Meizu MX3 takes third place.
Everything else clumps up at the bottom of the chart, exhibiting quality trade-offs in the name of performance. Apple's iPhone 5s (A7), Oppo's N1 (Snapdragon 600), Nexus 5 (Snapdragon 800), and Nexus 7 (S4 Pro) all appear equally guilty of dialing back image quality in favor of higher frame rates.
Let's get back to the first set of results, though. It's particularly interesting that an Android-based device from Samsung tops this list. Now, the company has been caught cheating on benchmarks in the past, artificially boosting performance in recognized tests beyond what it delivers in more real-world workloads. However, its strong finish here isn't a stroke of benevolence; rather, you should see the same quality level from any Mali-T628-based SoC.
Forced to use higher precision, Samsung's Galaxy Note 10.1” 2014 Edition improves even further. Thanks to its Mali-T628MP6 graphics engine, the Note 10.1" seems tuned for fidelity, which has to hold back its potential somewhat in the performance-oriented tests.
EVGA's Tegra Note 7 barely budges, suggesting its shaders were already running at higher precision than the rest of the field. But that also means all of the other devices surpass the Tegra 4-based tablet, leaving it in last place.
Hardware that appeared problematic in the previous test delivers higher precision in this second benchmark, closing the gap in image quality relative to GFXBench's reference.