Background Information On Our Benchmarks
Before we begin our rather rigorous benchmarking process, we make two adjustments. First, we disable dynamic brightness, because it prevents us from getting an accurate and reproducible measurement of the display's potential. Second, we set brightness to its highest value. If you don't use the same settings, your measured color gamut is going to look smaller than what our benchmarks show.
With respect to gamma, understand that it doesn't affect black or white performance, only the midtones. If gamma is set too high, the midtones appear too dark; if it's set too low, they look washed out. Adobe, Apple, and Microsoft all recommend a gamma of 2.2. It's a somewhat arbitrary value carried over from the NTSC standard, originally chosen because it makes colors appear more natural in slightly dim viewing environments.
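The claim above is easy to verify numerically: a display's transfer function is applied as output = input ^ gamma on normalized levels, so the endpoints (pure black at 0.0 and pure white at 1.0) never move, while everything in between darkens as gamma rises. A minimal sketch in Python:

```python
# Simplified display transfer function on normalized levels (0.0 to 1.0).
def apply_gamma(level, gamma=2.2):
    return level ** gamma

# Black and white are unaffected by gamma...
print(apply_gamma(0.0), apply_gamma(1.0))    # 0.0 1.0

# ...but a 50% midtone sinks as gamma increases.
for g in (1.8, 2.2, 2.5):
    print(g, round(apply_gamma(0.5, g), 3))  # 0.287, 0.218, 0.177
```

This is why a miscalibrated gamma shows up in midtone test patterns while black and white levels measure the same.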
Battery Life & Recharge Time
Testing a tablet's battery life tends to be highly variable unless you control the entire experience from beginning to end. Cumulatively, touch gestures don't have much impact on battery life. The biggest factors are CPU/GPU processing, screen brightness, volume, and Wi-Fi use. In order to measure battery life accurately, I coded a script that plays MP3s at 50% volume while loading a different Wikipedia page every 12 minutes. This benchmark is probably overkill, but it gives you an idea of a worst-case scenario.
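The script itself isn't published here, but its scheduling logic is simple enough to sketch. Everything below, including the page list and the helper names, is my own illustrative reconstruction rather than the lab's actual code:

```python
import time

# Hypothetical rotation of Wikipedia articles to load during the rundown.
WIKI_PAGES = ["Tablet_computer", "Lithium-ion_battery", "Wi-Fi"]

def page_for(elapsed_s, interval_s=720):
    """Which page to load, rotating to the next one every 12 minutes (720 s)."""
    return WIKI_PAGES[(elapsed_s // interval_s) % len(WIKI_PAGES)]

def run_rundown(load_page, play_mp3s):
    """Drive the rundown until the battery dies: MP3 playback loops at 50%
    volume (handled by the player) while a new page is fetched every 12 min."""
    play_mp3s(volume=0.5)
    start = time.time()
    while True:
        load_page(page_for(int(time.time() - start)))
        time.sleep(720)
```

The point of the fixed 12-minute cadence is reproducibility: every tablet sees the same mix of audio decode, page rendering, and Wi-Fi traffic.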
Very few sites go to the trouble of benchmarking recharge time. In my view, though, it's just as important as battery life. That said, a fast recharge isn't necessarily desirable. Ideally, you want a nice slow charge so that your battery survives more than a few hundred charge cycles. Rapid charging keeps you away from the wall socket longer, but in the long run it cuts into the health of the battery. Usually, the rate of charge starts to slow down somewhere in the 80% to 95% range, which is why charging from 0% to 10% takes less time than going from 90% to 100%.
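That taper behavior is easy to model. The sketch below assumes a simplified constant-current/constant-voltage charge profile (a steady rate up to 80%, then current falling off toward a trickle near full); the rates and thresholds are illustrative numbers of my own, not measurements:

```python
def charge_time(start_pct, end_pct, cc_rate=1.0, taper_start=80.0, dt=0.01):
    """Simulated time (in arbitrary units) to charge from start_pct to
    end_pct, where cc_rate is the constant-current rate in %/unit-time."""
    pct, t = start_pct, 0.0
    while pct < end_pct:
        if pct < taper_start:
            rate = cc_rate  # constant-current phase: full-speed charging
        else:
            # Constant-voltage phase: current tapers toward a trickle
            # as the cell approaches full.
            rate = max(cc_rate * (100.0 - pct) / (100.0 - taper_start), 0.05)
        pct += rate * dt
        t += dt
    return t

# Under this model, the last 10% takes several times longer than the first 10%.
print(charge_time(0, 10), charge_time(90, 100))
```

Even in this toy model, the 90% to 100% leg dominates, which matches what we see on the bench.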
Early on, we discovered how difficult it is to benchmark tablets.
Benchmarking responsiveness with a camera is the easiest approach. Of course, normal cameras don't cut it, since they only shoot at roughly 30 FPS (29.97, per the NTSC standard). That's unacceptable if you're trying to measure precise time differences. Going the stopwatch route is no better, due to human error. As a result, we're using a 1000 FPS high-speed camera to measure performance. Since one frame equals one millisecond, it's possible to measure timing with a high degree of accuracy.
Boot and launch times aren't as important as input lag in our view. However, we're defining input lag a bit differently from the way it's used in discussions of display technology. Our focus is more on real-world usability. As such, we define input lag as "the time between pressing a key and text appearing on-screen." This tells you how quickly a tablet registers an action. Ideally, you want low input lag so that the tablet doesn't feel like it's stuttering as you type or tap buttons. The average college student has a reaction time of about 200 milliseconds to visual stimuli, so that's the target we aim for.
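Putting the two ideas together, the 1000 FPS capture and the key-press-to-glyph definition, lag falls out of simple frame counting. The function below sketches that arithmetic (the frame numbers in the example are hypothetical), keeping in mind that the result is quantized to plus or minus one frame on each end:

```python
def input_lag_ms(press_frame, render_frame, fps=1000):
    """Milliseconds between the frame where the key bottoms out and the
    frame where the character first appears on-screen."""
    return (render_frame - press_frame) * (1000.0 / fps)

# Hypothetical reading: key contact at frame 3120, glyph visible at frame 3245.
lag = input_lag_ms(3120, 3245)
print(lag)  # 125.0 ms, comfortably under the 200 ms reaction-time target
```

At 1000 FPS each frame is exactly one millisecond, so the frame delta is the latency; a 30 FPS camera would quantize the same measurement to 33 ms steps.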
Earlier this year, the lab overhauled the process of evaluating Wi-Fi performance. For background information, check out page 10 of Acer Iconia Tab A500: A Tablet With Honeycomb 3.1. Moving forward, I'm going to focus mostly on throughput, which is why I've elected to exclude response time scores. Generally, these two metrics go hand in hand, so I feel that it's somewhat redundant to publish both.
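For reference, throughput itself is just transferred bits over elapsed time. A minimal helper (my own illustration, not the lab's tooling) that converts a timed transfer into megabits per second:

```python
def throughput_mbps(bytes_transferred, seconds):
    """Average throughput in megabits per second for a timed transfer."""
    return bytes_transferred * 8 / seconds / 1_000_000

# e.g., a 125 MB file copied over Wi-Fi in 10 seconds:
print(throughput_mbps(125_000_000, 10))  # 100.0
```

Response time (latency) is a separate quantity, but on a local wireless link the two usually degrade together, which is why publishing both felt redundant.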