Display Testing Explained: How We Test PC Monitors

Brightness, Contrast and Calibration

Brightness and contrast are, in our opinion, the two most important factors in perceived image quality.

High contrast versus low contrast

Higher contrast ratios are preferable: the lower a display's contrast ratio, the more washed out the picture appears. Based on the data we’ve collected over the past eight years, we’ve settled on 1,000:1 as a benchmark for PC monitors; most can come close to or slightly exceed this value. (For comparison, HDTVs usually post far higher contrast numbers, some as high as 20,000:1.) To arrive at the final result, we simply divide the maximum white level by the minimum black level. The best monitor in a comparison group is the one with the highest contrast ratio.
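As a minimal sketch, the calculation reduces to a single division (the luminance readings below are hypothetical, not measurements from any particular monitor):

```python
def contrast_ratio(white_cd_m2: float, black_cd_m2: float) -> float:
    """Divide the maximum white level by the minimum black level."""
    return white_cd_m2 / black_cd_m2

# Hypothetical readings: 320 cd/m2 full white, 0.30 cd/m2 full black
print(f"{contrast_ratio(320.0, 0.30):,.0f}:1")  # 1,067:1 -- near the 1,000:1 benchmark
```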

To read about that concept in depth, please check out Display Calibration 201: The Science Behind Tuning Your Monitor for a brief treatise on imaging science.

Our tests begin with the panel in its factory default configuration. Before making any color adjustments whatsoever, we measure 0-100% brightness signals at both ends of the brightness control range.

Uncalibrated: Maximum Backlight Level

With the display's brightness control turned all the way up, this test uses full-field white and black patterns to measure white level, black level and contrast. The contrast ratio is calculated by dividing the white level by the black level (CR = W / B). We do not raise the contrast control past the clipping point. While doing this would increase a monitor’s light output, the brightest signal levels would not be visible, resulting in crushed highlight detail. Our numbers show the maximum light level possible with no clipping of the signal.

What we’re looking for in this test is whether the panel meets the manufacturer’s spec and if it’s bright enough for its intended use. For instance, we like to see lots of light from gaming monitors, especially those with a blur-reducing backlight strobe, which cuts output by at least half. Meanwhile, professional studio screens don’t need to be as bright; however, photographers who need to use them on location would consider a brighter screen to be a better fit.

  • Patterns used: Full White Field, Full Black Field
  • Monitor should meet or exceed manufacturer's stated maximum brightness value
  • Better displays will exceed 1,000:1 contrast

Uncalibrated: Minimum Backlight Level

For the minimum brightness tests, we turn the backlight to its lowest setting and measure the white and black field patterns again. There really isn’t a better or worse result here. We believe 50 cd/m2 is a practical lower limit. Anything under that results in a dim picture that can cause eye fatigue even in a completely dark room. The purpose of this test is to see if the monitor’s contrast ratio remains constant throughout the entire luminance range. Some monitors become too dark for practical use. In those cases, we suggest a minimum setting for the brightness control that results in 50 cd/m2 output.

  • Patterns used: Full White Field, Full Black Field
  • The minimum white level should be at or close to 50 cd/m2
  • Contrast ratio should remain the same regardless of the backlight setting
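The two checks above can be sketched as follows, using a hypothetical table of white/black readings taken at several backlight settings (all values are illustrative): contrast should hold steady across the range, and we flag the lowest setting that still produces at least 50 cd/m2.

```python
# Hypothetical white/black readings (cd/m2) at backlight settings 0-100
readings = {0: (31.0, 0.031), 25: (92.0, 0.092), 50: (168.0, 0.168),
            75: (248.0, 0.248), 100: (321.0, 0.321)}

# Contrast ratio should remain roughly constant across the luminance range
ratios = [white / black for white, black in readings.values()]
assert max(ratios) / min(ratios) < 1.05  # within 5% of one another

# Lowest backlight setting whose white level is still at or above 50 cd/m2
min_usable = min(s for s, (white, _) in readings.items() if white >= 50.0)
print(min_usable)  # 25
```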

After Calibration to 200 cd/m2

Since we consider 200 cd/m2 to be an ideal point for peak output, we calibrate all of our test monitors to that value. In a room with some ambient light, like an office, this brightness level provides a sharp, punchy image with maximum detail and minimal eye fatigue. On some monitors, 200 nits is also the sweet spot for gamma and grayscale tracking.

In a dark room, many professionals prefer a 120 cd/m2 calibration. If a monitor’s contrast is consistent, this makes little to no difference to the calibrated black level and contrast measurements.

Calibration often reduces contrast slightly. If we measure a significant difference, we weigh the reduction against the improvement in color accuracy. A few monitors are color-accurate without adjustment and, therefore, best left uncalibrated to maximize contrast.

  • Patterns used: Full White Field, Full Black Field
  • Calibrated contrast should be as close as possible to uncalibrated contrast
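To quantify the trade-off described above, we can express the post-calibration contrast loss as a percentage (the before/after ratios here are hypothetical examples, not results for any specific monitor):

```python
def contrast_change_pct(uncal_cr: float, cal_cr: float) -> float:
    """Percent of contrast lost after calibration (positive = reduction)."""
    return (uncal_cr - cal_cr) / uncal_cr * 100.0

# Hypothetical: 1,050:1 uncalibrated drops to 980:1 after grayscale adjustment
loss = contrast_change_pct(1050.0, 980.0)
print(f"{loss:.1f}% contrast reduction")  # 6.7% contrast reduction
```

A small loss like this is usually worth taking for a visible gain in color accuracy; a much larger one argues for leaving the monitor uncalibrated.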

ANSI Contrast Ratio

Another important gauge of contrast is ANSI. To perform this test, we measure a checkerboard pattern of 16 alternating 0% and 100% squares. This is somewhat more real-world than on/off readings because it tests a display’s ability to simultaneously maintain low black and full white levels, factoring in screen uniformity. The average of the eight full-white measurements is divided by the average of the eight full-black measurements to arrive at the ANSI result.

The ANSI pattern is designed to test intra-image contrast. Its overall average level is 50 percent, representing a typical picture. It’s a good indicator of the quality of a display’s grid polarizer. That is the part most directly responsible for controlling light bleed between pixels. Even in the best monitors, the ANSI value is usually a little lower than the calibrated one.

  • Pattern used: Checkerboard (8 Full-White, 8 Full-Black)
  • Test performed after calibration to 200 cd/m2
  • The ANSI contrast ratio should be nearly equal to the on/off value
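The averaging step can be sketched in a few lines; the sixteen square readings below are hypothetical stand-ins for meter measurements:

```python
# Hypothetical readings from the 4x4 checkerboard: 8 white and 8 black squares
whites = [198.0, 201.0, 197.5, 202.0, 199.0, 200.5, 198.5, 200.0]  # cd/m2
blacks = [0.21, 0.22, 0.20, 0.23, 0.21, 0.22, 0.21, 0.22]          # cd/m2

# ANSI contrast: average of the white squares over average of the black squares
ansi_cr = (sum(whites) / len(whites)) / (sum(blacks) / len(blacks))
print(f"ANSI contrast: {ansi_cr:,.0f}:1")  # ANSI contrast: 928:1
```

Because the black squares sit next to full-white neighbors, light bleed pulls this number slightly below the on/off result, which is why even good monitors post a marginally lower ANSI figure.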