
Display Testing Explained: How We Test Monitors and TVs

Grayscale Tracking, Gamma Response and Color Gamut

Grayscale Tracking

Most monitors, especially newer models, display excellent grayscale tracking, even at stock settings. It's important that the color of white stay consistently neutral at every light level from darkest to brightest. Grayscale performance also affects the accuracy of the secondary colors: cyan, magenta, and yellow. Since computer monitors typically have no color or tint adjustment, accurate grayscale is key. Fortunately, grayscale is adjustable on nearly every computer monitor and HDTV; even the least-expensive products typically include a set of RGB sliders. HDTVs usually offer a high and low range, known as a two-point control, and a few have a 10-point adjustment.
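
To illustrate what a two-point control actually does, here is a simplified Python model (real displays apply these corrections in their internal video processing, so treat the numbers as hypothetical): the low-end sliders add an offset to each channel, and the high-end sliders scale it with a gain.

```python
def two_point_correct(signal, gain, offset):
    """Apply one channel's two-point white balance correction.
    Offsets shift the low end of the range; gains scale the high end.
    Signal and output are normalized to 0-1, clamped to stay in range."""
    return min(max(gain * signal + offset, 0.0), 1.0)

# Hypothetical calibration: the red channel runs slightly hot at the
# top of the range, so we pull its gain down and nudge the offset up.
red_gain, red_offset = 0.97, 0.01
for level in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"{level:.2f} -> {two_point_correct(level, red_gain, red_offset):.3f}")
```

Note how the offset dominates near black while the gain dominates near white, which is why the two adjustments interact less than a single slider would.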

Our standard is 6500 Kelvin (the D65 white point), which matches computer and video content mastered in the sRGB or Adobe RGB color space.

The chart shows RGB levels at every brightness point from zero to 100 percent. A perfect result places all the bars level at the center position. If one bar is higher than the others, that's the tint color you might see if the error is great enough. The Delta E graph shows the amount of error at each brightness point. It's generally accepted that Delta E values below three are invisible to the naked eye.

Depending on the monitor’s color temp and picture modes, we may show several charts to give you an idea of a product’s uncalibrated performance. We always calibrate when possible, even if the gains are small, to show you a display’s full potential.

When we compare monitors, the lowest average Delta E value comes out on top.

  • Patterns used: Gray Step, Gray Fields 0-100 percent
  • Lower average Delta E values mean more accurate grayscale tracking
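
As a rough sketch of how a grayscale Delta E number comes about, the Python snippet below converts a measured gray field from CIE XYZ to CIELAB and takes the CIE 1976 color difference against an ideal neutral gray at the same luminance. The measurement values are hypothetical, and calibration software typically applies the more elaborate Delta E 1994 or 2000 formulas; the principle is the same.

```python
import math

# D65 reference white in CIE XYZ, normalized so Y = 100.
D65 = (95.047, 100.0, 108.883)

def xyz_to_lab(xyz, white=D65):
    """Convert CIE XYZ to CIELAB relative to a reference white."""
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance in Lab space."""
    return math.dist(lab1, lab2)

# Hypothetical reading of an 80% gray field with a slight blue tint.
measured = xyz_to_lab((56.1, 59.0, 66.5))
# Ideal gray at the same luminance shares the reference white's chromaticity.
ideal = xyz_to_lab((95.047 * 0.59, 59.0, 108.883 * 0.59))
print(f"Delta E: {delta_e76(measured, ideal):.2f}")
```

Averaging this value across every step of the sweep gives the single figure we use to rank monitors.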

Gamma Response

Gamma describes the relationship between signal level and luminance at every step in the brightness range from 0 to 100 percent. It's important because poor gamma can either crush detail at various points or wash it out, making the entire picture appear flat and dull. Correct gamma produces a more three-dimensional image, with a greater sense of depth and realism. Incorrect gamma can degrade image quality even in monitors with high contrast ratios.

In our gamma charts, the yellow line represents a 2.2 power function, the most widely used standard for television, film, and computer graphics production. The closer the white measurement trace comes to 2.2, the better.

In professional monitor and HDTV reviews, we also test for compliance with the BT.1886 gamma standard. Introduced in 2011, it is slowly replacing the 2.2 power function for film and television content. The differences are subtle: in practice, BT.1886 renders a bit more shadow detail and greater contrast in mid-tones and highlights.

To compare displays, we chart the gamma tracking range (the difference between the highest and lowest measured values) and the percentage of deviation from the standard.

  • Patterns used: Gray Step, Gray Fields 0-100 percent
  • Computer application standard: 2.2 power function
  • Video content standard: BT.1886
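
To make the two targets concrete, here is a minimal Python sketch (with made-up luminance numbers) showing how an effective gamma value is derived from a single gray-step measurement, alongside the BT.1886 reference curve, which folds the display's actual black and white levels into the target:

```python
import math

def point_gamma(signal, luminance, white_luminance):
    """Effective gamma at one gray step: solve L/Lw = signal**gamma
    for gamma, given a normalized signal level (0-1)."""
    return math.log(luminance / white_luminance) / math.log(signal)

def bt1886_eotf(signal, lw=200.0, lb=0.1):
    """BT.1886 reference EOTF for a display with white level lw and
    black level lb, both in cd/m^2. Returns target luminance."""
    a = (lw ** (1 / 2.4) - lb ** (1 / 2.4)) ** 2.4
    b = lb ** (1 / 2.4) / (lw ** (1 / 2.4) - lb ** (1 / 2.4))
    return a * max(signal + b, 0.0) ** 2.4

# Hypothetical sweep point: 50% gray measures 43 cd/m^2 on a display
# calibrated to 200 cd/m^2 peak white.
print(f"gamma at 50%: {point_gamma(0.5, 43.0, 200.0):.2f}")
print(f"BT.1886 target at 50%: {bt1886_eotf(0.5):.1f} cd/m^2")
```

Because BT.1886 depends on the measured black level, the same signal sweep produces slightly different targets on a high-contrast panel than on a washed-out one, which is exactly the behavior the standard was designed to capture.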

Color Gamut And Performance

Color gamut is measured using a saturation sweep that samples the six main colors (red, green, blue, cyan, magenta, and yellow) at five saturation levels (20, 40, 60, 80, and 100 percent). This provides a more realistic view of color accuracy than measuring only the fully saturated primaries and secondaries.

Like the grayscale tests, we show color charts that illustrate each product’s different picture modes, as well as a final one with calibration results. It’s easy to see in the example which colors are closer to or further from their targets. The goal is to have the dots within the squares.

The luminance chart shows a third dimension of color, which is how bright it should be. If a bar is above zero, that color is too bright. Below the line is too dark. An ideal chart would show no bars at all!

The final Delta E value is calculated from the CIE and luminance results. Obviously, lower is better. You can track the accuracy of all six colors at five different saturation levels on each of the three charts.

  • Patterns used: Color Bars; Red, Green, Blue, Cyan, Magenta, and Yellow Full-Fields
  • Lowest average Delta E error means more accurate color
  • Standards: sRGB, Rec.709, Adobe RGB, DCI-P3 and Rec.2020 where appropriate
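
For a sense of where the intermediate targets come from, the Python sketch below approximates a saturation sweep for sRGB red by interpolating from the white point toward the 100% primary in CIE xy. This straight-line interpolation is a simplification (the standards define intermediate targets more rigorously, in a perceptually uniform space), but it conveys the idea:

```python
# D65 white point and the sRGB red primary in CIE xy chromaticity coordinates.
WHITE = (0.3127, 0.3290)
SRGB_RED = (0.640, 0.330)

def sweep_targets(primary, white=WHITE, levels=(0.2, 0.4, 0.6, 0.8, 1.0)):
    """Approximate saturation-sweep targets by linearly interpolating
    from the white point toward the 100% primary in xy space."""
    return [(white[0] + s * (primary[0] - white[0]),
             white[1] + s * (primary[1] - white[1])) for s in levels]

for pct, (x, y) in zip((20, 40, 60, 80, 100), sweep_targets(SRGB_RED)):
    print(f"{pct:3d}%: x={x:.4f}, y={y:.4f}")
```

Each measured dot on our CIE charts is compared against targets like these; the squares on the chart mark the tolerance around each one.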

Gamut Volume: Adobe RGB 1998 And sRGB

Gamut volume is a specification used by the photo industry to benchmark monitors. It's just another way to measure color accuracy, so we include the metric in addition to the gamut tests above. Manufacturers usually quote a volume figure in their specs, such as 100 percent of sRGB or 70 percent of Adobe RGB. Our test simply verifies that claim.

Using CalMAN and QuickMonitorProfile, we create an ICC profile from our measurements. Then we use Gamutvision to calculate the volume with respect to both the sRGB and Adobe RGB gamuts.

  • ICC profile generated with QuickMonitorProfile using x & y coordinates from our gamut measurements
  • Rendered percentage of sRGB and Adobe RGB 1998 gamuts
  • Standard: 100 percent
  • Chetou
    Thank you for this writeup. Though I do find 200 cd/m2 retina scorching on any monitor and viewing conditions, especially with large screens.

    Do you take your measurements in a dimmed/dark room? Even with the meter flush with screen, some light can pass through the glass on the sides.
    Reply
  • cpm1984
    Gotta ask about your 'total lag' measurements - they seem to be generally much higher than what TFT Central measures. For example, the Dell UP3214Q has 97ms total lag with Toms':
    http://www.tomshardware.com/reviews/benq-pg2401pt-24-inch-monitor,3848-10.html

    but it has only 29ms lag at TFT Central (and only 25ms in 'game mode'):
    http://www.tftcentral.co.uk/reviews/dell_up3214q.htm

Most of the monitors you review seem to have total lag around 80-100ms, which seems really slow. Slow enough that you'd notice the mouse lagging when you move it around. Yet I can't feel any appreciable lag on my Dell 2713HM (but I have no way of measuring....)
    Reply
  • ceberle
    14105809 said:
    Gotta ask about your 'total lag' measurements - they seem to be generally much higher than what TFT Central measures. For example, the Dell UP3214Q has 97ms total lag with Toms':
    http://www.tomshardware.com/reviews/benq-pg2401pt-24-inch-monitor,3848-10.html

    but it has only 29ms lag at TFT Central (and only 25ms in 'game mode'):
    http://www.tftcentral.co.uk/reviews/dell_up3214q.htm

    Most of the monitors you review seems to have total lag around 80-100ms, which seems really slow. Slow enough that you'd notice the mouse lagging when you move it around. Yet I can't feel any appreciable lag on my Dell 2713HM (but I have no way of measuring....)

    TFT Central uses the SMTT software which measures only the actual display lag. Our test indicates the total time it takes for a user input to translate to the screen. We account for the lag inherent in the input source which in our case is a pattern generator. While this device may be slower than the average mouse or keyboard it is completely consistent and repeatable. If we used a mouse-to-PC test, timing could shift if a driver were updated or we changed video boards. Every monitor we've measured over the past two years has been tested with the same Accupel signal generator.

    Our principal goal is consistency from monitor to monitor. By having the exact same equipment and test parameters, we can ensure that results from last year are comparable to more recent tests.

    -Christian-
    Reply
  • KevinAr18
The response time testing method is done wrong and is very misleading. The tests do not always reflect the real response of the monitors.

    Why?
Monitors have thousands of different response times, not a single number. Testing just one or two transitions can easily give very inaccurate results; the one transition you tested may be much slower or faster than the other thousand or more transitions that exist. In order to find out if the monitor is any good, you must test a wide range of transitions.

    Want proof?
    See:
    http://www.xbitlabs.com/articles/monitors/display/viewsonic-fuhzion-vx2268wm_4.html#sect1
In "standard" mode, the black to white transition is low, while almost all the others are high. (Note: the 255 to 0 transition is hidden behind the higher ones.) If you had tested this monitor using your current method, you would have concluded it is VERY fast, when, in fact, "standard" mode is actually slow! This is how they used to lie on the box specs: by using only a single number that is fast, while the rest are slow... and your tests only support the lie!


    Examples of how to test response times:
Xbitlabs tests a range of 72 transitions spread out evenly; they also show the overshoot error rate for all those transitions:
    http://www.xbitlabs.com/articles/monitors/display/samsung-sa850_8.html#sect0
    TftCentral tests 30 transitions & the corresponding overshoot errors:
    http://www.tftcentral.co.uk/reviews/content/benq_bl3200pt.htm#gaming


    If you want, compare your tests of the BenQ BL3200 to tftcentral:
    http://www.tomshardware.com/reviews/benq-bl3200pt-qhd-monitor,3898-9.html
    The tftcentral tests show that the monitor has response time problems of 41ms for the 0-50 transition, but better numbers (6-10ms) otherwise. Your single number tests do not reveal any of these issues.
    Reply
  • KevinAr18
    Sorry for the aggressive tone in my previous comment. I should have taken more time to write it up nicely.

This issue with response time testing has always been a problem on tomshardware (even before you ever started writing reviews here), however this is the first time I got a good chance to contact the person that writes the articles. I hope you will be able to look into the response time tests at some point. Sadly, there are literally only two sites on the internet (that I know of) that test response times correctly: xbitlabs & tftcentral. In fact, even tftcentral used to test response times wrong (for many years). xbitlabs was the original site I know of that began testing response times correctly.

    I apologize for not being able to write in more detail right now, but here's two helpful links:
    Response times:
    http://www.xbitlabs.com/articles/monitors/display/lcd-testmethods_5.html
    Overshoot errors:
    http://www.xbitlabs.com/articles/monitors/display/lcd-testmethods_6.html
    Reply
  • dovah-chan
You can't really benchmark response times with full accuracy, since they vary among monitors much like overclocking headroom varies among CPUs, even of the same model.

    Also thanks for this article. It's n-not like I was one of the people who requested it. I'm just glad to see that we are listened to. >__<
    Reply
  • Drejeck
    I sense a disturbance in the force.
    Is some monitor being reviewed right now? Hope it's a G-sync near 300 euros
    Reply
  • Blazer1985
    Hi christian!
    I have a request :-)
    Would it be possible for you to compare some cheap chinese tv and monitors to the expensive western branded ones?
    I know that you usually get what you pay for but there might be a sweet spot somewhere in between and would be nice to know :-)
    Especially today that we are beginning the switch to 4k and the prices vary so much (ex. Haier 50" uhd @600€).
    Thanks!
    Reply