Results: Grayscale Tracking and Gamma Response
The majority of monitors, especially newer models, display excellent grayscale tracking (even at stock settings). It’s important that the color of white be consistently neutral at all light levels from darkest to brightest. Grayscale performance impacts color accuracy with regard to the secondary colors: cyan, magenta, and yellow. Since computer monitors typically have no color or tint adjustment, accurate grayscale is key.
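Grayscale errors in reviews like this are usually reported as Delta E values. As a rough sketch, the simplest formulation (CIE76) is just the Euclidean distance between the measured patch and its target in CIELAB space; the readings below are hypothetical, chosen only to illustrate the scale:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings: a white patch measured against its D65-referenced target.
target = (95.0, 0.0, 0.0)     # ideal neutral gray at this luminance level
measured = (95.0, 1.2, -2.0)  # slight blue tint (negative b*)

error = delta_e_76(measured, target)
print(f"Delta E: {error:.2f}")  # ~2.33, under the commonly cited visibility threshold of 3
```

Note that modern measurement tools often use the more perceptually uniform CIE94 or CIEDE2000 formulas, but the idea is the same: smaller numbers mean a more neutral white.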
The BL3200PT comes out of the box set to the Standard picture mode, which, as you can see, has pretty good grayscale accuracy. None of the errors are visible, even at 100-percent brightness. In fact, reducing the Contrast control a couple of clicks cleans up that level.
BenQ markets this display as a CAD/CAM monitor, and it has a picture preset with a corresponding designation. We’re including it in our results so you can compare the differences. Obviously, the color temperature is a bit cooler than D65, though not significantly so. It gives the impression of a little extra brightness without actually increasing the contrast or backlight levels. In this preset, the color temp and gamma settings cannot be changed. The average error is 3.68 Delta E.
For the most accurate grayscale performance, select the User mode and adjust the RGB sliders. We also lowered the Contrast to 45 to achieve almost perfect results across the board.
Here is our comparison group:
The BL3200PT’s out-of-box grayscale accuracy in Standard mode is comfortably under the visible level of three Delta E. That bodes well for the majority of users who won’t be calibrating their monitor.
If you plan to use this display for color-critical work, an OSD calibration produces professional-quality results. An average error of less than one Delta E puts the BenQ in elite territory.
Gamma is the measurement of luminance levels at every step in the brightness range from 0 to 100 percent. It's important because poor gamma can either crush detail at various points or wash it out, making the entire picture appear flat and dull. Correct gamma produces a more three-dimensional image, with a greater sense of depth and realism. Meanwhile, incorrect gamma negatively affects image quality, even in monitors with high contrast ratios.
In the gamma charts below, the yellow line represents 2.2, which is the most widely used standard for television, film, and computer graphics production. The closer the white measurement trace comes to 2.2, the better.
Gamma is the only area where the BL3200PT could use improvement. The default Gamma 3 setting results in an average value of over 3.0, which is much too dark for typical content and puts the trace off our chart. Video and gaming look flat and dull with poor detail. Changing to Gamma 1 improves image quality significantly. The trace is still a little darker than 2.2, but it’s pretty close.
High gamma values can make a monitor appear to have greater contrast, but this display doesn’t need any help in that department.
If you use the CAD/CAM preset, the gamma control is locked to a very high setting. The graph above shows what you end up with. While this may work fine for industrial design and CAD applications, it is not practical for use with typical computing tasks or entertainment.
Here is our comparison group again:
Even though the gamma runs a tad dark, it tracks reasonably well. Thanks to a tremendous amount of available contrast, the BL3200PT’s image pops better than the TN or IPS displays we’ve seen run through our labs lately.
We calculate gamma deviation by simply expressing the difference from 2.2 as a percentage.
The average gamma value is 2.34. Obviously, that's a little darker than 2.2. The error is fairly consistent from 10- to 100-percent brightness and runs to a maximum of 5.5 cd/m². We still love the BL3200PT's image quality, but we wish there were one more gamma preset to match the 2.2 standard.
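The deviation calculation described above can be reproduced directly from the review's numbers:

```python
# Gamma deviation expressed as the percentage difference from the 2.2 standard,
# using the measured average of 2.34 reported for the BL3200PT.
STANDARD_GAMMA = 2.2
measured_gamma = 2.34

deviation_pct = (measured_gamma - STANDARD_GAMMA) / STANDARD_GAMMA * 100
print(f"Gamma deviation: {deviation_pct:.2f}%")  # ~6.36% darker than the 2.2 target
```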