The majority of monitors, especially newer models, display excellent grayscale tracking (even at stock settings). It’s important that the color of white be consistently neutral at all light levels from darkest to brightest. Grayscale performance impacts color accuracy with regard to the secondary colors: cyan, magenta, and yellow. Since computer monitors typically have no color or tint adjustment, accurate grayscale is key.

The BL3200PT comes out of the box set to the Standard picture mode, which, as you can see, has pretty good grayscale accuracy. None of the errors are visible, even at 100-percent brightness. In fact, reducing the Contrast control a couple of clicks cleans up that level.

BenQ markets this display as a CAD/CAM monitor, and it has a picture preset with a corresponding designation. We’re including it in our results so you can compare the differences. Obviously, the color temperature is a bit cooler than D65, though not significantly so. It gives the impression of a little extra brightness without actually increasing the contrast or backlight levels. In this preset, the color temp and gamma settings cannot be changed. The average error is 3.68 Delta E.

For the most accurate grayscale performance, select the User mode and adjust the RGB sliders. We also lowered the Contrast to 45 to achieve almost perfect results across the board.
Here is our comparison group:

The BL3200PT’s out-of-box grayscale accuracy in Standard mode is comfortably under the visible level of three Delta E. That bodes well for the majority of users who won’t be calibrating their monitor.

If you plan to use this display for color-critical work, an OSD calibration produces professional-quality results. An average error of less than one Delta E puts the BenQ in elite territory.
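For readers curious what those Delta E figures represent: the review doesn't specify which Delta E formula it uses, but the simplest version, CIE76, is just the Euclidean distance between a measured color and its reference in CIELAB space. The Lab values below are made-up illustrations, not our lab measurements.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical example: a slightly off-white measurement vs. the reference white.
# Errors under 3 are generally considered invisible; under 1 is elite territory.
measured = (96.2, 0.8, -1.5)    # L*, a*, b* (invented values)
reference = (100.0, 0.0, 0.0)
print(round(delta_e_76(measured, reference), 2))  # 4.16 -- a visible error
```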
Gamma Response
Gamma is the measurement of luminance levels at every step in the brightness range from 0 to 100 percent. It's important because poor gamma can either crush detail at various points or wash it out, making the entire picture appear flat and dull. Correct gamma produces a more three-dimensional image, with a greater sense of depth and realism. Meanwhile, incorrect gamma negatively affects image quality, even in monitors with high contrast ratios.
In the gamma charts below, the yellow line represents 2.2, which is the most widely used standard for television, film, and computer graphics production. The closer the white measurement trace comes to 2.2, the better.
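As a rough sketch of what a power-law gamma curve means (this is an illustration, not the review's measurement method): relative luminance at each signal level is the normalized level raised to the gamma exponent, so a higher gamma pushes midtones darker.

```python
def relative_luminance(signal, gamma=2.2):
    """Map a normalized signal level (0.0 black to 1.0 white) to relative luminance."""
    return signal ** gamma

# A 50-percent signal lands well below 50-percent luminance at gamma 2.2,
# and darker still at gamma 3.0 -- which is why an overly high gamma looks dim.
for g in (2.2, 3.0):
    print(g, round(relative_luminance(0.5, g), 3))
```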

Gamma is the only area where the BL3200PT could use improvement. The default Gamma 3 setting results in an average value of over 3.0, which is much too dark for typical content and puts the trace off our chart. Video and gaming look flat and dull with poor detail. Changing to Gamma 1 improves image quality significantly. The trace is still a little darker than 2.2, but it's pretty close.
High gamma values can make a monitor appear to have greater contrast, but this display doesn’t need any help in that department.

If you use the CAD/CAM preset, the gamma control is locked to a very high setting. The graph above shows what you end up with. While this may work fine for industrial design and CAD applications, it is not practical for use with typical computing tasks or entertainment.
Here is our comparison group again:

Even though the gamma runs a tad dark, it tracks reasonably well. Thanks to a tremendous amount of available contrast, the BL3200PT’s image pops better than the TN or IPS displays we’ve seen run through our labs lately.
We calculate gamma deviation by simply expressing the difference from 2.2 as a percentage.

The average gamma value is 2.34. Obviously, that's a little darker than 2.2. The error is fairly consistent from 10- to 100-percent brightness and runs to a maximum of 5.5 cd/m². We still love the BL3200PT's image quality, but wish there were one more gamma preset to match the 2.2 standard.
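The deviation figure is trivial arithmetic; this sketch plugs in the 2.34 average reported above, and the formula is our reading of "the difference from 2.2 as a percentage."

```python
TARGET_GAMMA = 2.2

def gamma_deviation(measured, target=TARGET_GAMMA):
    """Express the difference from the target gamma as a percentage of the target."""
    return (measured - target) / target * 100

# The BL3200PT's measured average of 2.34 works out to about 6.4 percent.
print(round(gamma_deviation(2.34), 2))
```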
- A 32-Inch QHD AMVA Monitor
- Packaging, Physical Layout and Accessories
- OSD Setup and Calibration
- Measurement and Calibration Methodology: How We Test
- Results: Brightness and Contrast
- Results: Grayscale Tracking and Gamma Response
- Results: Color Gamut and Performance
- Results: Viewing Angles and Uniformity
- Results: Pixel Response and Input Lag
- BenQ BL3200PT: Bigger Is Better
I can't understand why I would need a monitor with lower pixel density? Why not just zoom the text a notch in your word processor or whatever software you are using? Of two otherwise similar monitors I would always choose the one with higher PPI, even if I used it only for word processing.
The days of 60Hz are almost over with..
Except that the Swift costs $800
That's why I don't understand people saying 1080p is crap and has to go away. I've always found that even at 1080p, the fonts are really small, and icons and interfaces in general are very tiny. In my case, it's not even a case of not being able to read, it's just that everything looks so out of place and hideous, like Windows wasn't meant for such resolutions.
I can't imagine 1440p. Must be ridiculous to look at. It's just aesthetically not nice.
Bring on the downvotes...
What is Active Sync?
It's not $1000 though...
Part of the reason people do comes down to, one, the pixel density (if that matters) and, two, the GPU horsepower necessary to run it. 4K panels are cool, but I don't game on one at all. I have one, but it isn't my go-to monitor due to the low refresh rate, lag, and blur. Is it pretty? Sure. But honestly, right now that 28" 4K panel is dumb as a post.
I'm always amazed how most people don't know you can adjust the size of pretty much every font inside of Windows. I've had people lower the resolution of the screen and see everything blurred until I showed them that you can adjust the font sizes.
But for TH to make a comment like that? Did BenQ's marketing department send you the text ready-made?
I can't understand why I would need a monitor with lower pixel density? Why not just zoom the text a notch in your word processor or whatever software you are using? Of two otherwise similar monitors I would always choose the one with higher PPI, even if I used it only for word processing.
It's not so much your apps that are the concern, because yes, most of them will give you some scaling options. The issue is that Windows does not scale very far. Your UI (icon text, folder names, Windows Explorer stuff) will be smaller at higher PPI.
That's why I don't understand people saying 1080p is crap and has to go away. I've always found that even at 1080p, the fonts are really small, and icons and interfaces in general are very tiny. In my case, it's not even a case of not being able to read, it's just that everything looks so out of place and hideous, like Windows wasn't meant for such resolutions.
I can't imagine 1440p. Must be ridiculous to look at. It's just aesthetically not nice.
Bring on the downvotes...
Windows 7/8/8.1 has gui scaling as does MacOSX. Non issue.
I'm always amazed how most people don't know you can adjust the size of pretty much every font inside of Windows. I've had people lower the resolution of the screen and see everything blurred until I showed them that you can adjust the font sizes.
But for TH to make a comment like that? Did BenQ's marketing department send you the text ready-made?
I am one of the people to whom 1080p @ 24" renders things hard to see (not exclusive to text, mind you).
I am fully aware of Windows' high-DPI settings. But let me tell you, unless the applications you are running have good built-in support for it, Windows' high-DPI is not going to be a magic bullet.
You have 2 options: Win XP's high-DPI which will increase font size and leave every GUI element on screen looking highly unbalanced, OR the newest method that scales up the canvas surface upon which everything was rendered before "printing" it on screen, in which case you will also end up with blurriness.
Trust me on this. I have tried using high-DPI for extended periods of time, not just toggled it on and off so I could tell myself it's there and pretend it works fine. Unless you have a real disability like me though, you may have a hard time understanding where I'm coming from... so no hard feelings.
Basically, the sharpness of a glossy (or anti-reflective, just not anti-glare) high-DPI monitor is amazing; I just can't get over that... I don't understand why the market is moving away from that...
By the way, is there any monitor you can recommend that has these specs? And one that is more than 60Hz?