Acer Predator XB321HK 32-inch Ultra HD G-Sync Monitor Review

Grayscale Tracking And Gamma Response

Our grayscale and gamma tests are described in detail here.

One might look at the default grayscale tracking chart and conclude the XB321HK is ready to go without adjustment. That is mostly true. Certainly in this test, there’s no concern. RGB levels are almost ruler-flat from bottom to top. This is the result we’d expect from a professional monitor.

Our contrast and gamma changes alter the landscape a little at the 100% brightness level. There we see a slight blue reduction on the chart, but the error is still below the visible point. Things are still looking quite good for our test subject.

Here is our comparison group.

The XB321HK would beat out most professional displays in the out-of-box grayscale test, yet there is no evidence of a factory-certified calibration here. Acer is simply using a good panel part that’s been engineered properly.

We don’t often record a worse grayscale result after calibration, but 0.85dE is still a super-low error level. The change is due to the 100% brightness point, which has shifted thanks to our gamma and contrast adjustments.

Gamma Response

Now we get to the heart of the matter. This is what the XB321HK’s gamma looks like by default. You’d think some sort of dynamic contrast was at work here, but that is not the case. It’s simply a result of the contrast slider being set too high. In real-world content you’ll see a distinct lack of definition in brighter material and highlight detail will disappear.

After changing only the contrast slider, gamma greatly improves. At least now we’re getting close to the mark. It still looks a little too dark, however. We wish there were a 2.0 setting, but there isn’t.

We weren’t sure which chart we liked better, this one or the previous. To the eye, it’s a matter of preference. When gamma is set to 1.8, the image is a little brighter. One might leave it to personal preference, but after you see the color gamut results on the next page, the choice becomes clear.

Here is our comparison group again.

Tracking isn’t quite ruler-flat like the others, so the XB321HK finishes last in this test. A 0.5 range of values isn’t too bad, but at this price point, it should be better.

We calculate gamma deviation by simply expressing the difference from 2.2 as a percentage.
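As a minimal sketch of that calculation (our wording of the method, not the lab's actual tooling; the 2.31 input is a hypothetical average):

```python
def gamma_deviation(measured_avg, target=2.2):
    """Deviation from the 2.2 gamma standard, expressed as a percentage."""
    return abs(measured_avg - target) / target * 100

# Hypothetical example: a monitor averaging 2.31 gamma
print(round(gamma_deviation(2.31), 2))  # 5.0
```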

Gamma is a compromise with the XB321HK, although the 1.8 setting is better for color accuracy as you’ll see on the next page. It also comes closer to the 2.2 standard. It seems four of the screens could use a tweak in this department. Only the XB271HK and XB2700-4K hit the mark squarely.

Comments
  • pjc6281
So for $1,100-1,200 you get 60Hz. That is a bit alarming.
  • Bartendalot
The note about 4K@60Hz being obsolete soon is a valid one and makes the purchase price even more difficult to swallow.

    I'd argue that my 1440p@144hz is a more future-proof investment.
  • Yaisuah
I have this monitor and think it's great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4K that looks great and runs great, and I think it would be considered 3K. Try adding 2880x1620 to your resolutions and see how it looks on any 4K monitor. I run Windows and most less intensive games at this resolution and constantly get 60fps with a 970 (around 30fps at 4K). You also don't have to mess with Windows scaling on a 32-inch monitor. After seeing how great 3K looks and runs, I really don't know why everyone immediately jumped to 4K.
  • cknobman
    Nvidia G-Sync = high price
  • mellis
I am still going to wait before getting a 4K monitor, since there is still no practical solution for 4K gaming. In a couple more years, hopefully, 4K monitors will be cheap and midrange GPUs will be able to support gaming on them. I think trying to invest in 4K gaming now is a waste of money. Sticking with 1080p for now.
  • truerock
    A 4K@120Hz G-Sync video monitor based PC rig under $4,000 is probably 2 to 3 years away.
  • RedJaron
Yaisuah said:
    I have this monitor and think its great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4k that looks great and runs great and I think it would be considered 3k. Try adding 2880 x 1620 to your resolutions and see how it looks on any 4k monitor. I run windows and most less intensive games at this resolution and constantly get 60fps with a 970(around 30fps at 4k). You also don't have to mess with windows scaling on a 32in monitor. After seeing how great 3k looks and runs, I really don't know why everyone immediately jumped to 4k.
    I would hazard a few guesses. First would be that 2880x1620 is so close to 2560x1440 that no manufacturer wants to complicate product lines like that.

Second, 2160 is the least common multiple of 720 and 1080, meaning it's the lowest resolution that's a perfect integer multiple of both. So with proper upscaling, a 720 or 1080 source picture can be displayed reasonably well on a 4K display. These panels are made for TVs as well as computer monitors, and the majority of TV signal (at least in the US) is still in either 720p or 1080p. Upscaling 1080 to 1620 is the same as upscaling 720 to 1080 (they're both a factor of 150%). Upscaling by non-integer factors means you need a lot of pixel interpolation and anti-aliasing. To me, this looks very fuzzy (I bought a 720p TV over a 1080p TV years ago because playing 720p PS3 games and 720p cable TV on a 1080p display looked horrible to me). So it may be that the powers that be decided on the 4K resolution so that people could adopt the new panels and still get decent picture quality with the older video sources (at least until, or if, they get upgraded). If so, I can agree with that.
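The divisibility argument above can be sanity-checked in a few lines (a quick illustrative sketch; the ratios shown are per-axis scaling factors):

```python
from math import lcm  # requires Python 3.9+

# 2160 is the least common multiple of 720 and 1080, so a UHD panel
# scales both legacy sources by a clean integer factor per axis.
print(lcm(720, 1080))   # 2160
print(2160 // 1080)     # 2 -> each 1080p pixel maps to a 2x2 block of UHD pixels
print(2160 // 720)      # 3 -> each 720p pixel maps to a 3x3 block
print(1620 / 1080)      # 1.5 -> non-integer, so 1080p on a 1620 panel needs interpolation
```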
  • michalt
    I have one and have not regretted my purchase for a second. I tend to keep monitors for a long time (my Dell 30 inch displays have been with me for a decade). When looked at over that time period, it's not that expensive for something I'll be staring at all day every day.
  • photonboy
It would be nice to offer a GLOBAL FPS LOCK to stay in adaptive-sync mode at all times regardless of how high the FPS gets.
  • photonboy
    Update: AMD has this, not sure where it is for NVidia or if the GLOBAL FPS LOCK is easy to do.
  • gulvastr
    RivaTuner has a great global FPS lock option. I set mine at 95 FPS on my X34 and stay in Gsync at all times.
  • Sam Hain
    4K at 60Hz for $1k+ does not "compute" and is not a one size fits all for all users; speaking for gaming here. Graphics applications and photo/video editing, it's all yours!

    A 1440p 120Hz/144Hz gaming monitor of your particular flavor (IPS, TN, AMVA/Flat, Curved/Single, Multiple, etc; You get the idear folks) is of a more logical and performance oriented enterprise for the foreseeable future for gaming and has been since the first 4K display that debuted as a "gaming" monitor.

1440p gaming monitors traditionally come in at less than 4K "gaming" monitors. BETTER performance, no questions asked, and less taxing on your GPU(s). For those with higher-end GPUs, crank up those settings to Ultra and Max, ESPECIALLY with the GTX 1080 now in action and the soon-to-debut MONSTER, Pascal Titan X.

    No matter how you slice it, SLI it (your rig)... 4K at 60Hz is still 4K stuck at 60Hz. No thank you.
  • Sam Hain
Yaisuah said:
    I have this monitor and think its great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4k that looks great and runs great and I think it would be considered 3k. Try adding 2880 x 1620 to your resolutions and see how it looks on any 4k monitor. I run windows and most less intensive games at this resolution and constantly get 60fps with a 970(around 30fps at 4k). You also don't have to mess with windows scaling on a 32in monitor. After seeing how great 3k looks and runs, I really don't know why everyone immediately jumped to 4k.

It does not make sense to "downscale" from 4K to any resolution... I see what you are saying, but like RedJaron stated, it's so close to 1440p that it's negligible.

The optimal way for someone to use DSR is with a lower-res monitor going up... otherwise, it's money ill spent on a $1K 4K monitor. Just my opinion.
  • Yaisuah
1440 and 1620 might seem close, but when you do the calculations, it's actually almost 1 million more pixels on the screen. The difference between 1080 and 1440 is 1.6 million, so I wouldn't see how 1 million could ever be considered negligible. Plus, my point was more about how companies jumped straight from 1080/1440 to 4K as the next goal, when 3K was the logical next step. On this monitor, 3K looks like the native resolution because the pixels are so small, and I really have to get up close to tell the difference between 3K and 4K. That might sound unbelievable, but it's true, and the performance makes it worth the very slight decrease in quality. This monitor has made me realize 4K is very unnecessary right now; just try 3K out on any 4K monitor and you'll see.
  • RedJaron
    I know the math, but I didn't say it was negligible. I said mfrs probably consider it too small a jump to clutter their product lines by adding it. Yes, 1M extra pixels sounds like a lot when 1080p only has a little over 2M pixels. However, 1440p has 3.7M, so using the number of pixels alone doesn't tell the whole story. Look at the pixel increase from a percentage standpoint. 2880x1620 only has ~25% more pixels than 2560x1440. That's not a lot of difference considering going from 1080p to 1440p gives you more than 75% more pixels. That's about the same as moving from 1600x900 to 1680x1050. Most significant resolution jumps give you at least 30% more pixels, and that's the low side.

I get what you're saying. 480, 720, and 1080 were all a factor of 1.5 from each other. Why break trend now? And I'm not saying the downscaled picture doesn't look good to you. Just remember that some people have sharper eyes than others, and what looks fine to one can be a fuzzy mess to another.
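The pixel counts traded back and forth above check out; as a quick sketch (percentages rounded to whole numbers):

```python
# Total pixel counts for the resolutions under discussion.
p1080 = 1920 * 1080   # 2,073,600
p1440 = 2560 * 1440   # 3,686,400
p1620 = 2880 * 1620   # 4,665,600 ("3K")
p2160 = 3840 * 2160   # 8,294,400 (4K UHD)

print(p1620 - p1440)                     # 979200 -> the "almost 1 million" figure
print(p1440 - p1080)                     # 1612800 -> the "1.6 million" figure
print(round((p1620 / p1440 - 1) * 100))  # 27 -> close to the ~25% estimate
print(round((p1440 / p1080 - 1) * 100))  # 78 -> the "more than 75%" jump
```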
  • JackNaylorPE
    I can't see buying a monitor where you pay a premium for the G-Sync hardware module but the monitor is too slow to use the ULMB feature that the module provides.

Until monitors arrive with DP 1.4 and panels fast enough to support it, I can't see investing $1,200 for the "long term" when the technology will be obsoleted in a matter of months.

And let's please make a very important distinction... while IPS panels do have better viewing angles and color than TN panels, very few of them are fast enough for gaming.
  • somebodyspecial
    Wake me when it's 16:10 :)
  • picture_perfect
    Quote:
    DisplayPort 1.3 (approved in 2014) supports 3840x2160 signals up to 120Hz. Why we have yet to see monitors and video cards with this spec is anyone’s guess.


    Quote:
    But even when graphics cards add DP 1.3, they’ll still need more processing power to enable those higher speeds


    I think you answered your own question.
    Graphics power is limited. It's true, it's true.
Otherwise people would be barfing much less in their VR headsets, right?
And I don't think 4K 60Hz monitors will be obsolete soon.
  • JackNaylorPE
Next thing we'll see will be 1.4... the GFX cards are already here. Asus already showed its DP 1.4 monitor at Computex; it should be on the shelves by January at the latest.

    http://www.144hzmonitors.com/monitors/asus-computex-2016-27-inch-4k-144hz-gaming-monitor/
  • spat55
Got my beautiful XF270HU (wish it was G-Sync) with 1440p/144Hz, which is enough until we start seeing UHD at 144Hz, preferably 21:9.
  • _MOJO_
I have the ASUS ROG PG279Q - a hefty investment of cash, but I can play my favorite titles at rates up to 165Hz. The new 1070 and 1080 will push those buttery-smooth frames to maximum with an IPS display, G-Sync, and a solid 4ms response time. I also use the monitor for art applications in tandem with a Wacom tablet for my workstation. I could not be happier with this versatile monitor that was almost half the price and twice the frame rate. 4K is adequate for multitasking or graphic design, but not synonymous with gaming at the moment, in my opinion. The GPU manufacturers are almost there technologically, but the market only caters to the top-tier financial demographic. A 10-series card and this monitor would set one back around $2,000?! To push 4K to 60fps? I'll pass.
  • rauf00
From an X34 standpoint (21:9 3440x1440 G-Sync IPS), I will get the next Predator ASAP when it gets 100Hz+ in 4K 21:9.
  • _MOJO_
rauf00 said:
    from X34 standpoint (21:9 3440x1440 g-sync IPS) i will get next Predator ASAP when it will get +100Hz in 4K 21:9


Agreed. I have the ROG Swift, which I use mostly for CS:GO and dogfights in War Thunder. I use my LG 3440x1440 ultrawide for graphic design, video production, and Sid Meier's Civilization V or Total War: Warhammer.

    I think the ultimate gaming setup would include a 144 Hz - 165 Hz Ultrawide 3440x1440 personally. I have a feeling we will see it within the next year *fingers crossed*

    4K at that frame rate is a ways off at the moment.
  • maddogcollins
It's a gaming monitor with 63ms lag, and they say that shouldn't be a problem even for skilled players? Paying over $1,000 for 63ms lag is a problem for me, a very average player.