Acer Predator XB321HK 32-inch Ultra HD G-Sync Monitor Review

Conclusion

Some will quickly dismiss the Predator XB321HK because of its high price. But how much more does it really cost than any other 32-inch Ultra HD monitor? And remember that the G-Sync premium, regardless of the screen it’s installed in, accounts for around $200 of that cost.

But some expensive things in this world are actually worth the asking price. True, the number of buyers willing to spend $1200 or more on a monitor is relatively small, and not every premium product can be called a good value. In this case, though, we think the XB321HK is worth every penny.

Our test results play only a small part in this opinion. Out-of-box accuracy is good, and many gamers will enjoy the XB321HK’s picture quality without ever opening the OSD. However, we strongly recommend setting Contrast to 40 and Gamma to 1.8 to unlock the monitor’s full potential. Those two changes take image quality from good to great and put the Predator within striking distance of many professional displays we’ve tested.

The thing that impressed us most, however, was the sense of depth and immersion the XB321HK delivers. Games take on a completely different feel on this jumbo screen. And even though the framerate maxes out at 60fps, we never felt cheated on smoothness or clarity. G-Sync is certainly a big factor in that equation, and the XB complemented our GTX Titan X gaming system perfectly. As with any Ultra HD monitor, you’ll need to tailor each game’s detail levels to achieve playable framerates; even with lots of graphics horsepower on hand, Far Cry 4 on Ultra probably won’t be much fun to play.

So the decision becomes this: Do you go for the flagship 32-inch Ultra HD monitor with G-Sync, or stick with QHD and a higher refresh rate? Interface speeds should increase soon, since DisplayPort 1.3 (approved in 2014) supports 3840x2160 signals at up to 120Hz. Why we have yet to see monitors and video cards with this spec is anyone’s guess. One could argue that the Ultra HD monitors available now may soon be obsolete. But even when graphics cards add DP 1.3, they’ll still need more processing power to drive those higher refresh rates.
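As a back-of-the-envelope check (a sketch, not official spec math: it assumes 24-bit color and ignores blanking overhead, which adds several percent in practice), the DP 1.3 claim does work out:

```python
# Rough check: can DP 1.3 carry 3840x2160 at 120Hz?
# Assumes 24 bits per pixel and ignores blanking overhead,
# so real signals need somewhat more than this raw payload figure.

def payload_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video payload in Gbit/s, blanking ignored."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# DP 1.3 (HBR3): 4 lanes x 8.1 Gbit/s raw; 8b/10b encoding leaves 80% usable
dp13_usable = 4 * 8.1 * 0.8  # ~25.92 Gbit/s

uhd120 = payload_gbps(3840, 2160, 120)
print(f"4K@120Hz needs ~{uhd120:.1f} Gbit/s; DP 1.3 provides ~{dp13_usable:.2f} Gbit/s")
```

The payload comes to roughly 23.9 Gbit/s against about 25.9 Gbit/s usable, so 4K at 120Hz fits, if only just.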

Given these facts, we think a premium display like the Predator XB321HK makes a great choice for a high-end gaming rig or even a nice desktop productivity system. Gamers will appreciate its low latency, G-Sync adaptive refresh and stunning clarity. Business and graphics users will no doubt find many uses for its vivid and accurate color rendering.

It’s not always easy to reward expensive displays no matter how good they are, but the XB321HK represents a good value in our eyes. For that reason, we’re giving it the Tom’s Hardware Editor Recommended Award.

MORE: Best Computer Monitors

MORE: How To Choose A Monitor

MORE: Display Calibration 101

MORE: The Science Behind Tuning Your Monitor

MORE: All Monitor Content


24 comments
  • pjc6281
So for $1100-1200 you get 60Hz. That is a bit alarming.
  • Bartendalot
The note about 4K@60Hz being obsolete soon is a valid one and makes the purchase price even more difficult to swallow.

    I'd argue that my 1440p@144hz is a more future-proof investment.
  • Yaisuah
I have this monitor and think it's great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4k that looks great and runs great and I think it would be considered 3k. Try adding 2880 x 1620 to your resolutions and see how it looks on any 4k monitor. I run Windows and most less intensive games at this resolution and constantly get 60fps with a 970 (around 30fps at 4k). You also don't have to mess with Windows scaling on a 32in monitor. After seeing how great 3k looks and runs, I really don't know why everyone immediately jumped to 4k.
  • cknobman
    Nvidia G-Sync = high price
  • mellis
I am still going to wait before getting a 4K monitor, since there is still no practical solution for 4K gaming. In a couple more years, hopefully, 4K monitors will be cheap and midrange GPUs will be able to support gaming on them. I think trying to invest in 4K gaming now is a waste of money. Sticking with 1080p for now.
  • truerock
    A 4K@120Hz G-Sync video monitor based PC rig under $4,000 is probably 2 to 3 years away.
  • RedJaron
Yaisuah said:
I have this monitor and think it's great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4k that looks great and runs great and I think it would be considered 3k. Try adding 2880 x 1620 to your resolutions and see how it looks on any 4k monitor. I run Windows and most less intensive games at this resolution and constantly get 60fps with a 970 (around 30fps at 4k). You also don't have to mess with Windows scaling on a 32in monitor. After seeing how great 3k looks and runs, I really don't know why everyone immediately jumped to 4k.
    I would hazard a few guesses. First would be that 2880x1620 is so close to 2560x1440 that no manufacturer wants to complicate product lines like that.

Second, 2160 is the least common multiple of 720 and 1080, meaning it's the lowest resolution that's a perfect integer multiple of both. So with proper upscaling, a 720 or 1080 source picture can be displayed reasonably well on a 4K display. These panels are made for TVs as well as computer monitors, and the majority of TV signal (at least in the US) is still in either 720p or 1080p. Upscaling 1080 to 1620 is the same as upscaling 720 to 1080 (they're both a factor of 150%). Upscaling by non-integer factors means you need a lot of pixel interpolation and anti-aliasing. To me, this looks very fuzzy (I bought a 720p TV over a 1080p TV years ago because playing 720p PS3 games and 720p cable TV on a 1080p display looked horrible to me). So it may be that the powers that be decided on the 4K resolution so that people could adopt the new panels and still get decent picture quality with older video sources (at least until, or if, they get upgraded). If so, I can agree with that.
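The scaling arithmetic in the comment above checks out; a quick sketch verifying it (plain Python, no special libraries):

```python
from math import gcd

# 2160 is the least common multiple of 720 and 1080, so a UHD panel
# can upscale either source by a whole-number factor.
lcm_720_1080 = 720 * 1080 // gcd(720, 1080)
print(lcm_720_1080)   # least common multiple of the two line counts

print(2160 / 720)     # 720p -> 2160p: each pixel maps to a 3x3 block
print(2160 / 1080)    # 1080p -> 2160p: each pixel maps to a 2x2 block

# The 1.5x steps are non-integer, which is why they need interpolation
print(1080 / 720, 1620 / 1080)
```

Both legacy formats divide 2160 evenly, while the 150% steps do not divide evenly into anything below it, which is the crux of the argument.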
  • michalt
    I have one and have not regretted my purchase for a second. I tend to keep monitors for a long time (my Dell 30 inch displays have been with me for a decade). When looked at over that time period, it's not that expensive for something I'll be staring at all day every day.
  • photonboy
It would be nice to offer a GLOBAL FPS LOCK to stay in adaptive sync mode at all times, regardless of how high the FPS gets.
  • photonboy
    Update: AMD has this, not sure where it is for NVidia or if the GLOBAL FPS LOCK is easy to do.
  • gulvastr
    RivaTuner has a great global FPS lock option. I set mine at 95 FPS on my X34 and stay in Gsync at all times.
  • Sam Hain
    4K at 60Hz for $1k+ does not "compute" and is not a one size fits all for all users; speaking for gaming here. Graphics applications and photo/video editing, it's all yours!

A 1440p 120Hz/144Hz gaming monitor of your particular flavor (IPS, TN, AMVA/Flat, Curved/Single, Multiple, etc; You get the idear folks) is a more logical and performance-oriented choice for gaming for the foreseeable future, and has been since the first 4K display debuted as a "gaming" monitor.

1440p gaming monitors traditionally come in at less than 4K "gaming" monitors. BETTER performance, no questions asked, and less taxing on your GPU(s). For those with higher-end GPUs, crank up those settings to Ultra and Max, ESPECIALLY with the GTX 1080 now in action and the soon-to-debut MONSTER, Pascal Titan X.

    No matter how you slice it, SLI it (your rig)... 4K at 60Hz is still 4K stuck at 60Hz. No thank you.
  • Sam Hain
Yaisuah said:
I have this monitor and think it's great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4k that looks great and runs great and I think it would be considered 3k. Try adding 2880 x 1620 to your resolutions and see how it looks on any 4k monitor. I run Windows and most less intensive games at this resolution and constantly get 60fps with a 970 (around 30fps at 4k). You also don't have to mess with Windows scaling on a 32in monitor. After seeing how great 3k looks and runs, I really don't know why everyone immediately jumped to 4k.

Does not make sense to "downscale" from 4K to any resolution... I see what you are saying but, like RedJaron stated, it's so close to 1440p that it's negligible.

The optimal way for someone to use DSR is with a lower-rez monitor going up... otherwise, money ill spent on a $1K 4K monitor. Just my opinion.
  • Yaisuah
1440 and 1620 might seem close, but when you do the calculations, it's actually almost 1 million more pixels on the screen. The difference between 1080 and 1440 is 1.6 million, so I don't see how 1 million could ever be considered negligible. Plus my point was more about how companies jumped straight from 1080/1440 to 4k as the next goal, when 3k was the logical next step. On this monitor, 3k res looks like the native resolution because the pixels are so small, and I really have to get up close to tell the difference between 3k and 4k. That might sound unbelievable, but it's true, and the performance makes it worth the very slight decrease in quality. This monitor has made me realize 4k is very unnecessary right now; just try 3k out on any 4k and you'll see.
  • RedJaron
    I know the math, but I didn't say it was negligible. I said mfrs probably consider it too small a jump to clutter their product lines by adding it. Yes, 1M extra pixels sounds like a lot when 1080p only has a little over 2M pixels. However, 1440p has 3.7M, so using the number of pixels alone doesn't tell the whole story. Look at the pixel increase from a percentage standpoint. 2880x1620 only has ~25% more pixels than 2560x1440. That's not a lot of difference considering going from 1080p to 1440p gives you more than 75% more pixels. That's about the same as moving from 1600x900 to 1680x1050. Most significant resolution jumps give you at least 30% more pixels, and that's the low side.

I get what you're saying. 480, 720, and 1080 were all a factor of 1.5 from each other. Why break the trend now? And I'm not saying the downscaled picture doesn't look good to you. Just remember that some people have sharper eyes than others, and what looks fine to one can be a fuzzy mess to another.
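To put concrete numbers on the percentage argument in this exchange, a quick sketch (the helper names are just for illustration); the 1440p-to-1620p jump works out to roughly 27%, in the same ballpark as the ~25% figure quoted:

```python
def pixels(width, height):
    return width * height

def pct_more(lower, higher):
    """Percentage more pixels in `higher` than in `lower`."""
    return (pixels(*higher) - pixels(*lower)) / pixels(*lower) * 100

print(pixels(2880, 1620) - pixels(2560, 1440))          # absolute gap in pixels
print(round(pct_more((2560, 1440), (2880, 1620)), 1))   # 1440p -> 2880x1620
print(round(pct_more((1920, 1080), (2560, 1440)), 1))   # 1080p -> 1440p
print(round(pct_more((1600, 900), (1680, 1050)), 1))    # 1600x900 -> 1680x1050
```

Both commenters are right on the raw numbers: the absolute gap is just under a million pixels, but the relative jump is far smaller than 1080p-to-1440p.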
  • JackNaylorPE
    I can't see buying a monitor where you pay a premium for the G-Sync hardware module but the monitor is too slow to use the ULMB feature that the module provides.

Until monitors arrive with DP 1.4 and panels fast enough to support it, I can't see investing $1200 for the "long term" when the technology will be obsoleted in a matter of months.

And let's please make a very important distinction... while IPS panels do have better viewing angles and color than TN panels, very few of them are fast enough for gaming.
  • somebodyspecial
    Wake me when it's 16:10 :)
  • picture_perfect
    Quote:
    DisplayPort 1.3 (approved in 2014) supports 3840x2160 signals up to 120Hz. Why we have yet to see monitors and video cards with this spec is anyone’s guess.


    Quote:
    But even when graphics cards add DP 1.3, they’ll still need more processing power to enable those higher speeds


    I think you answered your own question.
    Graphics power is limited. It's true, it's true.
    Otherwise ppl would be barfing much less in their VR headsets, right?
    And I don't think 4K 60hz monitors will be obsolete soon.
  • JackNaylorPE
Next thing we'll see is 1.4... the GFX cards are already here. Asus already showed its DP 1.4 monitor at Computex... should be on the shelves by January at the latest.

    http://www.144hzmonitors.com/monitors/asus-computex-2016-27-inch-4k-144hz-gaming-monitor/
  • spat55
Got my beautiful XF270HU (wish it was G-Sync) with 1440p/144Hz, which is enough until we start seeing UHD at 144Hz, preferably 21:9.
  • _MOJO_
I have the ASUS ROG PG279Q - a hefty investment of cash, but I can play my favorite titles at rates up to 165 Hz. The new 1070 and 1080 will push those buttery smooth frames to maximum with an IPS display, G-Sync, and a solid 4 ms response time. I also use the monitor for art applications in tandem with a Wacom tablet for my workstation. I could not be happier with this versatile monitor that was almost half the price and twice the frame rate. 4K is adequate for multitasking or graphic design, but not synonymous with gaming at the moment, in my opinion. The GPU manufacturers are almost there technologically, but the market only caters to the top-tier financial demographic. A 10-series card and this monitor would set one back around $2000?! To push 4K to 60 fps - I'll pass.
  • rauf00
From an X34 standpoint (21:9 3440x1440 G-Sync IPS), I will get the next Predator ASAP when it gets 100Hz+ at 4K 21:9.
  • _MOJO_
rauf00 said:
From an X34 standpoint (21:9 3440x1440 G-Sync IPS), I will get the next Predator ASAP when it gets 100Hz+ at 4K 21:9.


Agreed. I have the ROG Swift, which I use mostly for CSGO and dogfights in War Thunder. I use my LG 3440x1440 ultrawide for graphic design, video production, and Sid Meier's Civilization V or Total War: Warhammer.

    I think the ultimate gaming setup would include a 144 Hz - 165 Hz Ultrawide 3440x1440 personally. I have a feeling we will see it within the next year *fingers crossed*

    4K at that frame rate is a ways off at the moment.
  • maddogcollins
It's a gaming monitor with 63ms lag, and they say that shouldn't be a problem even for skilled players? Paying over $1,000 for 63ms lag is a problem for me, a very average player.