
Acer Predator XB321HK 32-inch Ultra HD G-Sync Monitor Review

Today we’re looking at Acer’s latest flagship gaming monitor, the Predator XB321HK. Sporting a 32-inch IPS screen, G-Sync, and premium build quality, it looks like just the thing for a cost-no-object gaming rig.

OSD Setup And Calibration

We’d love to see a controller like BenQ’s S-switch on these high-end Predator screens. But Acer’s traditional bezel-key arrangement works well with controls that click firmly and can move you through the OSD with ease.

OSD Tour

Pressing any key brings up a quick menu. The big G stands for Game Mode, with three settings memories you can program. The needle icon accesses the overdrive option. Number three controls volume, the fourth selects the input, and the fifth brings up the full OSD.

eColor Management offers five picture modes, four of which are fixed. Changing any setting, even brightness, will switch the XB321HK to User mode. You are then free to calibrate to your preference.

Brightness modulates the backlight through a range of 55-340cd/m². Contrast comes set to 50 by default, but we found that caused gamma and color accuracy issues. We’ll tell you more about that below.

Blue Light reduces the blue primary to make the white point warmer and thereby reduce eye fatigue. Dark Boost raises low end gamma to make shadow detail more visible. Adaptive Contrast can extend perceived contrast but it will clip information at the extremes of the luminance range.

Gamma has two presets: 1.8 and 2.2. The latter is the default, but the former seems to be the better choice; check out our results on pages five and six for more information. Color Temp has three presets plus a user mode. sRGB Mode appears to have no effect on accuracy, which is pretty good out of the box provided you make a couple of minor changes we’ll tell you about below.

Saturate works like a color control to modulate overall saturation. It’s best left centered as you see above. 6-axis color gives you a luminance slider for each color. Again, there is no need to adjust this unless you’re seeking something other than Rec.709/sRGB.

The RGB sliders start maxed so you can only reduce them to dial in grayscale tracking. We only had to make a single adjustment to blue during our tests.

Here are the CMS sliders. They’re labeled color but actually adjust luminance. If you find yourself lost after too much tweaking, a handy Reset function is provided at the bottom.

The OSD comes in 15 languages and can be left up for two minutes max. It appears in the lower right corner of the screen and cannot be moved.

Refresh Rate Num is an fps counter that appears in the top-right corner when turned on. The numbers are fairly large, so you might only want them up for testing purposes. Transparency refers to the OSD screens and has four levels.

Game Mode contains three settings memories that can be configured by the user. After making changes, save your options in one of the Settings fields; they are then easily recalled from the quick menu. Aim Point offers three different aiming reticles, which are a great help to first-person-shooter novices.

Here are the remaining settings. There’s another input selector, a DTS audio toggle, overdrive (Off, Normal, Extreme), Wide Mode (aspect ratio options), Power LED brightness, Deep Sleep (less current draw in sleep mode) and a factory reset. Power-off USB Charge leaves the ports active so you can charge devices when the XB321HK is turned off.

The signal info screen includes G-Sync status which is handier than opening Nvidia Control Panel. You’re also told the current input, resolution, refresh rate and which Game Mode preset you’ve selected.

Calibration

The XB321HK is fairly accurate out of the box, but there are two options we strongly recommend addressing: contrast and gamma. Contrast is set too high by default, which caused measurable and visible problems with gamma and color saturation. Drop it to 40 to fix this. Also, the default gamma setting of 2.2 makes the image a bit dark and murky, and has a negative impact on color. Changing it to 1.8 gave us almost perfect gamut results plus a brighter and more vivid picture.
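
To illustrate why the lower preset looks brighter, here’s a quick sketch of the standard display gamma relationship (our illustration, not Acer’s firmware): relative luminance follows the normalized input level raised to the gamma exponent, so a lower exponent lifts mid-tones.

```python
# Illustrative sketch of a display gamma curve (not Acer's actual processing):
# relative luminance = (normalized input) ** gamma. A lower exponent raises
# mid-tones, which is why the 1.8 preset looks brighter than 2.2 at the same
# backlight level.
def relative_luminance(level, gamma, bits=8):
    v = level / (2**bits - 1)  # normalize 0..255 to 0..1
    return v ** gamma

mid_gray = 128  # 8-bit mid-gray
for g in (2.2, 1.8):
    print(f"gamma {g}: {relative_luminance(mid_gray, g):.3f}")
```

At mid-gray, gamma 1.8 yields roughly 0.29 relative luminance versus roughly 0.22 for 2.2, about a 30% brighter mid-tone, consistent with the less murky picture we observed.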

Acer Predator XB321HK Calibration Settings

Brightness (200cd/m²): 46
Brightness (120cd/m²): 20
Brightness (100cd/m²): 14
Brightness (80cd/m²): 7
Contrast: 40
Gamma: 1.8
Color Temp User: Red 100, Green 100, Blue 99
  • pjc6281
    So for 1100-1200 you get 60hz. That is a bit alarming.
    Reply
  • Bartendalot
    The note about 4K@60Hz being obsolete soon is a valid one and makes the purchase price even more difficult to swallow.

    I'd argue that my 1440p@144hz is a more future-proof investment.
    Reply
  • Yaisuah
    I have this monitor and think it's great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4k that looks great and runs great, and I think it would be considered 3k. Try adding 2880 x 1620 to your resolutions and see how it looks on any 4k monitor. I run Windows and most less intensive games at this resolution and constantly get 60fps with a 970 (around 30fps at 4k). You also don't have to mess with Windows scaling on a 32in monitor. After seeing how great 3k looks and runs, I really don't know why everyone immediately jumped to 4k.
    Reply
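
As a sanity check on the frame rates quoted above, a quick back-of-the-envelope pixel count (our arithmetic, not benchmark data) shows why 2880x1620 runs so much faster than full 4K:

```python
# Pixel counts for the resolutions discussed in the comment. 2880x1620 pushes
# just over half the pixels of full 4K, which lines up with the commenter's
# roughly doubled frame rates at "3k" versus 4K.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "3k (2880x1620)": 2880 * 1620,
    "4K (3840x2160)": 3840 * 2160,
}

four_k = resolutions["4K (3840x2160)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / four_k:.0%} of 4K)")
```

2880x1620 renders exactly 56.25% of the pixels of 3840x2160 (it is 3/4 scale in each dimension, and 0.75² = 0.5625).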
  • cknobman
    Nvidia G-Sync = high price
    Reply
  • mellis
    I am still going to wait before getting a 4K monitor, since there is still no practical solution for 4K gaming. In a couple more years, hopefully, 4K monitors will be cheap and midrange GPUs will be able to support gaming on them. I think trying to invest in 4K gaming now is a waste of money. Sticking with 1080p for now.
    Reply
  • truerock
    A 4K@120Hz G-Sync video monitor based PC rig under $4,000 is probably 2 to 3 years away.
    Reply
  • RedJaron
    18346905 said:
    I have this monitor and think it's great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440 and 4k that looks great and runs great, and I think it would be considered 3k. Try adding 2880 x 1620 to your resolutions and see how it looks on any 4k monitor. I run Windows and most less intensive games at this resolution and constantly get 60fps with a 970 (around 30fps at 4k). You also don't have to mess with Windows scaling on a 32in monitor. After seeing how great 3k looks and runs, I really don't know why everyone immediately jumped to 4k.
    I would hazard a few guesses. First would be that 2880x1620 is so close to 2560x1440 that no manufacturer wants to complicate product lines like that.

    Second, 2160 is the least common multiple of 720 and 1080, meaning it's the lowest resolution that's a perfect integer multiple of both. So with proper upscaling, a 720 or 1080 source picture can be displayed reasonably well on a 4K display. These panels are made for TVs as well as computer monitors, and the majority of TV signal (at least in the US) is still in either 720p or 1080p. Upscaling 1080 to 1620 is the same as upscaling 720 to 1080 (they're both a factor of 150%). Upscaling by non-integer factors means you need a lot of pixel interpolation and anti-aliasing. To me, this looks very fuzzy (I bought a 720p TV over a 1080p TV years ago because playing 720p PS3 games and 720p cable TV on a 1080p display looked horrible to me). So it may be that the powers that be decided on the 4K resolution so that people could adopt the new panels and still get decent picture quality with older video sources (at least until, or if, they get upgraded). If so, I can agree with that.
    Reply
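
The scaling arithmetic in the comment above is easy to verify; a short sketch (our illustration, not from the review):

```python
# Verify the scaling claims above: 2160 is the least common multiple of 720
# and 1080, so a 4K panel can upscale both by clean integer factors, while a
# 1620-line "3k" panel would need a fractional 1.5x step from a 1080 source.
import math
from fractions import Fraction

print(math.lcm(720, 1080))  # 2160: lowest height divisible by both

for src, panel in [(720, 2160), (1080, 2160), (1080, 1620)]:
    f = Fraction(panel, src)
    kind = "integer" if f.denominator == 1 else "fractional"
    print(f"{src} -> {panel}: {float(f):g}x ({kind})")
```

Integer factors (3x, 2x) allow clean pixel duplication; the fractional 1.5x step is where the interpolation fuzziness described above comes from.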
  • michalt
    I have one and have not regretted my purchase for a second. I tend to keep monitors for a long time (my Dell 30 inch displays have been with me for a decade). When looked at over that time period, it's not that expensive for something I'll be staring at all day every day.
    Reply
  • photonboy
    It would be nice to offer a GLOBAL FPS LOCK to stay in asynchronous mode at all times regardless how high the FPS gets.
    Reply
  • photonboy
    Update: AMD has this, not sure where it is for NVidia or if the GLOBAL FPS LOCK is easy to do.
    Reply