Asus MG278Q 27-inch QHD FreeSync Gaming Monitor Review

Recently we checked out Asus' MG279Q, a stunning-looking 27-inch IPS gaming monitor with FreeSync, 144Hz and a premium price. Today we're reviewing a cheaper alternative -- the TN-based MG278Q.

Brightness And Contrast

To read about our monitor tests in-depth, please check out Display Testing Explained: How We Test Monitors and TVs. Brightness and Contrast testing is covered on page two.

Uncalibrated – Maximum Backlight Level

Today's comparison group is a mix of IPS and TN displays, all with FreeSync capability except the Acer XB270HU, which features G-Sync. Remaining screens are Asus' MG279Q, Acer's XG270HU and XR341CK, and the value-priced 24-inch Nixeus NX-VUE24A we reviewed last month.

All the displays max out comfortably over 300cd/m2 except the Nixeus. The MG278Q takes the win here, edging out the Acer XG270HU by a small margin. This is more light output than is needed for indoor use, but if you play games in a brightly-lit space, the extra punch could be useful.

The bright backlight contributes to a higher-than-average black level, and as you can see, it's quite a bit greater than the next-place finisher's.

One of the things you'll sacrifice for the MG278Q's lower price is contrast. This isn't a deal-breaker given the excellent gamma tracking we measured but other displays have deeper blacks. Asus' own MG279Q takes the win with over 1100:1. In our opinion that would be worth paying more for.

Uncalibrated – Minimum Backlight Level

The MG278Q's backlight bottoms out at a very low 26.7435cd/m2. We wouldn't want to play games with a picture that dim; 50cd/m2 is our ideal level for dark-room use. You can reach that level by raising the brightness control from zero to 6.

The minimum black level measurement doesn't have quite the impact when the backlight goes this low. It's a good number but not entirely useful in practice.

Contrast takes a small three-percent hit, which is well within the realm of consistency we look for. Any backlight setting will yield around 800:1 contrast.
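As a quick sketch of how these numbers relate, contrast ratio is simply white luminance divided by black luminance. The black level used below is an assumed, illustrative figure, not one of our measurements; only the 26.7435cd/m2 minimum white level comes from our testing.

```python
def contrast_ratio(white_cd_m2: float, black_cd_m2: float) -> float:
    """Static contrast ratio: white luminance divided by black luminance."""
    return white_cd_m2 / black_cd_m2

# With the measured 26.7435 cd/m2 minimum white level, an assumed black
# level of roughly 0.033 cd/m2 lands near the ~800:1 ratio discussed above.
print(round(contrast_ratio(26.7435, 0.033)))
```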

After Calibration to 200cd/m2

Calibration neither helps nor harms the MG278Q's black level. It's still a little higher than the competition. In this group, IPS seems to be the better choice, at least in the area of contrast. It does come at a higher price though.

Contrast after calibration stays about the same as before adjustment. We were able to increase the contrast slider one click, which helped add depth without introducing any clipping. When considering cost versus performance, the Nixeus easily leads the pack here. Its cost is less than half that of the 27-inch IPS monitors.

ANSI Contrast Ratio

It's rare that we see ANSI contrast exceed the calibrated result, but the MG278Q has achieved that distinction. It's still in last place, but the panel itself is obviously of good quality. Later you'll see that screen uniformity is excellent. Even though it doesn't have best-in-class black levels, this is a well-made product.

  • Rishi100
    I am flummoxed why even at this stage, the displays are being churned out with HDMI 1.4 and Displayport 1.2 standards. They should have been HDMI 2 with HDCP 2.2 and DP1.3.
    Reply
  • TeamColeINC
    From what I've seen in reviews so far, the TN model is better anyway. But I would think ASUS fixed all the issues that people were reporting with the IPS model...
    Reply
  • 10tacle
    The last time I tried an ASUS 1440p panel was with their PB278Q (60Hz IPS). The first one I got had terrible backlight bleed on the left side and two dead pixels right in the middle. Couldn't live with that. So I returned it and got another. The second one was sealed up on backlight bleed (good enough for the typical PLS/IPS anyway) but had four dead pixels, two of which were close together on the center-right side and the other two in different spots but in the general viewing area. Again, couldn't live with it and returned it for my money back.

    I hope their quality control has improved, because for a $500+ monitor, any dead pixels and manufacturing tolerance defects are unacceptable. I paid a little more for a Dell U2713HM and have been happy ever since. I'll be in the market for a 1440p G-sync next year as an SLI 970 owner and would not rule out ASUS if they have improved their quality control. One thing I am not clear on is if you can select custom Free-Sync or G-Sync frequencies to better match your GPU power beyond factory monitor Hz settings (90Hz, 120Hz, 144Hz).
    Reply
  • alextheblue
    At speeds below 40fps, you'll need to turn on V-Sync to prevent tearing, though by that point stutter is the bigger problem. It's better to either reduce resolution or turn down the detail level to keep frame rates above 40.

    Uh, what about turning on LFC? LFC will work on monitors with a good variable refresh range such as this Asus unit. I'd like to see that tested for those cases where you dip in frames occasionally.

    One thing I am not clear on is if you can select custom Free-Sync or G-Sync frequencies to better match your GPU power beyond factory monitor Hz settings (90Hz, 120Hz, 144Hz).

    Wait, what? As long as you're within the variable refresh rate range, you're good to go. If you want to save power and reduce the framerate on a low-demand (old) game something like FRTC should work if there's no in-game cap.
    Reply
  • 10tacle
    17162561 said:
    One thing I am not clear on is if you can select custom Free-Sync or G-Sync frequencies to better match your GPU power beyond factory monitor Hz settings (90Hz, 120Hz, 144Hz).

    Wait, what? As long as you're within the variable refresh rate range, you're good to go. If you want to save power and reduce the framerate on a low-demand (old) game something like FRTC should work if there's no in-game cap.

    No, what I'm talking about are complaints (and these were from G-sync users) that they couldn't set a custom refresh rate, something like 100Hz or 110Hz, in the Nvidia control panel on a G-sync monitor to better match their GPU's FPS output and cap it there. Maybe something's changed, or they didn't know what they were talking about (or doing).

    I don't have one so I can't comment. I overclock my 1440p monitors to 75Hz (Dell) and 90Hz (Crossover) and cap frames accordingly, but just have never been clear on what that meant to a G-sync monitor that advertises 120Hz/144Hz capability.
    Reply
  • M for Moartea

    No, what I'm talking about are complaints (and these were from G-sync users) that they couldn't set a custom refresh rate, something like 100Hz or 110Hz, in the Nvidia control panel on a G-sync monitor to better match their GPU's FPS output and cap it there. Maybe something's changed, or they didn't know what they were talking about (or doing).

    I don't have one so I can't comment. I overclock my 1440p monitors to 75Hz (Dell) and 90Hz (Crossover) and cap frames accordingly, but just have never been clear on what that meant to a G-sync monitor that advertises 120Hz/144Hz capability.

    I'm not sure I fully understand your concern but, if I may, I'll give it a try.
    As a user of Asus PG278Q (with G-sync) for a year now, I can tell you this much:
    G-sync, much like FreeSync, works within a frame rate range that depends on the monitor, not on the adaptive sync technology behind it; in my case it's 30-144Hz. Within that range, the refresh rate is variable and depends on how many FPS your GPU can push.
    This is where the similarities between the two stop, because outside of that range the two technologies behave differently. Below the bottom of the range, 30 FPS in my case, the G-sync module automatically displays the same frame twice, making the refresh rate appear double what it is and the gameplay feel smoother. At the other end, the G-sync module automatically caps your frame rate at the maximum refresh rate of your monitor (144 in my case).

    That being said, and having nothing to do with these adaptive sync technologies, Radeons do have a frame rate target control feature in the Catalyst control center (or whatever it's called nowadays) for power-saving reasons, a feature that you don't have as an Nvidia user.

    Now, regarding your concern: a custom refresh rate simply defeats the purpose of having an adaptive sync technology, and, outside of power savings, I fail to see how a custom refresh rate target would help, since G-sync (and FreeSync for that matter) already caps the refresh rate of your monitor "to better match their GPU power FPS".
    If you prefer a custom refresh rate, you can choose to simply disable G-sync and set your (G-sync enabled) monitor to a fixed refresh rate (in my case I have the following options: 24, 60, 85, 100, 120, 144 Hz).
    I hope that was helpful.
    Reply
  • Verrin
    At speeds below 40fps, you'll need to turn on V-Sync to prevent tearing, though by that point stutter is the bigger problem.

    This technically isn't true anymore, if you are using the Crimson driver and have a panel whose maximum refresh rate is at least 2.5 times the minimum (e.g. 144Hz panels).

    AMD refers to this new tech as Low Frame Rate Compensation (LFC), and it effectively does the same as Nvidia's solution (although by different means) by duplicating frames to maintain the refresh rate above a minimum refresh value (such as 40Hz). I've been playing around with it on my 390X and my Acer XG270HU and it's been working great, no stutter or hitching, just the usual expected loss in fluidity from going that low in the first place.
    Reply
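For what it's worth, the eligibility rule and the frame-duplication idea Verrin describes can be sketched roughly like this. The refresh ranges are illustrative, and real driver behavior is more involved than this simple loop:

```python
def lfc_supported(min_hz: float, max_hz: float) -> bool:
    # AMD's stated requirement: max refresh at least 2.5x the minimum.
    return max_hz >= 2.5 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Repeat each frame until the presented rate enters the panel's
    variable refresh range -- the core idea behind LFC."""
    rate = fps
    while rate < min_hz:
        rate += fps  # show every frame one additional time
    return min(rate, max_hz)

print(lfc_supported(40, 144))          # a 40-144Hz panel qualifies
print(effective_refresh(30, 40, 144))  # 30fps is presented at 60Hz
```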
  • picture_perfect
    Tearing happens when fps are higher than hz (multiple frames per refresh appear as horizontal tears in time)
    Stutter/Judder happens when fps are lower than hz (multiple refreshes per frame appear as double vision judder)

    To sync the two:
    Freesync/G-sync adjusts a monitor's hz to match fps.
    V-sync adjusts fps to match a monitor's hz.
    Reply
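picture_perfect's rules of thumb can be stated as code. This mirrors the simplification in the comment above; in practice, unsynchronized presentation can tear in other cases too:

```python
def artifact_without_sync(fps: float, hz: float) -> str:
    """Dominant artifact on a fixed-refresh display with no sync enabled."""
    if fps > hz:
        return "tearing"  # several frames land inside one refresh
    if fps < hz:
        return "stutter"  # some frames persist across several refreshes
    return "none"

print(artifact_without_sync(200, 144))  # tearing
print(artifact_without_sync(45, 144))   # stutter
```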
  • Andrew Pacely
    This is cool and all... but where's that new 34 incher they revealed last September?
    Reply
  • jdwii
    Threw up a little when I heard TN. Little to no excuse anymore not to own an IPS monitor, since latency is as low as 4ms and I bet 99.9% of you guys can't tell the difference
    Reply