MSI Optix MAG341CQ Curved Ultra-Wide Gaming Monitor Review: A Price Breakthrough

Brightness and Contrast

To read about our monitor tests in-depth, check out Display Testing Explained: How We Test Monitors and TVs. We cover Brightness and Contrast testing on page two.


Uncalibrated – Maximum Backlight Level

For comparison, we brought in a group of VA panels. On the 16:9 side is MSI’s Optix MPG27CQ and Monoprice’s 33822. Ultra-wides include BenQ’s EX3501R, AOC’s Agon AG352UCG6 and the mega-wide Samsung C49HG90.

The MAG341CQ is rated at 250 nits, but our sample managed only 213.6 nits at maximum brightness. That’s not a deal-breaker, since it’s more than enough light for the average indoor space, though you’ll want to keep it away from bright, sunny windows. We’d still like to see at least 300 nits to provide some headroom for calibration, which can reduce peak output.

Black levels were VA-dark, with our review monitor earning first place. This is precisely the reason to buy a VA panel and why VA is our favorite technology for entertainment. That extra depth is easy to see and really enhances gaming and video content. The resulting contrast is mid-pack in our group and about average for VA screens as a whole.

After Calibration to 200 nits

After calibration, the MAG341CQ’s black level didn’t notably change, but peak white dropped to 163 nits with the brightness slider maxed. Again, this was enough light for most indoor environments, but there was no headroom left.
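That works out to roughly a 24 percent drop from the 213.6-nit uncalibrated maximum (163 ÷ 213.6 ≈ 0.76).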

We had to reduce contrast a bit to fix a gamma issue, and our changes to the RGB sliders cost us some dynamic range. 1,881.7:1 is still a respectable contrast ratio, but all the other screens here fared better, except for the AOC. In the ANSI contrast test, the MAG341CQ moved up a spot with a solid 1,660.8:1 score. While that wasn’t enough to win in this group, it’s significantly better than any IPS monitor short of a full-array backlight model.
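For readers who like to check the math, static contrast is simply peak white luminance divided by black luminance, so the calibrated figures imply a black level of roughly 0.087 nits:

\[
L_{\text{black}} = \frac{L_{\text{white}}}{\text{Contrast}} = \frac{163\ \text{nits}}{1881.7} \approx 0.087\ \text{nits}
\]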

MORE: Best Gaming Monitors

MORE: How We Test Monitors

MORE: All Monitor Content

13 comments
  • Energy96
    Why do these “high end” gaming monitors always seem to come with FreeSync instead of Nvidia’s G-Sync? Most people willing to shell out $450 and up on a monitor are going to be running Nvidia cards, which makes the feature useless. With an Nvidia card it isn’t even worth considering a monitor that doesn’t support G-Sync.
  • jason.gress
    Is there a VESA mount?
  • mitch074
    @energy96: because including G-sync requires a proprietary Nvidia scaler, which is very expensive, while Freesync is based off a standard and thus much cheaper. So, someone owning an Nvidia card would have to pay 600 bucks for a similarly featured screen.
  • stevehottois
    If you have an Nvidia card, FreeSync will still work.
  • Energy96
    This is completely false. FreeSync does not work with Nvidia cards, only Radeon. There is a sort of hack workaround, but it’s worse than just buying a G-Sync monitor.
  • Energy96
    mitch074 said:
    @energy96: because including G-sync requires a proprietary Nvidia scaler, which is very expensive, while Freesync is based off a standard and thus much cheaper. So, someone owning an Nvidia card would have to pay 600 bucks for a similarly featured screen.

    I know this. My point was most people who buy a Radeon card are doing it because they are on a budget. It’s unlikely they will have the funds for a high end gaming monitor that is $450+. That’s more than they likely would have spent on the Radeon card.

    The majority of people dropping that much or more on a gaming monitor will be running Nvidia cards. I know it adds cost, but if you are running Nvidia, a FreeSync monitor is out of the question. FreeSync seems pointless in any monitor that is much over $300.
  • cryoburner
    Energy96 said:
    Why do these “high end” gaming monitors always seem to come with FreeSync instead of Nvidia’s G-Sync? Most people willing to shell out $450 and up on a monitor are going to be running Nvidia cards, which makes the feature useless. With an Nvidia card it isn’t even worth considering a monitor that doesn’t support G-Sync.

    The real question you should be asking is why a supposedly "high end" graphics card doesn't support a standard like VESA Adaptive-Sync, otherwise branded as FreeSync. You should be complaining in Nvidia's graphics card reviews that they still don't support the open standard for adaptive sync, not that a monitor doesn't support Nvidia's proprietary version of the technology that requires special hardware from Nvidia to do pretty much the same thing. It's not just AMD that will be supporting VESA Adaptive-Sync either, as Intel representatives have stated on at least a couple of occasions that they intend to include support for it in the future, likely with their upcoming graphics cards. Microsoft's Xbox consoles also support FreeSync, albeit with a less-standard implementation over HDMI.

    Nvidia doesn't support it because they want to sell you an overpriced chipset as a part of your monitor, and they want you arbitrarily locked into their hardware ecosystem once competition heats up at the high-end. I suspect that even they may support it eventually though. They're just holding out so that they can price-gouge their customers as long as they can.
  • Energy96
    I can agree with this, but currently it’s the only choice we have. I’m not downgrading to a Radeon card. It would be nice if some of these monitors at least offered versions with it. I know a few do but the selection is very limited.
  • mitch074
    Energy96 said:
    I can agree with this, but currently it’s the only choice we have. I’m not downgrading to a Radeon card. It would be nice if some of these monitors at least offered versions with it. I know a few do but the selection is very limited.

    A 1440p monitor and a Vega 56 go well together - if you have a 1080/1080ti/2070/2080/2080ti then yes, it's a "downgrade", but if you run a triple screen off a Radeon card already, then it's not.
    I think my RX480 could run Dirt Rally on that thing quite well, for example - not everybody buys these things for a 144fps+ shooter.
  • Dosflores
    Energy96 said:
    FreeSync seems pointless in any monitor that is much over $300.


    FreeSync isn't expensive to implement, unlike G-Sync, so why should companies not implement it? You seem to think that the only reason there can ever be to buy a new monitor is adaptive sync, which isn't true. People may want a bigger screen, higher resolution, higher refresh rate, better contrast… If they want G-Sync, they can spend extra on it. If they don't think it is worth it, they can save money. FreeSync doesn't harm anyone.

    Buying a $450 monitor seems pointless to me after having spent $1200+ on a graphics card. The kind of monitor that Nvidia thinks is a good match for it is something like this:

    https://www.tomshardware.com/reviews/asus-rog-swift-pg27u,5804.html
  • glass330
    I purchased this monitor somehow from MicroCenter in October for $380. Also, you can 3D print a VESA mount; there are several plans out there already on the maker sites.

    This monitor is certainly not the best on paper, but are all those other "buzz word" features worth 2x or 3x the price for a non-professional home user? Do you need a $1400 display for a $19 Steam game? FreeSync works great and productivity is through the roof. Just don't hit the fullscreen button ever again. Gaming is fun and no stutters! Vega 64 user here. I really like this monitor, and this is the first purchase I have made in several years without any buyer's remorse! Seriously!
  • helder3araujo
    First off, Nvidia announced they will now support FreeSync; they lost and conceded defeat to AMD. So there's no need to get overpriced G-Sync monitors anymore or do workarounds. As of tomorrow, Nvidia drivers will support 12 FreeSync monitors. Not only that, they will let you use a tool to test and see if yours is compatible with their drivers. Over time they will increase support for more monitors. Nothing wrong with buying this MSI.
  • kaywould76
    This review states that this monitor does NOT support LFC; however, AMD's FreeSync webpage states that it does. Who is wrong?