MSI Optix MAG341CQ Curved Ultra-Wide Gaming Monitor Review: A Price Breakthrough
Viewing Angles, Uniformity, Response and Lag
VA panels are not known for stellar off-axis image quality, but the MAG341CQ looks better than most. From the sides, color shifted slightly towards red with a light output reduction of about 40 percent. Detail stayed solid, so users gaming across multiple screens will be able to spot an enemy at a glance no matter which angle they view from. Looking down from the top, detail lost definition and the color turned somewhat reddish. While an IPS screen would perform better in this test, the Optix beats many of the VA panels we’ve photographed.
Screen Uniformity
To learn how we measure screen uniformity, click here.
The screen uniformity of our MAG341CQ sample was a bit disappointing. We could see the black field grow brighter towards the bottom of the screen, where the backlight is situated. 15.01 percent isn’t a bad score, but it is higher than that of the other monitors in our comparison group. Note that the results here will vary among different MAG341CQ samples. In our case, the hot zones transitioned smoothly into the dark zones, so there were no obvious blotches or glowing edges, and the issue was hard to see in most real-world content.
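For readers curious what a figure like 15.01 percent represents, here is a minimal sketch of how a nine-zone uniformity deviation could be computed from black-field luminance readings. This is an illustration of the general idea only, not Tom's Hardware's exact procedure, and the zone values below are hypothetical:

```python
# Hypothetical nine-zone black-field luminance readings in cd/m^2,
# ordered left-to-right, top-to-bottom. The bottom row is brighter,
# mimicking the backlight behavior described above.
zones = [
    0.0450, 0.0448, 0.0452,  # top row
    0.0455, 0.0451, 0.0458,  # middle row (index 4 is the center)
    0.0490, 0.0495, 0.0502,  # bottom row
]

center = zones[4]
# Report the worst zone's deviation from the center zone as a percentage.
worst_delta = max(abs(z - center) for z in zones)
print(f"Uniformity deviation: {worst_delta / center * 100:.2f}%")
```

The lower the resulting percentage, the more even the backlight appears in dark content.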
Pixel Response & Input Lag
Click here to read up on our pixel response and input lag testing procedures.
The MAG341CQ’s most impressive trait is its speed. Though it runs at just 100Hz, it managed to beat the Monoprice 33822, which hits 144Hz, and it traded punches with the 120Hz AG352UCG6. Motion blur was nearly non-existent, especially when in-game framerates stayed near 100fps.
Input lag is also a non-issue at just 37ms. While users might gravitate towards 120Hz and 144Hz monitors, this one makes the most of its 100Hz rating. That lower refresh ceiling is one of the reasons for its low price, and clearly there is no great sacrifice in performance.
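To put those numbers in perspective, here is a quick back-of-the-envelope calculation (our own illustration, not part of the formal test procedure) relating refresh rate to frame time and expressing the measured 37ms of lag in frames:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time each frame occupies the screen at a given refresh rate."""
    return 1000.0 / refresh_hz

# Frame times for the refresh rates discussed above.
for hz in (100, 120, 144):
    print(f"{hz}Hz -> {frame_time_ms(hz):.1f}ms per frame")

# The MAG341CQ's measured 37ms total lag, expressed in 100Hz frames.
lag_ms = 37.0
print(f"37ms of lag at 100Hz is roughly {lag_ms / frame_time_ms(100):.1f} frames")
```

At 100Hz each frame lasts 10ms, so the whole signal chain adds under four frames of delay, which helps explain why the gap to 120Hz and 144Hz screens is harder to feel in practice than the spec sheet suggests.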
MORE: Best Gaming Monitors
MORE: How We Test Monitors
MORE: All Monitor Content
Christian Eberle is a Contributing Editor for Tom's Hardware US. He's a veteran reviewer of A/V equipment, specializing in monitors. Christian began his obsession with tech when he built his first PC in 1991, a 286 running DOS 3.0 at a blazing 12MHz. In 2006, he undertook training from the Imaging Science Foundation in video calibration and testing and thus started a passion for precise imaging that persists to this day. He is also a professional musician with a degree from the New England Conservatory as a classical bassoonist which he used to good effect as a performer with the West Point Army Band from 1987 to 2013. He enjoys watching movies and listening to high-end audio in his custom-built home theater and can be seen riding trails near his home on a race-ready ICE VTX recumbent trike. Christian enjoys the endless summer in Florida where he lives with his wife and Chihuahua and plays with orchestras around the state.
Comments
Energy96: Why do these “high end” gaming monitors always seem to come with FreeSync instead of Nvidia G-Sync? Most people willing to shell out $450 and up on a monitor are going to be running Nvidia cards, which makes the feature useless. With an Nvidia card, it isn’t even worth considering a monitor that doesn’t support G-Sync.
mitch074: @Energy96: because including G-Sync requires a proprietary Nvidia scaler, which is very expensive, while FreeSync is based on a standard and thus much cheaper. So someone owning an Nvidia card would have to pay 600 bucks for a similarly featured screen.
Energy96: This is completely false. FreeSync does not work with Nvidia cards, only Radeon. There is a sort of hacky workaround, but it’s worse than just buying a G-Sync monitor.
Energy96:
mitch074 said: “because including G-Sync requires a proprietary Nvidia scaler, which is very expensive, while FreeSync is based on a standard and thus much cheaper. So someone owning an Nvidia card would have to pay 600 bucks for a similarly featured screen.”
I know this. My point was that most people who buy a Radeon card do so because they are on a budget. It’s unlikely they’ll have the funds for a high-end gaming monitor at $450+; that’s more than they likely spent on the Radeon card itself.
The majority of people dropping that much or more on a gaming monitor will be running Nvidia cards. I know it adds cost, but if you’re running Nvidia, a FreeSync monitor is out of the question. FreeSync seems pointless in any monitor much over $300.
cryoburner:
Energy96 said: “Why do these ‘high end’ gaming monitors always seem to come with FreeSync instead of Nvidia G-Sync? …”
The real question you should be asking is why a supposedly “high end” graphics card doesn’t support a standard like VESA Adaptive-Sync, otherwise branded as FreeSync. You should be complaining on Nvidia’s graphics card reviews that they still don’t support the open standard for adaptive sync, not that a monitor doesn’t support Nvidia’s proprietary version of the technology, which requires special hardware from Nvidia to do pretty much the same thing. It’s not just AMD that will be supporting VESA Adaptive-Sync, either; Intel representatives have stated on at least a couple of occasions that they intend to include support for it, likely with their upcoming graphics cards. Microsoft’s Xbox consoles also support FreeSync, albeit through a less-standard implementation over HDMI.
Nvidia doesn’t support it because they want to sell you an overpriced chipset as part of your monitor, and they want you arbitrarily locked into their hardware ecosystem once competition heats up at the high end. I suspect that even they may support it eventually, though; they’re just holding out so they can price-gouge their customers as long as they can.
Energy96: I can agree with this, but currently it’s the only choice we have. I’m not downgrading to a Radeon card. It would be nice if some of these monitors at least offered G-Sync versions; I know a few do, but the selection is very limited.
mitch074:
Energy96 said: “I can agree with this, but currently it’s the only choice we have. I’m not downgrading to a Radeon card. …”
A 1440p monitor and a Vega 56 go well together. If you have a 1080/1080 Ti/2070/2080/2080 Ti, then yes, it’s a “downgrade,” but if you already run a triple-screen setup off a Radeon card, it’s not.
I think my RX 480 could run Dirt Rally on that thing quite well, for example; not everybody buys these things for a 144fps+ shooter.
Dosflores:
Energy96 said: “FreeSync seems pointless in any monitor that is much over $300.”
FreeSync isn’t expensive to implement, unlike G-Sync, so why should companies not implement it? You seem to think that the only reason to buy a new monitor is adaptive sync, which isn’t true. People may want a bigger screen, a higher resolution, a higher refresh rate, or better contrast. If they want G-Sync, they can spend extra on it. If they don’t think it’s worth it, they can save the money. FreeSync doesn’t harm anyone.
Buying a $450 monitor seems pointless to me after having spent $1,200+ on a graphics card. The kind of monitor Nvidia thinks is a good match for such a card is something like this:
https://www.tomshardware.com/reviews/asus-rog-swift-pg27u,5804.html