The majority of monitors, especially newer models, display excellent grayscale tracking (even at stock settings). It’s important that the color of white be consistently neutral at all light levels from darkest to brightest. Grayscale performance impacts color accuracy with regard to the secondary colors: cyan, magenta, and yellow. Since computer monitors typically have no color or tint adjustment, accurate grayscale is key.

User is the default color-temperature mode, and it’s pretty close to correct. The RGB Balance chart shows a slight tendency toward red, but all of the errors are under three Delta E and therefore very hard to see. Even so, we observed an improvement after calibration.
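For context, Delta E expresses color error as a distance: values under three are generally hard to see, and values under one are commonly cited as invisible. As a rough illustration only (our instruments use a newer Delta E formula, but the idea is the same), the original CIE76 metric is plain Euclidean distance in L*a*b* space; the gray values below are hypothetical, not our measurements:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* values."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# A perfectly neutral 50% gray target vs. a hypothetical reading shifted toward red
target = (50.0, 0.0, 0.0)      # L*, a*, b*: neutral gray
measured = (50.0, 1.5, -0.5)   # small positive a* = reddish tint
print(round(delta_e_76(target, measured), 2))  # 1.58
```

An error like this 1.58 sits comfortably under the visibility threshold, which is why small red shifts in the RGB Balance chart are so hard to spot.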

Adjusting the RGB controls gives us a better result that is now under two Delta E across the board. You give up a tiny bit of contrast, but we think it’s worth it.
Here is our comparison group:

A Delta E measurement of 2.42 puts the PG278Q near the top in out-of-box grayscale performance. In fact, its result exceeds a few professional-class monitors we’ve tested. We feel most gamers would be satisfied with the Swift in its uncalibrated state.

A little adjustment brings the average error down to 1.27 Delta E. The improvement in image quality is noticeable to our eyes. Once you’re accustomed to a calibrated monitor, a display with even a tiny error doesn’t look quite right. Today though, the grayscale prize goes to BenQ’s XL2720Z with its stellar numbers.
Gamma Response
Gamma is the measurement of luminance levels at every step in the brightness range from 0 to 100 percent. This is important because poor gamma can either crush detail at various points or wash it out, making the entire picture appear flat and dull. Correct gamma produces a more three-dimensional image, with a greater sense of depth and realism. Meanwhile, incorrect gamma can negatively affect image quality, even in monitors with high contrast ratios.
In the gamma charts below, the yellow line represents 2.2, which is the most widely used standard for television, film, and computer graphics production. The closer the white measurement trace comes to 2.2, the better.
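For the curious, each point on such a trace can be estimated from a luminance reading at a given gray step. A minimal sketch, using made-up numbers rather than our actual meter readings:

```python
import math

def effective_gamma(level, luminance, white=100.0):
    """Estimate gamma at one gray step from its measured luminance.
    level: stimulus from 0-1 (e.g., 0.5 for 50% gray); white: peak output in nits."""
    return math.log(luminance / white) / math.log(level)

# Hypothetical reading: the 50% gray patch measures 22 nits on a 100-nit screen
print(round(effective_gamma(0.5, 22.0), 2))  # 2.18 -- slightly under 2.2, i.e., a touch light
```

A value below 2.2 means that step is brighter (lighter) than the standard calls for; above 2.2 means it's darker.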

The gamma tracking runs just a tiny bit light at an average value of 2.15. It’s only slightly off our standard and, again, most users won’t notice. We do wish there were a gamma adjustment, however. Some games benefit from a slightly lighter or darker tone to help bring out detail. Fortunately, many developers add a software-based slider to compensate.
Here is our comparison group again:

The difference between the highest (2.29) and lowest (2.07) values is pretty small, indicating good tracking. There aren’t any significant dips or peaks to spoil the result.
We calculate gamma deviation by simply expressing the difference from 2.2 as a percentage.
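That arithmetic is simple enough to show directly (using this screen’s 2.15 average):

```python
def gamma_deviation(avg_gamma, target=2.2):
    """Deviation from the 2.2 standard, expressed as a percentage."""
    return abs(avg_gamma - target) / target * 100

print(round(gamma_deviation(2.15), 2))  # 2.27 (percent)
```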

A 2.15 average puts the PG278Q in fifth place among today’s group. Overall, we’re perfectly satisfied with its gamma performance. Our only beef is the lack of additional presets.
- Asus ROG Swift PG278Q G-Sync Monitor Review
- Gaming Features: G-Sync, Fast Refresh, ULMB and GamePlus
- Packaging, Physical Layout, And Accessories
- OSD Setup and Calibration of the PG278Q
- Measurement And Calibration Methodology: How We Test
- Results: Brightness and Contrast
- Results: Grayscale Tracking and Gamma Response
- Results: Color Gamut and Performance
- Results: Viewing Angles and Uniformity
- Results: Pixel Response, Input Lag and Blur Reduction
- ROG Swift PG278Q, A Display Technology Revolution
But one thing I do hope for is a 144 Hz G-Sync IPS monitor; ever since I got my new Asus MX239H, the IPS panel has made a huge difference in games.
But besides that, it is a glorious monitor: the resolution is great, it runs at 144 Hz, and of course G-Sync makes it wonderful.
But really, $800? I know that it is one of the few G-Sync-equipped monitors, but you can buy a 4K monitor for $650!
Pretty unlikely. ULMB requires a static refresh rate, because it has to strobe the monitor at a constant rate. GSYNC would mean that it would have to strobe in time with each frame, at a variable rate. You would introduce a lag time on the strobing if you tried to do this, since it would be at a variable rate instead of a constant one.
Off to read it now! lol
There have been plenty of reviews for this monitor; just Google it. And they have all been great reviews... makes me want it even more
Personally, I'm sick of the crappy motion resolution in LCDs. It's not so bad in some games, but it's nigh-unbearable in certain games. My next monitor/TV WILL have Strobing-Backlights since it's the best way to get rid of motion blur.
However, maybe someone can help me out on this: I don't understand why monitors that feature such motion-enhancing technologies seem so nitpicky about the framerates and refresh rates they're used with. I'm saying this because more and more TVs are coming out with such strobing-backlight technology, and I'm pretty sure those don't require an absolutely steady framerate to work.
For example, if I were to connect a console to this ASUS Swift monitor, could I use ULMB in 120hz mode with a 30fps game?
It's not the framerate they're being picky about, it's the refresh rate. The light has to strobe in time with when the next frame is being introduced. When the refresh rate is constant (i.e., locked at 80, 100, or 120 Hz), the strobe knows exactly when the next frame will be displayed. You're asking the display to strobe the backlight at will, whenever the GPU can put out a frame. You're essentially asking the GPU to not only handshake with GSYNC on when to render a frame, but to trigger the backlight to strobe then too. The tricky part is that's another layer where you'd have to minimize response time (from the GPU's frame being rendered to the backlight being strobed), since the refresh rate is no longer constant (it now depends on your game's framerate, which is hardly ever anywhere near "constant").
How awful would your strobing backlight look if it came a few ms after your frame rendered? That'd probably ruin all of the blur-reduction qualities you want from it. At best, you could make an algorithm that strobes at the *average* framerate you're outputting, since framerate can rise and dip so quickly, but that could still cause a lot of problems.
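To put rough numbers on that timing problem, here's a toy sketch (the per-frame render times are made up, not measured from any game):

```python
# Fixed refresh: every strobe lands a constant interval apart.
fixed_interval_ms = 1000 / 120          # 8.33 ms between strobes at 120 Hz

# Made-up per-frame render times (ms) from a game with a fluctuating framerate.
frame_times_ms = [9.1, 14.8, 11.2, 25.0, 10.4]

# Strobing "with the GPU" would make strobe intervals exactly this irregular:
jitter_ms = max(frame_times_ms) - min(frame_times_ms)

print(round(fixed_interval_ms, 2))  # 8.33
print(round(jitter_ms, 1))          # 15.9
```

With a fixed 120 Hz refresh the strobe period never varies; following the GPU instead would let strobe-to-strobe spacing swing by nearly 16 ms in this example, which is exactly the kind of irregularity that would undermine the blur reduction.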
For example, if I were to connect a console to this ASUS Swift monitor, could I use ULMB in 120hz mode with a 30fps game?
I'm definitely not an expert on ULMB or G-Sync, but the Blur Busters website says "LightBoost motion blur elimination is not noticeable at 60 frames per second." So even if you could get a console hooked up to the Asus Swift, I don't think you would be able to notice any difference unless you get 85+ fps.
Off to read it now! lol
There have been plenty reviews for this monitor just Google it. And they have all been great reviews...makes me want it even more
Oh I know that, it's just that I was waiting for Tom's Hardware specifically to do a review since I like their reviews!
It's not the framerate they're being picky about, it's the refresh rate.
Ok, let's forget consoles then for a second, because I didn't think of the fact that they can't output at 120hz. If, for example, I had my PC hooked to the Swift monitor, set to 120Hz, and that the game I play has a fluctuating framerate going anywhere from 30fps to 90fps. Would I be able to use ULMB since the monitor is running at 120Hz? Despite the framerate being all over the place, and not ever at 120fps?
Thanks for your reply btw.
For example, if I were to connect a console to this ASUS Swift monitor, could I use ULMB in 120hz mode with a 30fps game?
I'm definitely not an expert on ULMB or Gsync but the blurbusters website says "LightBoost motion blur elimination is not noticeable at 60 frames per second." So even if you could get a console hooked up to the Asus Swift I don't think you would be able to notice any difference unless you get 85+ fps.
But like I said, more and more TVs are being released with a 'black-frame insertion' option, and from reviews it gets rid of motion blur very well, even for movies, which play at 24 fps.
Ok, let's forget consoles then for a second, because I didn't think of the fact that they can't output at 120hz. If, for example, I had my PC hooked to the Swift monitor, set to 120Hz, and that the game I play has a fluctuating framerate going anywhere from 30fps to 90fps. Would I be able to use ULMB since the monitor is running at 120Hz? Despite the framerate being all over the place, and not ever at 120fps?
Thanks for your reply btw.
No problem. I enjoy discussing the topic.
Yes. You would. Because with ULMB on, the REFRESH RATE stays constant despite your varying frame rate. The monitor (in regular or in ULMB mode, with G-Sync off) will only refresh the frame every 8.33 ms (1 / 120 Hz), regardless of your framerate. This has nothing to do with the 1 ms response time; that's where your keyboard or mouse input lag comes in. This is also what causes horizontal tearing, which is what GSYNC aims to remove. If your FRAME rate is much higher, or much lower, than your monitor's REFRESH rate, you will observe lots of tearing. ULMB does not reduce tearing, just motion blur.
You don't have to hit 120 fps to refresh at 120 Hz, but you get the most benefit out of your monitor that way. So yes, you can play with ULMB at any framerate, but you *will* notice stutter if you're playing in the 90s and then drop into the 30s. This is what VSYNC traditionally tries to remove, but it introduces input lag as a side effect.
Gsync removes the stutter and the tearing with virtually no input lag. It makes your monitor refresh at the same rate as your framerate. So if you set your monitor to 144 Hz and turn GSYNC on, then suddenly your *max* framerate becomes 144 FPS (you can't update faster than the panel), and the refresh rate of the monitor (when it displays new frames from the GPU) varies with the framerate of the game, anywhere from 35 FPS up to 144 FPS. If you drop below 30 FPS, the GSYNC module switches to traditional VSYNC.
GSYNC can be toggled on and off from the Nvidia Control Panel. This is how you can switch between GSYNC or ULMB depending on what type of game you want to play.
Some of their lower-end products have some quality issues I hear, but you see that in Dell, HP, Acer... It's not exactly a new trend.