When shopping for a PC monitor, one question PC gamers have to decide on is Nvidia G-Sync or AMD FreeSync. For serious gamers, one of these versions of adaptive sync is essential for a tear-free image. G-Sync monitors were made for PCs running Nvidia graphics cards and FreeSync monitors for AMD graphics users, but both promise a smooth image. With all other things equal, is one adaptive sync technology better than the other?
To compare performance between the two, we’ve brought in two nearly identical monitors from AOC. The AOC Agon AG241QG has G-Sync, and the AOC Agon AG241QX has FreeSync (it’s also G-Sync Compatible, meaning Nvidia recently approved it to support G-Sync as well). We’re pitting both monitors’ gaming experience, speed and accuracy against each other—both before and after calibration—to see if either G-Sync or FreeSync has a visible advantage for users.
A Brief History
For years, games with rendered graphics suffered from the dreaded frame tear artifact. Explained simply, it happens when the PC’s graphics card sends frames out of sync with the monitor’s refresh rate. For most PC monitors, that’s 60Hz, but even panels that could run at 144Hz or higher exhibited the problem. What was needed was a way to synchronize the monitor and graphics card, so frames would render only at the beginning of the refresh cycle. The simple solution was to create monitors with variable refresh rates. And in 2014, Nvidia did just that with G-Sync.
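The mechanics above can be sketched in a few lines of code. This is a toy model (the frame times, tolerance and 70 fps scenario are illustrative assumptions, not measurements): a fixed-refresh panel starts scanning out at regular intervals, so a frame delivered mid-scan gets split across two refresh cycles, while adaptive sync simply delays each refresh until a complete frame is ready.

```python
def count_tears(frame_times_ms, refresh_hz, tolerance_ms=0.5):
    """Count frames that arrive away from a refresh boundary on a
    fixed-rate panel. With adaptive sync, the panel refreshes when
    the frame arrives, so this count is always zero."""
    period = 1000.0 / refresh_hz
    tears = 0
    for t in frame_times_ms:
        offset = t % period
        # Distance to the nearest refresh boundary; a frame landing
        # mid-scan is shown split across two refresh cycles.
        if min(offset, period - offset) > tolerance_ms:
            tears += 1
    return tears

# A GPU delivering frames at an uneven ~70 fps against a 60Hz panel:
frames = [14.2, 29.1, 44.8, 58.3, 71.9]
print(count_tears(frames, 60))  # every frame misses a boundary: 5
```

With adaptive sync the question never arises, because the refresh boundary moves to meet the frame rather than the other way around.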
In its first implementation, G-Sync was a $200 kit that users were required to install into their monitor. This was not a sustainable solution, so manufacturers quickly moved to include the necessary components in purpose-built displays. That’s how G-Sync is packaged today.
However, AMD was not about to be upstaged, so it did something even more revolutionary. In 2015, it released FreeSync, the same technological concept but included as part of the DisplayPort spec. It was, and still is today, free of charge. All a manufacturer need do is support DisplayPort 1.2a and implement the feature through firmware. Now people with AMD graphics cards could enjoy adaptive sync without paying extra for a special monitor.
G-Sync vs. FreeSync: Features Comparison
G-Sync and FreeSync do the same thing in that they link the monitor and graphics card so that frames are only drawn at the beginning of the LCD panel’s refresh cycle. The refresh rate varies in real time, so there is never an incomplete frame on the screen. So, at the core, there is no difference between what a G-Sync user or FreeSync user sees. Either you have frame tears or you don’t. If adaptive sync is working, you won’t be able to tell the difference between G-Sync and FreeSync.
In 2015, we set up a blind test to determine if users could indeed see a difference. Though we admittedly had a few flaws in our methodology, participants chose G-Sync by a respectable margin. We chalked it up to FreeSync’s relative newness and G-Sync’s greater refresh rate operating range. But is the result the same in 2019?
Today, we have “G-Sync Ultimate” and “FreeSync 2 HDR” for supporting HDR content and its wider bandwidth requirements. Both are upgrades from the originals, delivering more features and better performance in a wider variety of situations than ever before.
FreeSync 2 HDR requires DisplayPort 1.4 for a full implementation. In addition to FreeSync, it supports 144Hz operation, HDR10, 4K gaming and the extended DCI-P3 color gamut. To get this same support with G-Sync, one simply needs to add a few bucks to the price tag and purchase a G-Sync equipped monitor sporting the same specs.
And here’s another thing to consider: some FreeSync monitors, labeled as “G-Sync Compatible,” can run G-Sync when connected to a GTX 10-series Nvidia graphics card or better, just without HDR. How is this accomplished? To learn, check out our article on How to Run G-Sync on a FreeSync Monitor.
Most manufacturers today market versions of the same monitor with G-Sync and FreeSync. The choice then comes down to feature set. Of course, if a user is already committed to AMD or Nvidia, the choice has been made for them. But if you’re building a new system and can go either way, is one better than the other?
G-Sync vs. FreeSync: Tests and Comparison
To explore this, we obtained two 24-inch AOC Agon monitors, the AG241QG (G-Sync) and AG241QX (FreeSync). Both are TN panels running at QHD resolution (2560x1440). The FreeSync monitor tops out at 144Hz, and the G-Sync one overclocks to 165Hz. They offer the same SDR-only signal compatibility. Neither has HDR or extended color.
In terms of ports, the QX has one DisplayPort and two HDMI 2.0 ports, while the QG makes do with one DisplayPort and one HDMI 1.4. The QX also offers one each of VGA and DVI ports, though those are not relevant to a modern gaming system. Both screens have four downstream USB 3.0 ports, plus one upstream, a headphone jack and built-in speakers. Aside from a one-watt difference in their speakers, they are nearly identical on paper.
Physically, the monitors are identical in size, shape and style. The QX has a nice little puck controller for its on-screen display (OSD), while the QG makes do with bezel keys. The QG also forgoes preset picture modes. Neither has a blur-reduction feature.
To see if either G-Sync or FreeSync offers gamers an advantage over the other, we ran our two monitors through an abbreviated suite of color and luminance tests to see if one is more accurate or has better contrast. Theoretically, they should look the same, since they use the same AU Optronics panel part.
Luminance, Color and Response Measurements
| | AG241QG (G-Sync) | AG241QX (FreeSync) |
|---|---|---|
| Maximum White | 453.3207 nits | 470.2134 nits |
| Maximum Black | 0.7158 nit | 0.512 nit |
| Calibrated Black | 0.4256 nit | 0.2243 nit |
| Default Grayscale Error | 4.90dE | 2.17dE |
| Calibrated Grayscale Error | 0.76dE | 1.58dE |
| Gamma Deviation from 2.2 | 2.72% | 3.18% |
| Default Color Error | 6.89dE | 2.79dE |
| Calibrated Color Error | 2.34dE | 2.32dE |
| Total Input Lag | 31ms | 33ms |
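The error figures above are in Delta E (dE), the standard measure of color difference; values below roughly 3dE are generally considered invisible to the naked eye. As an illustration (the Lab values here are hypothetical, and calibration software typically uses the newer dE2000 formula rather than this one), the original CIE76 version of the metric is simply the Euclidean distance between two colors in CIELAB space:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# A measured gray patch vs. its reference (hypothetical values);
# errors under ~3 dE are generally invisible to the naked eye.
reference = (50.0, 0.0, 0.0)   # target L*, a*, b*
measured  = (50.0, 1.2, -1.8)  # instrument readout
print(round(delta_e_76(reference, measured), 2))  # 2.16, below threshold
```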
We were surprised by the results, particularly in the brightness (maximum white) and contrast tests. Different teams clearly tuned the same panel part, because the black level and dynamic range scores are night and day, both at default settings and after calibration. The FreeSync AG241QX delivered performance typical of today’s TN panels, with over 918:1 contrast. The G-Sync AG241QG managed only about two-thirds of that, and the gap persisted after calibration. Viewed side by side, there is no question that the QX has the better image; the QG looks washed out by comparison.
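The contrast figures follow directly from the table: contrast ratio is simply maximum white luminance divided by black luminance.

```python
# Contrast ratio = white luminance / black luminance, using the
# default (uncalibrated) measurements from the table above.
qg_contrast = 453.3207 / 0.7158   # AG241QG (G-Sync)
qx_contrast = 470.2134 / 0.512    # AG241QX (FreeSync)

print(round(qg_contrast))  # 633 -> roughly 633:1
print(round(qx_contrast))  # 918 -> over 918:1
print(round(qg_contrast / qx_contrast, 2))  # 0.69, about two-thirds
```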
Gamma and color test results were much closer, though the G-Sync screen required calibration where the FreeSync one did not. With out-of-box scores of 2.17dE and 2.79dE for grayscale and color respectively, the QX had no great need for adjustment. The QG, on the other hand, needed tweaking and benefited greatly from it. You can see further detail in our CalMAN charts below.
In the speed tests, the difference came down to 144 versus 165Hz. Raw input lag between the two displays was equal at 25ms, while panel response differed by 2ms; no one will be able to perceive a difference that small. When it comes to control response, the two panels are identical.
Default Grayscale and Color Measurements
The CalMAN charts help illustrate the numbers we recorded. The AG241QG came out of the box with a warm look that’s deficient in blue. Gamma was quite far off the mark as well. This adds to the flat-looking image expected from a monitor with below-average contrast. There’s plenty of brightness, but highlight detail doesn’t have the pop that it should, and blacks look more like a medium gray.
The AG241QX, on the other hand, ships with a grayscale that’s fairly close to D65, which matches any content mastered with the Adobe RGB or sRGB color space. The monitor’s errors can’t be seen by the naked eye. Only at 100 percent brightness does it go over 3dE, and only barely so. Gamma is acceptable too.
Grayscale and gamma accuracy greatly affect a monitor’s color. The AG241QG has visible hue errors in red, green and cyan, thanks to a white point that strays toward yellow. Red is also slightly undersaturated, but that appears to be inherent in the panel because the AG241QX exhibits the same behavior.
Besides that undersaturated red, the QX is solid in its gamut accuracy. Its hue errors are minor, and all errors are below the visible threshold. This is thanks to its properly tuned grayscale and gamma.
Calibrated Grayscale and Color Measurements
Both monitors benefited from a few adjustments, but the AG241QG more so. That monitor now wins the grayscale contest with a 0.76dE score. However, the QX is still well within our standard of accuracy.
Color gamut errors are now a wash between the two with a difference of just 0.01dE.
If you own either of these monitors, feel free to try our calibration settings.
| | AG241QG (G-Sync) | AG241QX (FreeSync) |
|---|---|---|
| Brightness (200 nits) | 45 | 42 |
| Color Temp User | Red 50, Green 49, Blue 53 | Red 45, Green 48, Blue 50 |
After our testing, it’s apparent that the most glaring difference between these two monitors is their price. At this writing, the G-Sync-equipped AG241QG is selling for around $600 / £399.95, while the FreeSync-equipped AG241QX is $320 / £265.99. That’s a major gap, and it’s more than the usual $150 premium most G-Sync monitors carry. Given the image quality advantage enjoyed by the less expensive FreeSync monitor, we know which one we’d choose.
But wait! We haven’t played any games yet.
G-Sync vs. FreeSync: Gaming Tests
We’ll get right to the burning question: is there a visual difference between a G-Sync monitor and a FreeSync monitor? In our experiment, the two AOC screens performed the same in terms of frame rate, control response and the delivery of a tear-free experience. As we postulated earlier, the two technologies do the same thing, and with everything else being equal, users will not be able to tell a difference. However, we have already established an image quality divide between the two.
We played the same sequence of Call of Duty: WWII on both monitors. The AG241QX with its higher contrast was the clear winner. Shadow areas looked black, image depth was excellent and color looked more saturated. You may be saying, “but the color tests were a wash, right?” Yes, but the QX has almost twice the dynamic range of the QG. That has a much greater impact on how color looks than small differences in gamut accuracy.
Our AG241QG did hit 165 fps a few times in Call of Duty: WWII. We had no problem pegging the AG241QX at 144 fps when playing the same title using FreeSync. And we could tell no difference in smoothness or control response. Honestly, the only way one will benefit from super-high frame rates is with one of the 240Hz, 25-inch screens like the Alienware AW2518H or the AOC Agon AG251FZ, and even then, you’ll have to settle for FHD resolution.
We also tried playing a less-demanding game, Tomb Raider. For the FreeSync monitor, we used a PC running an AMD Radeon R9 285 graphics card, but it only reached 40 fps maximum. After reducing the detail from Ultimate to High, we hit a comfortable 60 fps. Again, there were no tears or other artifacts to spoil the fun. The AG241QX delivered great gameplay, even at lower frame rates. We could not see a difference when playing Tomb Raider on the 144Hz FreeSync monitor compared to its 165Hz G-Sync twin.
When we first conceived this experiment, we expected the result to swing in favor of the G-Sync monitor. G-Sync monitors usually have more features and perform a little better in our color and contrast tests. We hypothesized that while you’re paying a premium for G-Sync, those monitors perform slightly better in other areas related to image quality.
Our testing of the AOC Agon AG241QG and AG241QX proved that theory wrong. The FreeSync-based QX is superior in every way except maximum refresh rate. And why give up contrast and spend nearly $300 extra just to gain 21Hz that you can’t actually see while playing? Clearly, the AG241QX is the better choice.
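For context, that 21Hz gap translates into a sub-millisecond difference in the time each refresh takes, which is why it goes unnoticed in practice:

```python
# Time per refresh at each panel's maximum rate
t_144 = 1000 / 144   # ~6.94 ms per refresh
t_165 = 1000 / 165   # ~6.06 ms per refresh

print(round(t_144 - t_165, 2))  # 0.88 ms difference per frame
```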
But we cannot stress enough the importance of your graphics card. Regardless of whether you choose Nvidia or AMD, buy as much processing power as you can afford. Nothing impacts the gaming experience more than frame rate. Your favorite games won’t be enjoyable if you can only play at 30 or 40 fps, or at low detail levels. Ultimately, frame tears are less and less noticeable at higher frame rates. If you can sustain over 100 fps, tearing is almost impossible to see, even when G-Sync or FreeSync is not being used.
This is something we’ve observed time and time again during our reviews. The G-Sync test system we use has an Nvidia GeForce GTX 1080 Ti Founders Edition graphics card. It can sustain 100 fps in some titles at 4K resolution. At that point, adaptive sync plays a lesser role. We’ve connected FreeSync monitors to that PC and played titles like Far Cry 4 and Call of Duty: WWII without adaptive sync on, and they looked amazing. Speed and detail-rendering reach a new level when your graphics card can maintain 100 fps or more.
What does the future hold? This author would love to see a graphics card that supports both G-Sync and FreeSync. If there are monitors that can do this, why not graphics boards? Without engaging in too much speculation, I believe the future will bring ever-higher pixel counts (8K is just around the corner) and better implementations of HDR and extended color.
For now, though, we have an abundance of choices in gaming monitors, with the number of G-Sync and FreeSync displays growing every day. However, it seems that with everything else equal, if you’re Green Team / Red Team agnostic, AMD FreeSync will improve your gaming experience just as much as Nvidia G-Sync.
MORE: Best Gaming Monitors
MORE: How We Test Monitors
MORE: All Monitor Content