AMD FreeSync Versus Nvidia G-Sync: Readers Choose
We set up camp at Newegg's Hybrid Center in City of Industry, California, to test FreeSync against G-Sync in a day-long experiment involving our readers.
Summing The Analysis
We started down the path of comparing AMD’s FreeSync technology to Nvidia’s G-Sync in the hopes that a community-based, hands-on approach would provide additional insight or perhaps lead us to some unexpected conclusion. In some ways it has. But we naturally had some hypotheses going into our event, and those largely proved true, too.
Let’s get the big question out of the way: which is better, G-Sync or FreeSync? As both technologies stand right now, the G-Sync ecosystem is more mature. From compatible graphics cards to G-Sync-capable displays and variable refresh ranges, Nvidia has a leg up. It also has an advantage when frame rates drop below the variable refresh range. Fortunately, our experiment never took us to that point, so it didn’t become an issue we needed to address in the analysis.
Then again, you’ll also pay $150 more for the G-Sync monitor we tested today—a premium that exceeds what most of our respondents claimed they’d be willing to spend for the experience they ultimately preferred, and some of those folks even picked AMD’s less expensive hardware combination as their favorite. [Editor's note: As previously noted, the pricing difference between monitors has increased since we published this article. As always, even that is subject to further change.]
Given the technical similarity of what FreeSync and G-Sync set out to achieve, all of our sources suggest that, so long as both stay within their variable refresh ranges, you should get fundamentally identical results from them. On paper, that is. Differences do crop up outside of those upper and lower bounds, where FreeSync and G-Sync ask you to pick between v-sync on or off, and where G-Sync is able to double (or more) screen refreshes when frame rates drop below 30 FPS. We chose to leave v-sync off on the FreeSync-capable machines after surveying our readers’ habits and acknowledging AMD’s default driver behavior. However, it would have been interesting to compare results with v-sync on, particularly in Borderlands.
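To make that below-the-range behavior concrete, here is a minimal sketch of the frame-multiplication idea in Python. The 30Hz floor, the 144Hz ceiling, and the effective_refresh helper are illustrative assumptions for a hypothetical panel, not vendor code for either technology.

```python
# Minimal sketch of frame multiplication below a panel's variable refresh
# floor, in the spirit of what G-Sync does under ~30 FPS. The 30-144Hz
# window and the function name are assumptions for illustration only.

PANEL_MIN_HZ = 30.0   # assumed lower bound of the variable refresh window
PANEL_MAX_HZ = 144.0  # assumed upper bound of the variable refresh window

def effective_refresh(frame_rate_fps: float) -> float:
    """Map a game's frame rate to the refresh rate the panel would run at."""
    if frame_rate_fps <= 0:
        raise ValueError("frame rate must be positive")
    if frame_rate_fps >= PANEL_MAX_HZ:
        # Above the window the panel caps out; whether you then see tearing
        # or added latency depends on the v-sync setting (simplified here).
        return PANEL_MAX_HZ
    if frame_rate_fps >= PANEL_MIN_HZ:
        # Inside the window, refresh tracks the frame rate one-to-one.
        return frame_rate_fps
    # Below the window, repeat each frame 2x, 3x, ... until the resulting
    # refresh rate lands back inside the window.
    multiplier = 2
    while frame_rate_fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    return frame_rate_fps * multiplier

# 25 FPS sits below the 30Hz floor, so each frame is shown twice and the
# panel refreshes at 50Hz; a FreeSync setup without such a mechanism
# instead falls back to plain v-sync on or off.
assert effective_refresh(25.0) == 50.0
assert effective_refresh(60.0) == 60.0
```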
What about that Asus monitor we chose to represent FreeSync? While we can’t claim to know why the company chose a scaler limited to 90Hz on its 144Hz MG279Q, we’ve heard that it’s selling really well. Our event participants certainly loved it (along with Acer’s XB270HU). A $600 price point certainly isn’t cheap. However, when you’re talking about a 144Hz IPS display, gamers are bound to find a little more room in their budgets. In addition to the enthusiasts buying this screen for its FreeSync support, we suspect that those with high-end graphics subsystems from both GPU vendors are simply running it at 144Hz for fast-paced action on a beautiful-looking panel.
Comments
Wisecracker: With your wacky variables and subsequent weak excuses, explanations, and conclusions, this is not your best work.
-
NethJC: Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 that is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.
-
AndrewJacksonZA: Thank you for the event and thank you for your write-up. Also, thank you for a great deal of transparency! :-)
-
Vlad Rose: So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.
-
jkrui01: As always on Tom's, Nvidia and Intel win. Why bother making these stupid tests? Just review Nvidia and Intel hardware only, or better still, just post the pics and a "buy now" link on the page.
-
loki1944 (quoting NethJC): "Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage. I still have a Sony CPD-G500 that is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen."
G-Sync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with G-Sync on. I'm willing to bet FreeSync is the same way.
-
Traciatim: It seems, with these hardware combinations and even through the mess of problems with the tests, that Nvidia's G-Sync provides an obviously better user experience. It's really unfortunate about the costs and other drawbacks (like working only in full screen).
I would really like to see this tech applied to larger panels, like TVs. It would be interesting, considering film and video shot at 24, 29.97, 30, 48, and 60 FPS from different sources could all be displayed at their native frame rates with no judder and no frame-pacing tools (like 120Hz frame interpolation) that change the image.
-
cats_Paw "Our community members in attendance now know what they were playing on. "Reply
Thats when you lost my interest.
It is proven that if you give a person this information they will be affected by it, and that unfortunatelly defeats the whole purpose of using subjective opinions of test subjects to evaluate real life performance rather than scientific facts (as frames per second).
too bad since the article seemed to be very interesting. -
omgBlur (quoting loki1944): "G-Sync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with G-Sync on. I'm willing to bet FreeSync is the same way."
Speak for yourself; I invested in the ROG Swift and noticed the difference. This technology allows me to put out higher graphics settings on a single card at 1440p, and while the FPS bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it bounces as low as 40. No stuttering, no screen tearing.