AMD FreeSync Versus Nvidia G-Sync: Readers Choose
We set up camp at Newegg's Hybrid Center in City of Industry, California, to test FreeSync against G-Sync in a day-long experiment involving our readers.
Introduction
Nvidia's G-Sync variable refresh technology was introduced almost two years ago. AMD wasted no time letting us know it had an equivalent capability planned. Many months later, we got our first taste of what came to be known as FreeSync. And now enthusiasts are asking whether to put their money behind AMD or Nvidia. Rather than answering that largely subjective question in our own unilateral way, we took it to the community by setting up a blind taste test of sorts. The experiment was a great success, and we're ready to share our results.
It probably wouldn’t surprise you to learn that reviewing PC hardware is a fairly solitary occupation. Sure, you sit in on Skype calls and answer a seemingly endless stream of emails. But it becomes easy to forget that there are millions of enthusiasts around the world looking for guidance. They don’t want every new technology to receive a soft recommendation delicately worded to avoid hurting anyone’s feelings. Rather, they pledge allegiance to hard info and rely on technically-minded writers able to distill hundreds of data points into a conclusion: buy this or skip it.
Sometimes that’s an easy call to make. More often it’s not, even with benchmarks illuminating the path. But when a technology can’t be easily put through a suite of quantitative metrics... well, then you’re making a purchasing decision based on our good word. And no doubt that makes both of us just a little bit uncomfortable.
When Nvidia introduced its G-Sync technology, we published G-Sync Technology Preview: Quite Literally A Game Changer. In that piece, we covered technical details, features, configuration and early subjective impressions. But we were off the hook, in a way. Our sample was pre-production. Compatible monitors weren't on sale yet. And we had the word preview right in our title. Still, we knew Nvidia was onto something, even if we couldn't predict G-Sync's uptake.
The market is so much different today. Not only is G-Sync all over the place — available in 24-, 27- and 28-inch form factors, using TN or IPS panels — but screens with AMD’s FreeSync technology are starting to become more popular, too. We count seven compatible models on Newegg as of this writing, and Filippo, who co-authored the G-Sync launch with me, is working on his deep dive into FreeSync. Really, it’s time to take a stand.
Despite Nvidia’s staunch defense of its proprietary approach and AMD’s evangelizing of a standardized tack, both require a commitment to one company’s hardware or the other. There is no mix and match. AMD can't support G-Sync, and Nvidia won't capitulate on FreeSync. So which deserves your investment, G-Sync or FreeSync?
There is no easy way to benchmark the differences between them, making it perplexingly difficult to drench you in data. And we’ve been doing this long enough to know that any proclamation based on one individual’s experience will become a punching bag for the comments section. Clearly, this needed to become a group project.
Reader Comments
Wisecracker: With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.
NethJC: Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage. I still have a Sony CPD-G500 that is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.
AndrewJacksonZA: Thank you for the event and thank you for your write-up. Also, thank you for a great deal of transparency! :-)
Vlad Rose: So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.
jkrui01: As always on Tom's, Nvidia and Intel win. Why bother making these stupid tests? Just review Nvidia and Intel hardware only, or better still, just post the pics and a "buy now" link on the page.
loki1944 (replying to NethJC): G-Sync is way overhyped; I can't even tell the difference with my ROG Swift. Games definitely do not feel any smoother with G-Sync on. I'm willing to bet FreeSync is the same way.
Traciatim: It seems with these hardware combinations, and even through the mess of problems with the tests, that Nvidia's G-Sync provides an obviously better user experience. It's really unfortunate about the costs and other drawbacks (like being full-screen only). I would really like to see this tech applied to larger panels like a TV. It would be interesting considering films shot at 24, 29.97, 30, 48 and 60 FPS from different sources could all be displayed at their native frame rates with no judder or frame-pacing tools (like 120 Hz frame interpolation) that change the image.
cats_Paw "Our community members in attendance now know what they were playing on. "Reply
Thats when you lost my interest.
It is proven that if you give a person this information they will be affected by it, and that unfortunatelly defeats the whole purpose of using subjective opinions of test subjects to evaluate real life performance rather than scientific facts (as frames per second).
too bad since the article seemed to be very interesting. -
omgBlur (replying to loki1944): Speak for yourself; I invested in the ROG Swift and noticed the difference. This technology allows me to use higher graphics settings on a single card at 1440p, and while the fps bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it drops to as low as 40. No stuttering, no screen tearing.