AMD FreeSync Versus Nvidia G-Sync: Readers Choose

The Bottom Line

AMD is at a disadvantage because it doesn’t have the variable refresh range coverage currently enjoyed by Nvidia. If you spring for the QHD/FreeSync/IPS display we tested with and run a game like Borderlands on a 390X, your frame rates are going to fall outside of that panel’s 35-90Hz range almost exclusively, even if you dial in the most taxing settings possible. Conversely, the QHD/FreeSync/TN screen we could have chosen instead would likely have run into issues at the quality settings we used in The Witcher, which averaged in the 40s but also dipped lower.

Theoretical similarities between G-Sync and FreeSync aside, we also cannot ignore the fact that a number of our event participants chose the Nvidia solution in games where both FreeSync and G-Sync should have been inside their respective ranges at all times. This happened at a rate of 2:1 in Crysis, and almost 3:1 in Battlefield 4. Those are discrepancies we’d have a tough time attributing to variable refresh. Something else is going on there—a situation made stranger by the fact that both games were picked for their AMD Gaming Evolved affiliations.

Of course, FreeSync is relatively young compared to G-Sync, and we understand that there are hardware and software improvements planned that’ll address some of the technology’s current weaknesses. Technical individuals within AMD and Nvidia acknowledge that, inside the variable refresh range, FreeSync and G-Sync should be equally enjoyable. While our experimental data actually gives Nvidia an edge where one wasn’t expected, paying an extra $150 or more for it may sway certain enthusiasts the other way.

As for us, we’re just glad both technologies exist. Nvidia should be commended for its innovation, which set us on this path almost two years ago. AMD is taking a different approach, and it’s progressing much more slowly. But viable—nay, successful partner products are available, as evidenced by the MG279Q. No doubt FreeSync's lower barrier to entry will be appreciated by more gamers as the line-up of compatible components grows. Might the same sort of experiment held a year in the future yield different results? It’s hard to say. But based on the enthusiasm we saw at Newegg’s Hybrid Center, we’re confident we have the crew for whatever testing is needed.

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • Wisecracker
    With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.

  • NethJC
    Both of these technologies are useless. How bout we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the crap out of every LCD/LED panel I have seen.
  • AndrewJacksonZA
    Thank you for the event and thank you for your write up. Also, thank you for a great deal of transparency! :-)
  • Vlad Rose
    So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.
  • jkrui01
    as always on toms, nvidia and intel win, why bother making these stupid tests, just review nvidia and intel hardware only, or better still, just post the pics and a "buy now" link on the page.
  • loki1944
    Both of these technologies are useless. How bout we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the crap out of every LCD/LED panel I have seen.

    Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.
  • Traciatim
    It seems that, with these hardware combinations and even through the mess of problems with the tests, Nvidia's G-Sync provides an obviously better user experience. It's really unfortunate about the costs and other drawbacks (like only working in full screen).

    I would really like to see this tech applied to larger panels like TVs. It would be interesting, since films shot at 24, 29.97, 30, 48 and 60 FPS from different sources could all be displayed at their native frame rates with no judder and without frame-pacing tools (like 120Hz frame interpolation) that change the image.
  • Calculatron
    Huzzah!
  • cats_Paw
    "Our community members in attendance now know what they were playing on. "
    Thats when you lost my interest.

    It is proven that if you give a person this information they will be affected by it, and that unfortunatelly defeats the whole purpose of using subjective opinions of test subjects to evaluate real life performance rather than scientific facts (as frames per second).

    too bad since the article seemed to be very interesting.
  • omgBlur
    Both of these technologies are useless. How bout we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
    I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the crap out of every LCD/LED panel I have seen.

    Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.

    Speak for yourself; I invested in the ROG Swift and noticed the difference. This technology allows me to push higher graphics settings on a single card @ 1440p, and while the fps bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it bounces to as low as 40. No stuttering, no screen tearing.