AMD FreeSync Versus Nvidia G-Sync: Readers Choose
We set up camp at Newegg's Hybrid Center in City of Industry, California to test FreeSync against G-Sync in a day-long experiment involving our readers.
Post Mortem
There's no such thing as a perfect experiment. We put a ton of time into planning our event and were as transparent as possible with both AMD and Nvidia ahead of it, yet we still ran into day-of issues that had to be dealt with. As part of our analysis, we thought it important to follow up with both companies and get their feedback. Specifically, we wanted suggestions on ways to make future events better. Part of this involved facing perceived shortcomings; some were raised by AMD and Nvidia, and others were noted by our own team.
Let's start with Nvidia's commentary, provided by Tom Petersen, director of technical marketing and one of our attendees.
The side-by-side blind testing technique is a great way to get some direct feedback from gamers about new technologies. Unfortunately, it is also the case that whenever someone knows what we are testing for, they are biased to some extent, which could inadvertently impact their testing and feedback.
There are a few techniques that can help mitigate this inherent expectation bias:
1. Double-blind studies – the test administrator and the testers should not know what is being tested.
a. Don’t tell the gamers the purpose of the evaluation – knowledge that this is a G-Sync vs. FreeSync comparison could impact results.
b. Use volunteers to run the test flow to eliminate the risk of administrators passing along test information.
2. Include a control group with nothing new. In this case I would have used one of the monitors in “fixed refresh rate mode.”
3. Increase the sample size. This may be very difficult in practice, but more data is definitely better when science is involved.
Overall I enjoyed the opportunity to engage with THG’s community. I look forward to seeing the results.
We especially like Tom's suggestion to use a control group in a fixed refresh mode for comparison. Given a longer day and perhaps more activities to keep other folks busy, we would like to see gamers on three systems, one of them being a control of some sort.
A larger sample size was on our wish list all along, but there's only so much you can do with eight machines and one Saturday afternoon. This event was already several times as large as our last one, and we'll definitely shoot for something even larger next time.
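To put a rough number on why sample size matters here, a quick back-of-the-envelope sketch in Python may help. It uses the standard normal-approximation margin of error for a simple yes/no preference poll; the participant count of 48 is purely illustrative, not the event's actual headcount:

```python
import math

def margin_of_error(n: int, z: float = 1.96, p: float = 0.5) -> float:
    """Worst-case 95% margin of error for a yes/no preference poll of n people.
    p = 0.5 maximizes p*(1-p), giving the most conservative estimate."""
    return z * math.sqrt(p * (1 - p) / n)

def required_sample_size(margin: float, z: float = 1.96, p: float = 0.5) -> int:
    """Smallest n whose worst-case margin of error is at or below `margin`.
    Normal-approximation formula: n = z^2 * p * (1 - p) / margin^2."""
    return math.ceil((z ** 2) * p * (1 - p) / margin ** 2)

# A hypothetical 48-person event: a 50/50 split carries roughly a
# +/-14-point margin of error, so modest preference gaps drown in noise.
print(round(margin_of_error(48) * 100, 1))   # -> 14.1

# Shrinking that to +/-5 points takes a few hundred respondents.
print(required_sample_size(0.05))            # -> 385
```

The takeaway matches Tom's point: with dozens rather than hundreds of testers, only a lopsided preference is statistically meaningful.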
The idea to keep the purpose of the experiment under wraps is also intriguing, though I'm not sure we'd have as much luck getting volunteers to sign up without some sort of teaser ahead of time. This and volunteer-run testing might be ideal, but they present us with some practical challenges we'll have to think about.
Now for AMD's feedback, which comes to us by way of Antal Tungler, public relations manager, who helped us coordinate the company's participation (including AMD attendees).
AMD is always happy to see this sort of testing become available to end users and community members and people who are just interested in tech and PC gaming. We applaud Tom’s for this initiative regardless of the outcome. AMD FreeSync technology has now been on the market for almost six months and we’ve seen terrific adoption from display vendors: there are now 20 FreeSync-enabled monitors on the market with more on the way.
A couple of thoughts regarding this test:
- Because AMD FreeSync technology enables such a wide variety of display tech and refresh rates for vendors to productize, we believe a true Pepsi-style challenge for DRR displays should aim to keep the frame rates in the DRR zone all the time. It’s also important that all parties run at nearly identical frame rates, which greatly benefits a true Pepsi-style challenge. We’d love to see more emphasis on this in the future.
- Choosing games and settings carefully is paramount to make sure that the scenarios gamers look at are reproducible, consistent and glitch-free. Maybe there’s some room for improvement there.
- One could consider including a DRR-specific benchmark, like the AMD Windmill application. While it certainly shouldn’t be the only method of testing, it would be interesting to add to the overall results.
We believe that some last-minute changes made before the event (that didn’t necessarily guide the experiment in a true apples-to-apples comparison’s direction) make it difficult to call it a true Pepsi challenge. Regardless, we’re really happy to see Tom’s Hardware putting this much effort into pulling together this event, and are grateful for the opportunity to have participated in it. We’re sure with some of the above changes implemented, there are many more events like this coming in the future that benefit end users and the industry as a whole. Thank you!
The changes AMD is referring to are the zero-hour decision to leave v-sync off outside of its variable refresh range and the side experiment we put together in Battlefield 4. On the first point, I really wish we had thought to specify v-sync behavior one way or the other back when we were disclosing everything to both companies. But given the majority vote of our readers and AMD's default behavior, I'm comfortable with where we ended up for the event.
Allowing Nvidia to set one of its systems up with different settings in Battlefield is a fair protest on AMD's part, even if we generated useful data from it. If we did it over, I would be more adamant that the settings selected before the event be universal; if we wanted a separate experiment, we'd run it during lunch or with stragglers after the official proceedings.
I do, however, disagree that games and settings should be chosen to keep both solutions in their variable refresh range. G-Sync and FreeSync have dissimilar VRRs right now, and that has to factor into any buying decision. Forcing the technologies into their bands, however wide or narrow they might be, overlooks that the bands aren't equal.
My own feedback is more pointed than that of either AMD or Nvidia (both organizations were polite and professional each step of the way). I'd be more tempted, in retrospect, to use TN-based screens, giving AMD a more generous VRR of 40 to 144Hz, if only to see how the Borderlands results would change. This is disappointing because I've maintained for two years that I want three high-refresh IPS panels on my desk for gaming. Stepping back to TN for the broader VRR wouldn't interest me, personally. But that's a reflection of where we're at right now with FreeSync. Hopefully the initiative continues gaining momentum and we see its growing pains remedied.
Still, had we gone with the BenQ screen instead, a lot of the other issues we had on game day might not have arisen. Or maybe we would have figured out something else to argue about. This was a battle between two graphics giants, after all.
Chris Angelini is a Technical Editor at Tom's Hardware. Follow him on Twitter and Google+.
Wisecracker: With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.
NethJC: Both of these technologies are useless. How bout we start producing more high refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.
AndrewJacksonZA: Thank you for the event and thank you for your write up. Also, thank you for a great deal of transparency! :-)
Vlad Rose: So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.
jkrui01: as always on toms, nvidia and intel wins, why bother making this stupid tests, just review nvida an intel hardware only, or better still, just post the pics and a "buy now" link on the page.
loki1944 Both of these technologies are useless. How bout we start producing more high refresh panels ? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old and it still kicks the craps out of every LCD/LED panel I have seen.
Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way. -
Traciatim: It seems with these hardware combinations and even through the mess of the problems with the tests that NVidia's G-Sync provides an obviously better user experience. It's really unfortunate about the costs and other drawbacks (like only full screen).
I would really like to see this tech applied to larger panels like a TV. It would be interesting considering films being shot at 24, 29.97, 30, 48 and 60FPS from different sources all being able to be displayed at their native frame rate with no judder or frame pacing tools (like 120hz frame interpolation) that change the image.
cats_Paw: "Our community members in attendance now know what they were playing on."
Thats when you lost my interest.
It is proven that if you give a person this information they will be affected by it, and that unfortunatelly defeats the whole purpose of using subjective opinions of test subjects to evaluate real life performance rather than scientific facts (as frames per second).
too bad since the article seemed to be very interesting.
omgBlur (replying to loki1944): Speak for yourself, I invested in the ROG Swift and noticed the difference. This technology allows me to put out higher graphics settings on a single card @ 1440p and while the fps bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it bounces to as low as 40. No stuttering, no screen tearing.