This article presented far more surprises than we expected. The concept seemed simple, but the deeper we dug, the more anomalies, caveats, and issues we found to discuss.
Our earlier assumption that anti-aliasing can be manipulated from the driver panel alone was turned on its head. And even so, we're left with only half of the bigger picture.
We now know what the different anti-aliasing modes are, how to enable them, and which ones are likely to work and which ones are likely to do nothing at all. But what we haven’t covered yet is performance, and that is key. For example, knowing that supersampling provides the best anti-aliasing quality doesn’t help when it cripples a $500 graphics card at 1920x1080. That’s why this article is part one of two, and that second part is coming in the very near future.
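To put the supersampling cost in perspective, here's a quick back-of-the-envelope sketch. The numbers are simple arithmetic to illustrate the point, not benchmark results:

```python
# 4x supersampling shades and stores four samples per output pixel,
# which is equivalent to rendering at twice the width and twice the
# height of the display mode, then downsampling.

def ssaa_internal_pixels(width: int, height: int, factor: int) -> int:
    """Total samples shaded per frame for factor-x supersampling."""
    return width * height * factor

base = 1920 * 1080                          # native pixel count
internal = ssaa_internal_pixels(1920, 1080, 4)

print(base)      # 2073600
print(internal)  # 8294400 -- four times the shading and bandwidth work
```

Four times the pixel-shading work per frame is why supersampling can bring even a high-end card to its knees; the exact frame-rate hit is what part two will measure.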
The lessons we learned are unmistakable. First and foremost, enhanced MSAA modes like CSAA, EQAA, and edge-detect don't impress us much from an image-quality standpoint. As it turns out, 2x MSAA is good, 4x MSAA is great, and 8x MSAA is superlative when it comes to removing aliasing artifacts from polygon edges. You can add coverage samples and edge-detect algorithms to 4x MSAA and it still doesn't rival true 8x MSAA. At the end of the day, these proprietary modes can make slight improvements to visual quality, but realistically, you probably won't notice their effects compared to the base MSAA mode on which they ride. We had to zoom in to static screenshots to see whether coverage samples and edge-detect processing were making any difference at all, while the difference between true 4x MSAA and 8x MSAA is relatively easy to spot.
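The gap between 8x CSAA and true 8x MSAA comes down to what each sample actually stores. The sketch below uses the sample layouts Nvidia has published for its modes; treat the exact counts as illustrative, since layouts vary by vendor and GPU generation:

```python
# Mode -> (color/Z samples, total coverage samples). Coverage-only
# samples record whether a triangle covers that position, but carry
# no color data of their own.
MODES = {
    "2x MSAA": (2, 2),
    "4x MSAA": (4, 4),
    "8x CSAA": (4, 8),   # 4 full samples + extra coverage-only samples
    "8x MSAA": (8, 8),
}

def stored_color_samples(mode: str) -> int:
    """Color samples are what actually get blended into the pixel,
    which is why 8x CSAA can't fully match true 8x MSAA."""
    return MODES[mode][0]

print(stored_color_samples("8x CSAA"))  # 4
print(stored_color_samples("8x MSAA"))  # 8
```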
Secondly, we learned that the aliasing that occurs on objects with texture transparencies is unaffected by MSAA, and despite newer DirectX 10/11 techniques like alpha-to-coverage, we see a need for further transparent texture anti-aliasing. When it comes to Radeons, adaptive anti-aliasing rarely works, but Nvidia’s transparent supersampling is relatively reliable in DirectX 10 and 11 games. This is an area we’d really like to see AMD improve.
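Alpha-to-coverage, the DirectX 10/11 technique mentioned above, works by converting a texel's alpha value into a multisample coverage mask instead of blending. This toy sketch shows the basic idea; real hardware also dithers the mask across neighboring pixels, which this simplification omits:

```python
# Toy alpha-to-coverage: an alpha of 0.5 with 4x MSAA lights up
# roughly half of the four coverage samples, so partially transparent
# texels get partial coverage rather than a hard on/off edge.

def alpha_to_coverage(alpha: float, samples: int = 4) -> int:
    """Return a bitmask with round(alpha * samples) sample bits set."""
    covered = round(max(0.0, min(1.0, alpha)) * samples)
    return (1 << covered) - 1

print(bin(alpha_to_coverage(0.0)))  # 0b0    -> fully transparent
print(bin(alpha_to_coverage(0.5)))  # 0b11   -> two of four samples
print(bin(alpha_to_coverage(1.0)))  # 0b1111 -> fully opaque
```

With only four samples the result is coarse, which is one reason alpha-to-coverage alone still leaves visible artifacts on foliage and fences.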
What about AMD's morphological anti-aliasing? MLAA worked in every game we tested, and it can even clean up aliasing artifacts on transparent textures a little (although nowhere near as well as a true transparency anti-aliasing method like adaptive or transparency AA). Morphological AA also works in conjunction with MSAA. MLAA can be a great option, but it can degrade image quality when small text is critical to gameplay. As such, it isn't viable as a set-and-forget anti-aliasing method for everyone.
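MLAA's strengths and weaknesses both follow from the fact that it is a post-process filter: it only sees the finished frame. The following is a deliberately crude one-dimensional caricature of the idea (real MLAA classifies edge shapes and computes proper blend weights), but it shows why the technique smooths everything, text included:

```python
# Caricature of morphological AA: scan the finished frame for hard
# luminance edges and blend across them. Because it operates on final
# pixels, it also softens things MSAA misses (transparent textures,
# shader aliasing) -- and fine detail like small text.

def mlaa_1d(row, threshold=0.5):
    """Blend each pair of neighboring pixels that form a hard edge."""
    out = list(row)
    for i in range(len(row) - 1):
        if abs(row[i] - row[i + 1]) > threshold:
            mid = (row[i] + row[i + 1]) / 2
            out[i] = (out[i] + mid) / 2
            out[i + 1] = (out[i + 1] + mid) / 2
    return out

print(mlaa_1d([0.0, 0.0, 1.0, 1.0]))  # [0.0, 0.25, 0.75, 1.0]
```

The hard black-to-white step gets a smooth ramp, with no knowledge of the geometry that produced it. That geometry-blindness is exactly why MLAA blurs text as readily as it blurs jaggies.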
If you really want to generalize, most users are probably best served by setting their in-game anti-aliasing to 4x MSAA and leaving it at that, assuming their graphics hardware is fast enough to handle it. Otherwise, Nvidia owners may want to enable transparency supersampling through the driver in order to reduce aliasing artifacts on transparent textures. Hopefully, AMD will assign some driver development to this area so that adaptive anti-aliasing becomes a reliable option for Radeon owners, too.
In any case, our upcoming anti-aliasing analysis, which focuses on benchmark performance, will help folks see what settings their hardware can handle.