I'm Anti-Aliasing. As In, I Won't Stand For Aliasing.
Back in April, we took an in-depth look at anti-aliasing technologies, the image quality they impart, and the driver settings needed to enable them. Imagine our surprise when we discovered that, in many cases, forcing a particular AA mode on via the GeForce or Radeon driver panel simply does not work. In case you missed that piece, or simply need to bone up on the different anti-aliasing settings offered by AMD and Nvidia, check out Anti-Aliasing Analysis, Part 1: Settings And Surprises.
Clearly, it took us an inordinate amount of time to prepare this follow-up, and that's partially due to the fact that there was a mountain of data to amass in order to create a meaningful comparison. Any game title we wanted to use for testing had to work on cards from both prevalent vendors under specific anti-aliasing technologies. As we learned from the previous story, this severely limits the playing field. Nevertheless, we managed to gather enough information to get a solid feel for how graphics cards perform when it comes to all of the different options available.
First, What's New?
Before we dig into the data, though, a couple of notable events have transpired since the publication of our Part 1 coverage.
Back in April, we gave Nvidia a hard time for using misleading naming to describe its coverage sample anti-aliasing modes. As an example, the GeForce driver’s 8xAA mode did not take eight multi-samples per pixel as its name implied, but rather four multi-samples, plus four coverage samples. As a result, 8xAA on a GeForce card was not comparable to 8xAA on a Radeon card.
We’re happy to see that the company modified its nomenclature, and coverage sample modes are now easily identified with CSAA, per the following chart:
GeForce CSAA vs. Radeon EQAA Anti-Aliasing Levels

|Old GeForce Driver Mode|New GeForce Driver Mode|Combined Color/Coverage Samples Plus Extra Coverage Samples|Radeon Driver Mode|
Although AMD’s naming system still makes more sense (it indicates the number of multi- and coverage-samples), Nvidia’s new scheme is much better than the previous one. Despite combining both samples into a single number, the CSAA designation lets you know that the target mode represents the sum of both sample types. Now, 8x anti-aliasing always means eight multi-samples, regardless of graphics hardware, giving us all a degree of consistency we can appreciate.
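To make the arithmetic behind the renaming concrete, here is a small sketch (the helper name is ours, not anything in the GeForce driver) of how the new convention derives a label from sample counts, based on the facts above: a CSAA mode is named after the total of color plus coverage samples, so the old "8xAA" (four multi-samples plus four coverage samples) becomes "8xCSAA," and a plain "8x" label is reserved for true eight-multi-sample AA.

```python
# Illustrative sketch only -- nvidia_label() is a hypothetical helper,
# not driver code. It mimics Nvidia's new naming convention.

def nvidia_label(color_samples, coverage_samples):
    """New GeForce scheme: coverage-sample modes carry the CSAA suffix
    and are named after the TOTAL of color + coverage samples."""
    if coverage_samples == 0:
        return f"{color_samples}xAA"  # pure multi-sampling keeps the plain label
    return f"{color_samples + coverage_samples}xCSAA"

# The old GeForce "8xAA" took four multi-samples plus four coverage
# samples; under the new scheme that combination reads 8xCSAA.
print(nvidia_label(4, 4))  # -> 8xCSAA
print(nvidia_label(8, 0))  # -> 8xAA
```

The upshot is exactly what the article notes: any mode without the CSAA suffix now describes multi-samples only, so "8x" means the same thing on GeForce and Radeon hardware.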
Aside from this, Nvidia now has a post-process anti-aliasing filter it calls FXAA. The technology is similar to AMD's morphological anti-aliasing, but it's implemented in-game instead of through the driver, and it works with any vendor’s graphics hardware. This new option is only included in a few games thus far (such as Duke Nukem Forever, F.3.A.R., Elder Scrolls V: Skyrim, and Battlefield 3), but the code is purportedly easy to integrate. An independent programmer even released a non-commercial hack that enables FXAA in DirectX 9, 10, and 11 games; that's something you might want to hunt down on Google if you’re interested in toying with it.
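The basic idea behind these post-process filters can be sketched in a few lines: rather than taking extra samples during rendering, they scan the finished frame for high-contrast edges (using a luma approximation) and blend pixels along them. The toy below illustrates that principle only; it is nothing like Nvidia's actual FXAA shader or AMD's morphological implementation, and the function name is our own.

```python
# Toy post-process AA in the spirit of FXAA/MLAA: detect edges via
# luma contrast in the finished frame, then soften them by blending.
# Simplified illustration only -- not Nvidia's FXAA algorithm.
import numpy as np

def toy_postprocess_aa(img, threshold=0.1):
    """img: float array of shape (H, W, 3), values in [0, 1].
    Returns a copy with high-contrast edge pixels blended."""
    # Perceptual luma approximation (Rec. 601 weights) for edge detection.
    luma = img @ np.array([0.299, 0.587, 0.114])
    out = img.copy()
    h, w = luma.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Contrast across the 4-neighborhood of this pixel.
            neighbors = [luma[y - 1, x], luma[y + 1, x],
                         luma[y, x - 1], luma[y, x + 1]]
            contrast = max(neighbors) - min(neighbors)
            if contrast > threshold:
                # Edge pixel: average with its neighbors to soften the stairstep.
                out[y, x] = (img[y, x] + img[y - 1, x] + img[y + 1, x] +
                             img[y, x - 1] + img[y, x + 1]) / 5.0
    return out
```

Because the filter runs on the final image, it is cheap and works regardless of how the frame was rendered, but it can also blur texture and text detail it mistakes for edges, which is the trade-off readers complain about below.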
I'm not a huge gamer and the games I do play mostly run awesome with my 2500K + GTX460. I decided that if it's going to be a while before the next generation of GPUs drop, I'd get another 460. So that's what I did, should be here in a few days. I was worried that even at 1920x1200 I'd have problems with AA and the lack of VRAM, but it's good to see that two 460s work pretty admirably.
As an aside, I'm totally on an efficiency kick, and I don't relish the thought of needing two cards to get decent performance, but the GTX 460 is one of the most efficient cards around well over a year after its release.
Seriously, what is it?
Was thinking the same thing... part 1 and part 2 are contradicting each other here... if I'm remembering part 1 correctly...
btw there's a typo at the start of page 2,
Upon release, we tested StarCraft II because that was a game that choked with MSAA on Radeons. It turns out that game is severely CPU-limited, so it wasn't the best test subject for morphological AA.
As for #2, there's no worry there: the Half-Life 2 engine in Lost Coast, which we used for the majority of comparison shots, doesn't move the camera during idle times. We used a saved game and reloaded the scene at exactly the same position, so it's not an issue here.
As for your first concern, I was worried about that, too, at first. But I carefully scrutinized the uncompressed TIFF files before exporting them to GIF, and in these cases there's no practical difference; the GIF does an excellent job of demonstrating the result with different AA modes.
Also, as the first poster said, why is morphological so demanding all of a sudden? When I first tried using it, I barely saw an impact on performance and in a couple games it made everything look blurry. I just tried enabling it in Skyrim (a game that really needs better AA) and my performance plummeted - which these results confirm. What changed?
As it says in the article, EQAA is Radeon HD 6900-series exclusive. You probably don't have a 6900 card.
wolfram23: Also, as the first poster said, why is morphological so demanding all of a sudden?
The answer is 5 posts above this comment. :) Depends on the game, you may have been using a CPU-bottlenecked title.