Texture Optimizations And The Radeon HD 7000s
As we go through the benchmarks, you’ll see that the Radeon HD 7800s do an incredibly good job of applying Adaptive anti-aliasing to our Skyrim benchmark sequence. The numbers were so impressive, in fact, that we went out of our way to snap screenshots to validate the resulting image quality. As it turns out, the quality holds up: Adaptive AA on the 7800s looks similar to that of the Radeon HD 6900s, putting our minds at ease.
While we were scrutinizing those Adaptive AA results, however, we couldn’t help but notice that some textures appeared noticeably blurrier in the Radeon HD 7800 screenshots. We first assumed a setting had changed in the game or driver, but double-checking proved that wasn’t the case. Further investigation showed that the Radeon HD 7800-series cards only match the Radeon HD 6900 series’ crisper output if the Catalyst A.I. texture filtering quality slider is moved from its default (Quality) to the highest (High Quality) setting.
So, to be clear, using the exact same 8.95.5 driver at its default settings, the Radeon HD 7800s deliver blurrier textures than the Radeon HD 6900s. Take a look:
The differences are not colossal, and you probably wouldn’t notice them during gameplay (we didn’t). But they’re easily identifiable in screenshots. We don’t want to overstate the impact of what we’re seeing. On the other hand, we take reductions in image quality seriously because the slope is slippery, and we’ve seen this happen before.
This issue came up very late in our testing. We asked AMD for comment, but don’t have an official response as of this writing. Moreover, questions remain: Are the Radeon HD 7800 cards enjoying higher performance as a result of an optimization? Could this be an unintentional bug? How much better would the Radeon HD 6900s look if we also bumped their Catalyst A.I. slider up to High Quality? We plan to answer all of those questions after we collect more data, and we’ll update the story once we get feedback from AMD.