
The Performance Impact Of Lower-Quality Textures

Do AMD's Radeon HD 7000s Trade Image Quality For Performance?

In light of the visual improvement we see when we tune the Catalyst A.I. slider, we want to know how performance is affected at each step on AMD's Radeon HD 7870 and 6970 cards.

Crysis 2 definitely yields a quantifiable difference between the Radeon HD 7870 and 6970, as we'd expect. However, the difference between frame rates at the highest and lowest texture detail settings is negligible.

Battlefield 3 gives us an average 1.4 FPS spread on the Radeon HD 7870 and only a 0.3 FPS spread on the Radeon HD 6970, but that's hardly a notable difference on either card.

In Metro 2033, there's a more substantial 3.8 FPS difference between the highest and lowest texture quality settings on the Radeon HD 7870, while the Radeon HD 6970 incurs less than half of that.

Skyrim is the game with the most obvious texture differences between the 7870 and 6970, and it's also the game that shows the largest spread of frame rates between the highest and lowest texture quality settings on the Radeon HD 7870 (6.7 FPS). The Radeon HD 6970, on the other hand, sees a mere 2.1 FPS separate its Performance and High Quality settings.

To recap, there isn't much to report in either Battlefield 3 or Crysis 2 when it comes to performance deltas, regardless of the texture quality setting. But in The Elder Scrolls V: Skyrim and Metro 2033, the High Quality setting appears to force more of a performance hit on the Radeon HD 7870 than on the 6970.

Considering the data we’ve seen up to this point, we have to come to the disturbing conclusion that AMD's Radeon HD 7000-series cards currently benefit from more aggressive default driver settings that reduce texture quality compared to the Radeon HD 6000s and GeForce GTX 500s, yielding better benchmark results. Using the highest Catalyst A.I. setting appears to be the remedy, though it costs additional speed.

This is the kind of result that makes us uncomfortable. Is it possible that AMD knowingly sacrificed texture quality to gain marginally better performance in some benchmarks? The company took a couple of weeks to respond to our queries, and we wondered as we waited.
