We discovered blurry textures when we reviewed the Radeon HD 7800s, so now we're performing an in-depth investigation. Why does the Radeon HD 6000 series demonstrate crisper image quality? Is performance affected? Does AMD know about the issue?
When we were testing the Pitcairn-based Radeon HD 7800 cards for last month's launch, we stumbled upon image quality issues affecting AMD's two most recent boards. Specifically, we noticed that textures in popular games appeared blurrier on all of the Radeon HD 7000s than on the prior-generation offerings, even though we used identical image quality settings in both the driver and the games we were testing.
Here is an animation from AMD Radeon HD 7870 And 7850 Review: Pitcairn Gets Benchmarked that demonstrates the issue:
Unfortunately, AMD hasn't given us much time with any of its Radeon HD 7000-series cards prior to launching them, so our ability to go into more depth with our review was severely limited. But now we're all freed up, and ready to dig deeper.
Was the issue limited to AMD's press-only beta driver? Does it affect the Radeon HD 7800s exclusively, or do all of the 7000s take a step backward? Does it change performance? If so, that would imply AMD traded image quality for more competitive performance. If not, the blurry textures could simply be a bug that needs to be fixed.
Of course, we also wanted to compare the Radeon HD 6900s and 7000s against Nvidia's default image quality. And naturally, we worked with AMD each step of the way to figure out what went wrong, so we have the company's feedback, too. What does AMD have to say?