After reading this article, you likely have a decent understanding of the second-generation HQV benchmark. Let’s have a look at the test class scores and the final total for each card:
| HQV Benchmark 2.0 Results (out of 210) | Radeon HD 6850 | Radeon HD 5750 | Radeon HD 5670 | Radeon HD 5550 | Radeon HD 5450 |
| --- | --- | --- | --- | --- | --- |
| Test Class 1: Video Conversion | 90 | 90 | 89 | 89 | 89 |
| Test Class 2: Noise and Artifact Reduction | 54 | 54 | 44 | 44 | 44 |
| Test Class 3: Image Scaling and Enhancements | 30 | 30 | 30 | 30 | 15 |
| Test Class 4: Adaptive Processing | 27 | 27 | 27 | 7 | 7 |
| Total | 201 | 201 | 190 | 170 | 155 |

| HQV Benchmark 2.0 Results (out of 210) | GeForce GTX 470 | GeForce GTX 460 | GeForce 9800 GT | GeForce GT 240 | GeForce GT 430 | GeForce 210 |
| --- | --- | --- | --- | --- | --- | --- |
| Test Class 1: Video Conversion | 90 | 47 | 87 | 87 | 46 | 47 |
| Test Class 2: Noise and Artifact Reduction | 20 | 20 | 20 | 20 | 20 | 20 |
| Test Class 3: Image Scaling and Enhancements | 30 | 30 | 30 | 30 | 30 | 30 |
| Test Class 4: Adaptive Processing | 20 | 20 | 20 | 20 | 20 | 0 |
| Total | 160 | 117 | 157 | 157 | 116 | 97 |
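Since the tables list per-class scores, each card's final total out of 210 follows by simple addition. A quick Python sketch, with the scores transcribed from the tables above:

```python
# Per-card scores by test class (1: Video Conversion, 2: Noise and
# Artifact Reduction, 3: Image Scaling and Enhancements,
# 4: Adaptive Processing), transcribed from the tables above.
scores = {
    "Radeon HD 6850":  [90, 54, 30, 27],
    "Radeon HD 5750":  [90, 54, 30, 27],
    "Radeon HD 5670":  [89, 44, 30, 27],
    "Radeon HD 5550":  [89, 44, 30, 7],
    "Radeon HD 5450":  [89, 44, 15, 7],
    "GeForce GTX 470": [90, 20, 30, 20],
    "GeForce GTX 460": [47, 20, 30, 20],
    "GeForce 9800 GT": [87, 20, 30, 20],
    "GeForce GT 240":  [87, 20, 30, 20],
    "GeForce GT 430":  [46, 20, 30, 20],
    "GeForce 210":     [47, 20, 30, 0],
}

# Sum the four test classes to get each card's total out of 210.
totals = {card: sum(s) for card, s in scores.items()}

# Print cards from highest to lowest total.
for card, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {total}/210")
```

Sorting the totals makes the ranking discussed below easy to see at a glance, including the GT 240 and 9800 GT landing above the pricier GTX 460.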
We’re seeing a fairly natural increase in quality across the Radeon line as you get to more expensive models, with the Radeon HD 5750 acting as the vanguard of maximum Radeon video playback quality. The Radeon HD 6850 does no better, and both are about as close to perfect as you’ll see on a PC. Indeed, all of the Radeons, other than the low-end Radeon HD 5450, have the same driver options. But the GPUs can handle more of the enhancements without stuttering as you get into pricier models.
The GeForce line is a little inconsistent by comparison. In our testing, the GeForce GT 240 and 9800 GT achieve higher scores than the more expensive GeForce GTX 460, a hiccup mostly attributable to poor pulldown detection in some models. We've asked Nvidia about this disparity, but the company hasn't responded.
Overall, when comparing Radeons to GeForces, AMD's cards get the nod for higher overall scores and better results per dollar spent. Frankly, the main reasons for the superior results are AMD’s consistent cadence detection, better noise reduction (especially when it comes to compressed video), and a working flesh tone correction feature.
It’s important to note that the GeForce cards don’t suffer from shoddy video quality, and in our opinion, too many points are awarded for obscure multi-cadence detection. Thirty points are applied to this area, and that’s not including scores from important cadences like 2:2 and 3:2 pulldown. If you remove those 30 points, the playing field between GeForce and Radeon becomes much tighter. The GeForce cards offer excellent video playback when it comes to high-definition source material, and all of them handle the important 3:2 cadence without issue. Realistically, if you put a GeForce in an HTPC for DVD and Blu-ray playback duty, you’d probably never guess that it didn’t achieve the top score.
Having said that, the Radeons earn a well-deserved win here. While obscure multi-cadence support might be responsible for the bulk of the point advantage, their real strength is superlative noise-reduction options. This comes in really handy with compressed video, so if you plan to play back any files that aren’t optimally encoded at HD resolution, the Radeons have a real advantage. It’s also noteworthy that the sub-$100 Radeon HD 5670 can offer slightly better playback quality than a GeForce GTX 470, even when multi-cadence tests are left out of the mix, and that a ~$120 Radeon HD 5750 can boast the same ultimate PC playback quality right alongside more expensive Radeons like the 6850.
On a final note, the second-gen HQV benchmark Blu-ray can be purchased from www.hqv.com for $24.99, if you want to replicate the tests we ran here. There is also a standard-definition version of the benchmark for DVD that can be acquired for $19.99.
But as for the results, I am not that surprised. Even when their GPUs might not perform the same as Nvidia's, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that Nvidia might not see as important when it is. I want my Blu-rays and DVDs to look great, not just OK.
My desktop, which uses a 460, really suffers from the lack of noise reduction options.
My Samsung BD player looks less spectacular than the others.
My Xbox looks a little better than the BD player.
My PS3 actually looks the best to me, no matter what display I use.
I'm not sure why, but it's the only one I could pick out just based on its image quality. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but doesn't come close. I don't know how the PS3 stacks up, but I'm thinking about giving the test suite a spin.
Thanks for the awesome article.
That's definitely on our to-do list!
Trying to organize that one now.
Noise reduction did next to nothing, and in many cases caused blockiness.
Dynamic contrast in many cases does make things look better, but in some it revealed tons of noise in the greyscale which the noise reduction doesn't remove, not even a little.
Color correction seemed to make anything blueish bluer, even purples.
Edge correction seems to sharpen some details, but introduces noise after about 20%.
All in all, bunch of worthless settings.
ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
I thought they were losing, not by enough to call it a loss, but not as good as the latest Nvidia refreshes. But I got a 5770 due to its power consumption; I didn't have to swap out my PSU to put it in, and that was the deciding factor for me.
1. Cadence tests: why do you marginalise the 2:2 cadence? These cards are not US-exclusive; the rest of the world has the same requirements for picture quality.
2. Skin tone correction: I see it as an error on the card's part to even include this. Why correct something that the video creator wanted to be as it is? The movie is checked by video professionals for anything they don't want there; not-completely-correct skin tones are part of the product by design. This test should not even exist.
3. Dynamic contrast: I can't help it, but the example scene with the cats had blown highlights on my laptop's LCD in the "correct" part. How can you judge that when the constraint is the display device and not the GPU itself? After all, you can output to a 6-bit LCD or a 10-bit LCD; the card doesn't have to know that...
It's only worth more than half the world population, after all.
You misunderstand the text, I think.
To clear it up: I wasn't talking about 2:2 when I said that, I was talking about the Multi-Cadence Tests: 8 FPS animation, etc.