Video Quality Tested: GeForce Vs. Radeon In HQV 2.0

Conclusion

After reading this article, you likely have a decent understanding of the second-generation HQV benchmark. Let’s have a look at the test class scores and the final total for each card:

HQV Benchmark version 2.0 Results (out of 210)

| | Radeon HD 6850 | Radeon HD 5750 | Radeon HD 5670 | Radeon HD 5550 | Radeon HD 5450 |
| --- | --- | --- | --- | --- | --- |
| Test Class 1: Video Conversion | 90 | 90 | 89 | 89 | 89 |
| Test Class 2: Noise and Artifact Reduction | 54 | 54 | 44 | 44 | 44 |
| Test Class 3: Image Scaling and Enhancements | 30 | 30 | 30 | 30 | 15 |
| Test Class 4: Adaptive Processing | 27 | 27 | 27 | 7 | 7 |
| Totals | 201 | 201 | 190 | 170 | 155 |
| | GeForce GTX 470 | GeForce GTX 460 | GeForce 9800 GT | GeForce GT 240 | GeForce GT 430 | GeForce 210 |
| --- | --- | --- | --- | --- | --- | --- |
| Test Class 1: Video Conversion | 90 | 47 | 87 | 87 | 46 | 47 |
| Test Class 2: Noise and Artifact Reduction | 20 | 20 | 20 | 20 | 20 | 20 |
| Test Class 3: Image Scaling and Enhancements | 30 | 30 | 30 | 30 | 30 | 30 |
| Test Class 4: Adaptive Processing | 20 | 20 | 20 | 20 | 20 | 0 |
| Totals | 160 | 117 | 157 | 157 | 116 | 97 |
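
If you want to double-check our math, each total is simply the sum of the card’s four class scores. Here’s a minimal Python sketch (our own illustration, not part of the benchmark) that recomputes and ranks the totals from the tables above:

```python
# Per-card class scores, taken straight from the tables above:
# (Video Conversion, Noise/Artifact Reduction, Scaling/Enhancements, Adaptive Processing)
scores = {
    "Radeon HD 6850":  (90, 54, 30, 27),
    "Radeon HD 5750":  (90, 54, 30, 27),
    "Radeon HD 5670":  (89, 44, 30, 27),
    "Radeon HD 5550":  (89, 44, 30, 7),
    "Radeon HD 5450":  (89, 44, 15, 7),
    "GeForce GTX 470": (90, 20, 30, 20),
    "GeForce GTX 460": (47, 20, 30, 20),
    "GeForce 9800 GT": (87, 20, 30, 20),
    "GeForce GT 240":  (87, 20, 30, 20),
    "GeForce GT 430":  (46, 20, 30, 20),
    "GeForce 210":     (47, 20, 30, 0),
}

# Rank the cards by total score, highest first.
for card, classes in sorted(scores.items(), key=lambda kv: -sum(kv[1])):
    print(f"{card:<16} {sum(classes):>3} / 210")
```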

We’re seeing a fairly natural increase in quality across the Radeon line as you get to more expensive models, with the Radeon HD 5750 acting as the vanguard of maximum Radeon video playback quality. The Radeon HD 6850 does no better, and both are about as close to perfect as you’ll see on a PC. Indeed, all of the Radeons, other than the low-end Radeon HD 5450, have the same driver options. But the GPUs can handle more of the enhancements without stuttering as you get into pricier models.

The GeForce line is a little inconsistent, comparatively. There’s a hiccup: the GeForce GT 240 and 9800 GT achieve higher scores than the more expensive GeForce GTX 460, according to our observations. This is mostly attributable to poor pulldown detection in some models, and while we’ve asked Nvidia about the disparity, the company hasn't responded.

Overall, when comparing Radeons to GeForces, AMD's cards get the nod for higher overall scores and better results per dollar spent. Frankly, the main reasons for the superior results are AMD’s consistent cadence detection, better noise reduction (especially when it comes to compressed video), and a working flesh-tone correction feature.

It’s important to note that the GeForce cards don’t suffer from shoddy video quality, and in our opinion, too many points are awarded for obscure multi-cadence detection. Thirty points are applied to this area, and that’s not including scores from important cadences like 2:2 and 3:2 pulldown. If you remove those 30 points, the playing field between GeForce and Radeon becomes much tighter. The GeForce cards offer excellent video playback when it comes to high-definition source material, and all of them handle the important 3:2 cadence without issue. Realistically, if you put a GeForce in an HTPC for DVD and Blu-ray playback duty, you’d probably never guess that it didn’t achieve the top score.
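
To make that adjustment concrete, here is a rough Python sketch of the recalculation we’re describing. Our summary tables don’t break Test Class 1 down per test, so the multi-cadence subtotals in the example calls are hypothetical placeholders; substitute a card’s real subtotal from its full score sheet to reproduce the comparison:

```python
# Sketch of the adjustment described above: drop the 30 obscure multi-cadence
# points from the 210-point maximum, and subtract a card's multi-cadence
# subtotal from its published total. The subtotals below are hypothetical
# placeholders, NOT measured values.
MULTI_CADENCE_POINTS = 30

def adjusted_score(published_total: int, multi_cadence_subtotal: int):
    """Return (adjusted score, adjusted maximum) with multi-cadence tests excluded."""
    return published_total - multi_cadence_subtotal, 210 - MULTI_CADENCE_POINTS

print(adjusted_score(201, 30))  # e.g. a top Radeon, if it swept all 30 points -> (171, 180)
print(adjusted_score(160, 10))  # a GeForce with a hypothetical 10-point subtotal -> (150, 180)
```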

Having said that, the Radeons earn a well-deserved win here. While obscure multi-cadence support might be responsible for the bulk of the point advantage, their real strength is their superlative noise-reduction options. These come in really handy with compressed video, so if you plan to play back any files that aren’t optimally encoded at HD resolution, the Radeons have a real advantage. It’s also noteworthy that the sub-$100 Radeon HD 5670 can offer slightly better playback quality than a GeForce GTX 470, even when multi-cadence tests are left out of the mix, and that a ~$120 Radeon HD 5750 boasts the same top-tier playback quality as more expensive Radeons like the 6850.

On a final note, the second-gen HQV benchmark Blu-ray can be purchased from www.hqv.com for $24.99, if you want to replicate the tests we ran here. There is also a standard-definition version of the benchmark for DVD that can be acquired for $19.99.

  • rootheday
Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?
    Reply
  • jimmysmitty
I second the test using SB HD graphics. It might be just an IGP, but I would like to see the quality in case I want to build an HTPC. Since SB has amazing encoding/decoding results compared to anything else out there (even $500+ GPUs), it would be nice to see if it can give decent picture quality.

But as for the results, I am not that surprised. Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that nVidia might not see as important when it is. I want my Blu-rays and DVDs to look great, not just OK.
    Reply
  • compton
Great article. I had wondered what the testing criteria were about, and lo! Tom's to the rescue. I have four primary devices that I use to watch Netflix's streaming service. Each is radically different in terms of hardware. They all look pretty good, but they all work differently. Using my 47" LG LED TV, I did an informal comparison of each.

My desktop, which uses a 460, really suffers from the lack of noise reduction options.
My Samsung BD player looks less spectacular than the others.
    My Xbox looks a little better than the BD player.
    My PS3 actually looks the best to me, no matter what display I use.

I'm not sure why, but it's the only one I could pick out just based on its image quality. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but doesn't come close. I don't know how the PS3 stacks up, but I'm thinking about giving the test suite a spin.

    Thanks for the awesome article.
    Reply
  • cleeve
rootheday said:
Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?

That's definitely on our to-do list!

    Trying to organize that one now.
    Reply
  • lucuis
    Too bad this stuff usually makes things look worse. I tried out the full array of settings on my GTX 470 in multiple BD Rips of varying quality, most very good.

Noise reduction did next to nothing, and in many cases caused blockiness.

Dynamic Contrast does make things look better in many cases, but in some it revealed tons of noise in the greyscale, which the noise reduction doesn't remove... not even a little.

Color correction seemed to make anything bluish bluer, even purples.

    Edge correction seems to sharpen some details, but introduces noise after about 20%.

All in all, a bunch of worthless settings.
    Reply
  • killerclick
jimmysmitty said:
Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements
    ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
    Reply
  • alidan
killerclick said:
ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
I thought they were losing; not by enough to call it a loss, but not as good as the latest nVidia refreshes. But I got a 5770 due to its power consumption: I didn't have to swap out my PSU to put it in, and that was the deciding factor for me.
    Reply
  • haplo602
    this made me lol ...

1. Cadence tests... why do you marginalise the 2:2 cadence? These cards are not US-exclusive. The rest of the world has the same requirements for picture quality.

2. Skin tone correction: I see it as an error on the card's part to even include this. Why are you correcting something that the video creator wanted to be as it is? I mean, the movie is checked by video professionals for anything they don't want there; not-completely-correct skin tones are part of the product by design. This test should not even exist.

3. Dynamic contrast: I can't help it, but the example scene with the cats had blown highlights on my laptop's LCD in the "correct" part. How can you judge that if the constraint is the display device and not the GPU itself? After all, you can output on a 6-bit LCD or on a 10-bit LCD; the card does not have to know that...
    Reply
  • mitch074
    "obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies, and that it's frustrating to those few people who watch movies outside of the Pacific zone... As in, Europe, Africa, and parts of Asia up to and including mainland China.

    It's only worth more than half the world population, after all.
    Reply
  • cleeve
mitch074 said:
"Obscure" cadence detection? Oh, of course... Never mind that a few countries do use PAL and its 50Hz cadence on movies...
    You misunderstand the text, I think.

To clear it up: I wasn't talking about 2:2 when I said that; I was talking about the Multi-Cadence Tests (8 FPS animation, etc.).
    Reply