Video Quality Tested: GeForce Vs. Radeon In HQV 2.0

Test Class 4: Adaptive Processing

The adaptive processing tests reveal the video processor’s ability to optimize contrast and correct color.

Chapter 1: Contrast Enhancement Tests

This chapter is composed of four tests, each built around a different video clip that presents its own challenge for contrast enhancement. The first shows a theme park during the day, the second an overcast beach scene with driftwood, the third a tropical beach at dusk, and the final scene black and white cats lying beside each other. A perfect score of five is awarded for each scene in which contrast is expanded without loss of detail in the darkest and lightest regions. The score drops to two if there is slight loss of detail in the lightest and darkest regions, and to zero if the loss of detail is moderate or severe.

This is one of the easier tests to score, because it isn’t hard to tell whether contrast has improved without a loss of detail. It turns out that dynamic contrast is somewhat resource-intensive, and the low-end graphics hardware can’t handle the feature without choppy playback. But every card that is able to enable the dynamic contrast option scores a perfect five points on each test. Note that we have to disable the brighter-whites option on Radeon cards so that the lightest regions do not overexpose and lose detail.
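As a rough illustration of what the rubric rewards (and not a representation of how either vendor’s driver actually implements the feature), here is a minimal per-frame dynamic contrast sketch in Python. The function name, percentile thresholds, and blend strength are our own illustrative assumptions; the idea is to stretch the frame’s measured luma range while keeping a nonzero slope at the extremes, so shadow and highlight detail is compressed rather than clipped away.

```python
import numpy as np

def dynamic_contrast(luma: np.ndarray, strength: float = 0.5,
                     low_pct: float = 1.0, high_pct: float = 99.0) -> np.ndarray:
    """Expand the contrast of an 8-bit luma plane for a single frame.

    A full linear stretch to the frame's near-black/near-white levels would
    crush everything outside that range, which HQV 2.0 penalizes. Blending
    the stretch with the original signal keeps some detail at the extremes.
    """
    lo = float(np.percentile(luma, low_pct))   # measured near-black level
    hi = float(np.percentile(luma, high_pct))  # measured near-white level
    if hi <= lo:                               # flat frame: nothing to expand
        return luma.copy()
    x = luma.astype(np.float32)
    # Full linear stretch of [lo, hi] onto [0, 255] (hard-clips the extremes)...
    stretched = np.clip((x - lo) / (hi - lo) * 255.0, 0.0, 255.0)
    # ...softened by blending with the original signal, so detail below lo and
    # above hi is compressed instead of flattened to pure black or white.
    return np.round((1.0 - strength) * x + strength * stretched).astype(np.uint8)
```

A hard stretch alone (strength = 1.0) would flatten everything outside the measured range, which is exactly the loss of detail that drops a card to two or zero points.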

Contrast Enhancement Test Results (out of 5)

| Scene | Radeon HD 6850 | Radeon HD 5750 | Radeon HD 5670 | Radeon HD 5550 | Radeon HD 5450 |
| --- | --- | --- | --- | --- | --- |
| Theme Park | 5 | 5 | 5 | 0 | 0 |
| Driftwood | 5 | 5 | 5 | 0 | 0 |
| Beach at Dusk | 5 | 5 | 5 | 0 | 0 |
| Black and White Cats | 5 | 5 | 5 | 0 | 0 |

| Scene | GeForce GTX 470 | GeForce GTX 460 | GeForce 9800 GT | GeForce GT 240 | GeForce GT 430 | GeForce 210 |
| --- | --- | --- | --- | --- | --- | --- |
| Theme Park | 5 | 5 | 5 | 5 | 5 | 0 |
| Driftwood | 5 | 5 | 5 | 5 | 5 | 0 |
| Beach at Dusk | 5 | 5 | 5 | 5 | 5 | 0 |
| Black and White Cats | 5 | 5 | 5 | 5 | 5 | 0 |

Chapter 2: Skin Tone Correction Tests

Human beings tend to be sensitive to unrealistic skin tones, so this test examines the video processor’s ability to detect and correct them. The test clip shows people with various skin colors, and the skin-tone hues are gradually shifted away from their true colors over time. The maximum 10 points are awarded if off-hue skin tones are corrected to appear substantially closer to the original skin tone without affecting other colors in the scene. This drops to seven points if the skin tones are corrected somewhat but problems in hue are still discernible, or to three points if the skin tones are somewhat corrected but non-skin colors are affected. If no improvement is observed, no points are awarded.
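Conceptually, this kind of correction identifies pixels whose hue falls within a narrow “skin” band and nudges only those hues back toward a reference tone, leaving everything else untouched. The sketch below illustrates the idea in HSV hue space; the band center, band width, strength, and function name are illustrative assumptions on our part, and real video processors typically operate on Cb/Cr chroma vectors in fixed-function hardware.

```python
import numpy as np

# Illustrative constants: typical skin tones cluster in a narrow hue band.
SKIN_HUE_CENTER = 25.0   # assumed reference skin hue, degrees on the HSV wheel
SKIN_HUE_BAND = 20.0     # hues within this angular distance count as skin

def correct_skin_tones(hue_deg: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Pull hues inside the skin band partway back toward the reference hue.

    `hue_deg` holds per-pixel hue in degrees [0, 360). Only pixels inside
    the band are moved, so non-skin colors stay put -- the distinction the
    HQV 2.0 rubric scores on (10 points versus 3 points).
    """
    # Signed angular distance from the reference, wrapped into [-180, 180).
    delta = (hue_deg - SKIN_HUE_CENTER + 180.0) % 360.0 - 180.0
    is_skin = np.abs(delta) < SKIN_HUE_BAND
    corrected = hue_deg.copy()
    corrected[is_skin] -= strength * delta[is_skin]  # rotate toward reference
    return corrected % 360.0
```

A processor that applied the rotation to every pixel, skin or not, would shift non-skin colors as well, which is the failure mode the rubric penalizes with a three.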

The Radeon cards have the advantage here, as theirs is the only driver with a dedicated flesh-tone setting. Even then, the test is admittedly difficult to judge. We could not detect any change when turning the GeForce color enhancement feature on during our assessment.

[EDIT: We want to mention here that we've received some comments noting that skin tone correction can potentially interfere with the way the director wants colors to look. This is certainly a valid argument, and there are those of us who will prefer to leave this setting off. However, this remains a valid comparative test, as some video processors are equipped with this feature and others are not.]

Skin Tone Correction Test Results (out of 10)

| Radeon HD 6850 | Radeon HD 5750 | Radeon HD 5670 | Radeon HD 5550 | Radeon HD 5450 |
| --- | --- | --- | --- | --- |
| 7 | 7 | 7 | 7 | 7 |

| GeForce GTX 470 | GeForce GTX 460 | GeForce 9800 GT | GeForce GT 240 | GeForce GT 430 | GeForce 210 |
| --- | --- | --- | --- | --- | --- |
| 0 | 0 | 0 | 0 | 0 | 0 |
  • rootheday
    Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?
  • jimmysmitty
    I second the test using SB HD graphics. It might be just an IGP, but I would like to see the quality in case I want to build an HTPC. Since SB has amazing encoding/decoding results compared to anything else out there (even $500+ GPUs), it would be nice to see if it can deliver decent picture quality as well.

    But as for the results, I am not that surprised. Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that nVidia might not see as important when it is. I want my Blu-rays and DVDs to look great, not just OK.
  • compton
    Great article. I had wondered what the testing criteria were about, and lo! Tom's to the rescue. I have four primary devices that I use to watch Netflix's streaming service. Each is radically different in terms of hardware. They all look pretty good, but they all work differently. Using my 47" LG LED TV, I did an informal comparison of each.

    My desktop, which uses a 460, really suffers from the lack of noise reduction options.
    My Samsung BD player looks less spectacular than the others.
    My Xbox looks a little better than the BD player.
    My PS3 actually looks the best to me, no matter what display I use.

    I'm not sure why, but it's the only one I could pick out just based on its image quality. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but doesn't come close. I don't know how the PS3 stacks up, but I'm thinking about giving the test suite a spin.

    Thanks for the awesome article.
  • cleeve
    rootheday said:
    Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?

    That's definitely on our to-do list!

    Trying to organize that one now.
  • lucuis
    Too bad this stuff usually makes things look worse. I tried out the full array of settings on my GTX 470 with multiple BD rips of varying quality, most of them very good.

    Noise reduction did next to nothing, and in many cases caused blockiness.

    Dynamic contrast does make things look better in many cases, but in some it revealed tons of noise in the greyscale that the noise reduction doesn't remove... not even a little.

    Color correction seemed to make anything bluish bluer, even purples.

    Edge correction seems to sharpen some details, but introduces noise beyond about 20%.

    All in all, a bunch of worthless settings.
  • killerclick
    jimmysmitty said: Even when their GPUs might not perform the same as nVidia, ATI has always had great image quality enhancements
    ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
  • alidan
    killerclick said: ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
    I thought they were losing; not by enough to call it a loss, but not as good as the latest nVidia refreshes. But I got a 5770 due to its power consumption; I didn't have to swap out my PSU to put it in, and that was the deciding factor for me.
  • haplo602
    this made me lol ...

    1. Cadence tests: why do you marginalise the 2:2 cadence? These cards are not US-exclusive; the rest of the world has the same requirements for picture quality.

    2. Skin tone correction: I see it as an error on the part of the card to even include this. Why correct something that the video's creator wanted to look the way it does? The movie is checked by video professionals for anything they don't want there; not-quite-correct skin tones are part of the product by design. This test should not even exist.

    3. Dynamic contrast: I can't help it, but the example scene with the cats had blown highlights on my laptop's LCD in the "correct" part. How can you judge that when the constraint is the display device and not the GPU itself? After all, you can output on a 6-bit LCD or on a 10-bit LCD; the card does not have to know that...
  • mitch074
    "obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies, and that it's frustrating to those few people who watch movies outside of the Pacific zone... As in, Europe, Africa, and parts of Asia up to and including mainland China.

    That's only more than half the world's population, after all.
  • cleeve
    mitch074"obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies...
    You misunderstand the text, I think.

    To clear it up: I wasn't talking about 2:2 when I said that. I was talking about the Multi-Cadence Tests: 8 FPS animation, etc.