Video Quality Tested: GeForce Vs. Radeon In HQV 2.0

Nvidia Video Quality Driver Settings

The Nvidia control panel features the same solid interface that it has had for some time now, and that’s not a bad thing. Under the “Select a Task” menu on the left side, we find the “Video” section with two tasks below it that contain all of the settings we need: “Adjust video color settings” and “Adjust video image settings.”

With the “Adjust video color settings” task selected, there are two radio buttons on the right-hand pane under the question “How do you make your video color adjustments?” The two choices are “With the video player settings” and “With the Nvidia settings.” To control the enhancements manually, we select the second option.

Now we can navigate the three tabs in this section: “Color,” “Gamma,” and “Advanced.” The color and gamma tabs are there to set the output to taste, but they aren’t relevant for our testing. The advanced tab contains all of the options we need, such as “Dynamic range,” “Dynamic contrast enhancement,” and “Color enhancement.”

[EDIT: We’ve left “Dynamic range” to the “Limited” default. Dynamic range doesn’t appear to have an impact on HQV scoring, but the Full setting does provide a wider range of brightness between white and black if it's appropriate for your display. Most televisions expect a 16-235 signal range, while monitors expect a 0-255 range. Choosing an inappropriate range can cause a loss of contrast, but we didn't experience any contrast issues that would affect scoring during our tests.]
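For readers who want to check the math, the mapping between the two ranges is simple linear scaling. The short Python sketch below is our own illustration (the function names are ours, not part of any driver API) of how an 8-bit luma value converts between the 16-235 and 0-255 ranges:

```python
def limited_to_full(y):
    """Expand a 16-235 limited-range luma value to the 0-255 full range."""
    y = min(max(y, 16), 235)            # clamp to the nominal video range
    return round((y - 16) * 255 / 219)

def full_to_limited(y):
    """Compress a 0-255 full-range luma value into the 16-235 video range."""
    y = min(max(y, 0), 255)
    return round(16 + y * 219 / 255)

print(limited_to_full(128))  # 130 -- mid-grey shifts slightly when expanded
print(full_to_limited(255))  # 235 -- full-range white maps to video white
```

If the display expects one range and the card sends the other, blacks either look washed out or shadow and highlight detail gets clipped, which is the loss of contrast mentioned above.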

The dynamic contrast enhancement option dynamically adjusts the brightness in video to provide an optimal contrast ratio on a scene-by-scene basis. Color enhancement adjusts blue, green, and skin tones to provide a more vibrant picture. We’ll enable these options if the GeForce card we’re using is powerful enough to handle them. See our individual card configurations below for details.
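To give a rough sense of what a dynamic contrast pass does, here is a minimal Python/NumPy sketch of a generic per-frame histogram stretch. It is not Nvidia's actual (undocumented) algorithm, and the percentile limits are arbitrary assumptions:

```python
import numpy as np

def stretch_contrast(frame, low_pct=1.0, high_pct=99.0):
    """Generic dynamic contrast: stretch each frame's luma histogram so the
    darkest and brightest percentiles land at the ends of the 8-bit range."""
    lo, hi = np.percentile(frame, [low_pct, high_pct])
    if hi <= lo:                          # flat frame, nothing to stretch
        return frame.copy()
    out = (frame.astype(np.float32) - lo) * 255.0 / (hi - lo)
    return np.clip(out, 0, 255).astype(np.uint8)

# A low-contrast frame confined to luma values 80-170...
frame = np.random.randint(80, 170, size=(1080, 1920), dtype=np.uint8)
print(frame.min(), frame.max())                          # roughly 80, 169
# ...gets spread across the full range, scene by scene.
print(stretch_contrast(frame).min(), stretch_contrast(frame).max())  # 0, 255
```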

With the “Adjust video color settings” task complete, let’s move on to the “Adjust video image settings” task. There are three options here: “Edge enhancement,” “Noise reduction,” and “De-interlacing.” Edge enhancement sharpens detail along edges, while noise reduction filters out artifacts and speckles in the video. De-interlacing can provide accurate movie playback and superior image quality by optimizing source video captured at different frame rates. Edge enhancement and noise reduction can be set to “Use the Nvidia setting” on the cards we’re testing today, and the “Use inverse telecine” option under de-interlacing should always be checked.
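For context on what inverse telecine is undoing: 24 FPS film is typically converted to 60 interlaced fields per second by repeating fields in a 3:2 pattern, and the driver has to recognize that cadence and reassemble the original progressive frames. The Python sketch below only illustrates the field-to-frame bookkeeping (fields are treated as whole-frame labels and cadence detection itself is skipped); it is not how the GeForce driver is actually implemented.

```python
def telecine_32(film_frames):
    """3:2 pulldown: alternately emit 3 fields, then 2 fields, per film frame,
    turning 4 film frames into 10 video fields (24 fps -> 60 fields/s)."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

def inverse_telecine(fields):
    """Undo 3:2 pulldown by collapsing the repeated fields back into the
    original film frames (a real driver must first detect the cadence)."""
    frames, i = [], 0
    while i < len(fields):
        frames.append(fields[i])
        i += 3 if len(frames) % 2 == 1 else 2   # consume 3, then 2, repeating
    return frames

film = ['A', 'B', 'C', 'D']            # four film frames
fields = telecine_32(film)             # ['A','A','A','B','B','C','C','C','D','D']
print(inverse_telecine(fields) == film)  # True
```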

With the relevant options detailed, let’s go over the settings we're using with the GeForce cards included in our comparison. As mentioned previously, low-end cards may have all of the options available, but might not be able to handle smooth playback with them enabled.

GeForce 210: Video Quality Settings As Tested
Use inverse telecine: Enabled
Edge enhancement: 60%
Noise reduction: 70%
Dynamic range: Disabled
Dynamic contrast enhancement: Disabled
Color enhancement: Disabled

In our testing, the GeForce 210 was able to run smoothly with all of the “Adjust video image settings” features enabled, but it could not handle any of the “Adjust video color settings” options without dropping frames during playback. The good news is that the card can handle edge enhancement at 60% and noise reduction at 70%, which are optimal values for the entire GeForce range, according to our observations.

All Other GeForce Cards (GT 430, GT 240, 9800 GT, GTX 460, GTX 470): Video Quality Settings As Tested
Use inverse telecine: Enabled
Edge enhancement: 60%
Noise reduction: 70%
Dynamic range: Limited
Dynamic contrast enhancement: Enabled
Color enhancement: Enabled

The rest of the GeForce range is able to handle smooth 1080p playback with all options enabled, including “Dynamic contrast enhancement” and “Color enhancement.” Dynamic range can be left at the “Limited” setting, as changing this value doesn’t seem to affect the HQV test results.

  • rootheday
    Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?
  • jimmysmitty
    I second the test using SB HD graphics. It might just be an IGP, but I would like to see the quality in case I want to build an HTPC, and since SB has amazing encoding/decoding results compared to anything else out there (even $500+ GPUs), it would be nice to see if it can also deliver decent picture quality.

    But as for the results, I am not that surprised. Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that nVidia might not see as important when it is. I want my Blu-rays and DVDs to look great, not just OK.
  • compton
    Great article. I had wondered what the testing criteria was about, and Lo! Tom's to the rescue. I have 4 primary devices that I use to watch Netflix's streaming service. Each is radically different in terms of hardware. They all look pretty good. But they all work differently. Using my 47" LG LED TV I did an informal comparison of each.

    My desktop, which uses a 460, really suffers from the lack of noise reduction options.
    My Samsung BD player looks less spectacular than the others.
    My Xbox looks a little better than the BD player.
    My PS3 actually looks the best to me, no matter what display I use.

    I'm not sure why, but it's the only one I could pick out just based on its image quality. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but doesn't come close. I don't know how the PS3 stacks up, but I'm thinking about giving the test suite a spin.

    Thanks for the awesome article.
  • cleeve
    rootheday said:
    Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?

    That's definitely on our to-do list!

    Trying to organize that one now.
  • lucuis
    Too bad this stuff usually makes things look worse. I tried out the full array of settings on my GTX 470 with multiple BD rips of varying quality, most of them very good.

    Noise reduction did next to nothing, and in many cases it caused blockiness.

    Dynamic contrast does make things look better in many cases, but in some it revealed tons of noise in the greyscale which the noise reduction doesn't remove... not even a little.

    Color correction seemed to make anything bluish bluer, even purples.

    Edge correction seems to sharpen some details, but introduces noise after about 20%.

    All in all, a bunch of worthless settings.
  • killerclick
    jimmysmitty said: Even when their GPUs might not perform the same as nVidia, ATI has always had great image quality enhancements
    ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
  • alidan
    killerclick said: ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
    I thought they were losing, not by enough to call it a loss, but not as good as the latest nVidia refreshes. But I got a 5770 due to its power consumption; I didn't have to swap out my PSU to put it in, and that was the deciding factor for me.
  • haplo602
    this made me lol ...

    1. Cadence tests... why do you marginalise the 2:2 cadence? These cards are not US-exclusive. The rest of the world has the same requirements for picture quality.

    2. Skin tone correction: I see it as an error on the part of the card to even include this. Why are you correcting something that the video creator wanted to be as it is? I mean, the movie is checked by video professionals for anything they don't want there. Not-quite-correct skin tones are part of the product by design. This test should not even exist.

    3. Dynamic contrast: I can't help it, but the example scene with the cats had blown highlights on my laptop LCD in the "correct" part. How can you judge that if the constraint is the display device and not the GPU itself? After all, you can output on a 6-bit LCD or on a 10-bit LCD; the card does not have to know that...
  • mitch074
    "obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies, and that it's frustrating to those few people who watch movies outside of the Pacific zone... As in, Europe, Africa, and parts of Asia up to and including mainland China.

    It's only worth more than half the world population, after all.
  • cleeve
    mitch074 said: "Obscure" cadence detection? Oh, of course... Never mind that a few countries do use PAL and its 50Hz cadence on movies...
    You misunderstand the text, I think.

    To clear it up: I wasn't talking about 2:2 when I said that; I was talking about the Multi-Cadence tests: 8 FPS animation, etc.