
AMD Video Quality Driver Settings

Video Quality Tested: GeForce Vs. Radeon In HQV 2.0

With the Catalyst 10.12 driver, AMD has revamped the look of its driver settings interface, eliminating the last vestiges of ATI branding. It’s more or less functionally identical when you dig down into the actual options, though.

To avoid wasting time navigating sub-menus, the first thing to do is select Preferences and switch to the Advanced view. After that, selecting Video and then Video Settings from the left-side menu delivers all of the relevant options in a single window. The first section is Basic Video Color. This doesn’t have much to do with video quality; it’s geared more toward setting the color and saturation to taste. We’ll leave the “Use application settings” checkbox enabled here.

Next is the Advanced Video Color section. Some of these settings, such as “Color vibrance” and “Video gamma,” can be set to taste and aren’t relevant to the benchmark. Flesh-tone correction is benchmarked though, and since this option removes excess reds from flesh tones, we’ll leave it enabled at the default value. The “Brighter whites” option increases the blue value of video for brighter shades of white, but we’ll disable this option because we find it can cause overexposure problems when combined with the dynamic contrast feature.
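
AMD doesn’t document how its flesh-tone correction works internally, so purely as a sketch of the concept, here is a toy Python function (the name, thresholds, and logic are our invention, not AMD’s algorithm) that detects a crude skin-tone range and pulls excess red back toward neutral:

    def flesh_tone_correction(r, g, b, strength=0.5):
        """Toy illustration of flesh-tone correction (not AMD's algorithm).

        If a pixel falls in a rough skin-tone range (red dominant, green in
        between, blue lowest), reduce the excess red toward the green level.
        """
        if r > g > b and (r - b) > 20:       # crude skin-tone test
            r -= (r - g) * strength          # pull back the excess red
        return round(r), g, b

    # A reddish skin tone (220, 180, 150) becomes (200, 180, 150);
    # a blue sky pixel (90, 140, 220) is left untouched.
    print(flesh_tone_correction(220, 180, 150))
    print(flesh_tone_correction(90, 140, 220))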

[EDIT: We’ve set “Dynamic range” to “Limited” to keep things on an even keel with the GeForce cards that default to this value when other enhancements are enabled. Dynamic range doesn’t appear to have an impact on HQV scoring, but the Full setting does provide a wider range of brightness between white and black if it's appropriate for your display. Most televisions expect a 16-235 signal range, while monitors expect a 0-255 range. Choosing an inappropriate range can cause a loss of contrast, but we didn't experience any contrast issues that would affect scoring during our tests.]
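To make the two ranges concrete, here is a minimal Python sketch of the standard studio-range scaling (the function names are ours; nothing here is exposed by the Catalyst driver):

    def full_to_limited(y):
        """Map a full-range (0-255) luma value into studio range (16-235),
        the scaling applied when 'Dynamic range' is set to Limited for a
        television that expects a 16-235 signal."""
        return round(16 + y * 219 / 255)

    def limited_to_full(y):
        """Expand a studio-range (16-235) luma value back to 0-255. Values
        outside 16-235 get clipped, which is where an inappropriate range
        choice can cost you contrast."""
        return min(255, max(0, round((y - 16) * 255 / 219)))

    # Full-range black (0) and white (255) land on 16 and 235:
    assert full_to_limited(0) == 16 and full_to_limited(255) == 235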

The Basic Video Quality section is just below. We want to make sure that “Use automatic deinterlacing” and “Pulldown detection” are both enabled. Automatic deinterlacing should smooth out the jaggies we’d otherwise see between fields of video playback, while pulldown detection recognizes film content that has been telecined into interlaced video (for example, 24 FPS film carried in a 60 Hz signal using a 3:2 cadence) so the original progressive frames can be reconstructed for smooth playback.
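
As a rough illustration of what a cadence detector looks for, the following Python sketch telecines four film frames using the common 3:2 pattern and then recovers that cadence from the repeated fields. Real telecine alternates top and bottom fields rather than repeating whole frames, and none of this reflects AMD’s actual implementation; the repeated-field rhythm is the idea being shown:

    def telecine_32(frames):
        """Emit a 3:2 pulldown field sequence: frames alternately contribute
        three fields and two fields, turning 24 FPS film into 60 fields/s."""
        fields = []
        for i, frame in enumerate(frames):
            fields.extend([frame] * (3 if i % 2 == 0 else 2))
        return fields

    def has_32_cadence(fields):
        """True if the stream shows the 3-2-3-2 repeat pattern that marks
        telecined 24 FPS film, meaning progressive frames can be rebuilt."""
        runs, count = [], 1
        for prev, cur in zip(fields, fields[1:]):
            if cur == prev:
                count += 1
            else:
                runs.append(count)
                count = 1
        runs.append(count)
        return set(runs) == {2, 3}

    film = ["A", "B", "C", "D"]    # 4 film frames = 1/6 second at 24 FPS
    fields = telecine_32(film)     # 10 fields     = 1/6 second at 60 fields/s
    print(fields)                  # ['A','A','A','B','B','C','C','C','D','D']
    print(has_32_cadence(fields))  # True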

The Advanced Quality section contains a lot of the more powerful toys. “Edge enhancement” can actually take blurry or lower-resolution video and sharpen the detail. “De-noise” is the next setting; it removes random noise from images and can make lower-quality recordings appear clearer by taking out some of the gritty imperfections. “Mosquito Noise Reduction” and “De-blocking” are optimizations designed to remove imperfections in highly compressed video: the former reduces the look of blotchy edges, and the latter smooths images and decreases unwanted edges. Finally, the “Enable dynamic contrast” option automatically adjusts gamma and contrast to enhance clarity and color vibrance in scenes that are overly bright or dim.
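
AMD doesn’t publish how these filters work, but a simple per-frame histogram stretch captures the basic idea behind dynamic contrast. The following Python sketch is purely illustrative; the function and its percentile thresholds are our invention:

    def dynamic_contrast(luma, low_pct=0.01, high_pct=0.99):
        """Toy per-frame contrast stretch: find the luma levels at the 1st
        and 99th percentiles and remap them to the full 0-255 range, so an
        overly dim or bright frame gets its histogram spread out."""
        values = sorted(luma)
        lo = values[int(low_pct * (len(values) - 1))]
        hi = values[int(high_pct * (len(values) - 1))]
        if hi <= lo:                     # flat frame: nothing to stretch
            return list(luma)
        scale = 255 / (hi - lo)
        return [min(255, max(0, round((y - lo) * scale))) for y in luma]

    # A murky frame occupying only levels 60-120 gets expanded to 0-255:
    frame = list(range(60, 121))
    stretched = dynamic_contrast(frame)
    print(min(stretched), max(stretched))   # 0 255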

The Video Playback section contains only two options. The first is “Enforce Smooth Video Playback.” Enabling it prevents dropped frames by disabling some of the processor-intensive options we’ve just covered. For today’s tests, we’ll disable this feature, as we want to make sure the graphics cards can handle the options we’ve set. For actual movie playback on lower-end hardware, though, it might be a good idea to leave it enabled. The second option is “Apply current settings to Internet video,” something you may very well want to enable if you want your YouTube clips to benefit from the quality enhancements we’re discussing.

The final section is called Video Demo Mode, and its only purpose is to allow the user to see the difference that these video enhancements can provide. While you can certainly play with the feature to see what the video hardware can do, it’s not something you need to enable during regular playback.

Now let's move on to the settings we used with each Radeon graphics card we tested. As mentioned previously, low-end cards may have all of the options available, but they can’t handle smooth playback with the complete suite enabled. We’ve tested the cards at the settings we list below to ensure that they can manage 1080p playback with the best quality possible.

Radeon HD 5450: Video Quality Settings As Tested
Color vibrance: 40
Flesh tone correction: 50
Video gamma: Disabled
Brighter whites: Disabled
Dynamic range: Limited
Use automatic deinterlacing: Enabled
Pulldown detection: Enabled
Edge enhancement: Disabled
De-noise: 80
Mosquito noise reduction: N/A
De-blocking: N/A
Dynamic contrast: Disabled
Enforce smooth playback: Disabled

While the Radeon HD 5450 exposes the “Edge enhancement” and “Enable dynamic contrast” options, we find that it isn’t able to provide smooth playback with them turned on. Fortunately, the card can handle a de-noise value of 80, which helps smooth out regular noise and even some compression artifacts, despite the card’s lack of “Mosquito Noise Reduction” and “De-blocking” settings.

Radeon HD 5550: Video Quality Settings As Tested
Color vibrance: 40
Flesh tone correction: 50
Video gamma: Disabled
Brighter whites: Disabled
Dynamic range: Limited
Use automatic deinterlacing: Enabled
Pulldown detection: Enabled
Edge enhancement: 35
De-noise: 80
Mosquito noise reduction: Disabled
De-blocking: Disabled
Dynamic contrast: Disabled
Enforce smooth playback: Disabled

The Radeon HD 5550 uses the same options as the 5450, but it can also handle “Edge enhancement,” an option we think works best when set to 35. Note that the 5550 exposes the “Mosquito Noise Reduction” and “De-blocking” enhancements in the driver, but our testing suggests the card is not fast enough to enable them without compromising playback.

Radeon HD 5670: Video Quality Settings As Tested
Color vibrance: 40
Flesh tone correction: 50
Video gamma: Disabled
Brighter whites: Disabled
Dynamic range: Limited
Use automatic deinterlacing: Enabled
Pulldown detection: Enabled
Edge enhancement: 35
De-noise: 80
Mosquito noise reduction: Disabled
De-blocking: 50
Dynamic contrast: Enabled
Enforce smooth playback: Disabled

The Radeon HD 5670 is powerful enough for us to enable “De-blocking” and “Dynamic contrast,” but “Mosquito Noise Reduction” proves too much for the card to handle. The de-blocking setting doesn’t seem to make a noticeable difference in reducing compression artifacts, but dynamic contrast helps the 5670 gain points in the benchmark.

Radeon HD 5750 and 6850: Video Quality Settings As Tested
Color vibrance: 40
Flesh tone correction: 50
Video gamma: Disabled
Brighter whites: Disabled
Dynamic range: Limited
Use automatic deinterlacing: Enabled
Pulldown detection: Enabled
Edge enhancement: 35
De-noise: 64
Mosquito noise reduction: 50
De-blocking: 50
Dynamic contrast: Enabled
Enforce smooth playback: Disabled

The Radeon HD 5750 is the lowest-priced Radeon that can handle all of the driver’s video enhancements without choking during playback. With “Mosquito Noise Reduction” enabled, we can reduce the basic “De-noise” value to 64 and still get the highest benchmark score possible for Radeon cards. Note that the Radeon HD 6850 uses the same settings and delivers the same HQV benchmark score.

Comments
  • rootheday, February 2, 2011 3:18 AM
    Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?
  • jimmysmitty, February 2, 2011 3:44 AM
    I second the test using SB HD graphics. It might be just an IGP, but I would like to see the quality in case I want to build an HTPC, and since SB has amazing encoding/decoding results compared to anything else out there (even $500+ GPUs), it would be nice to see if it can give decent picture quality.

    But as for the results, I am not that surprised. Even when their GPUs might not perform the same as nVidia, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that nVidia might not see as important when it is. I want my Blu-rays and DVDs to look great, not just OK.
  • compton, February 2, 2011 4:10 AM
    Great article. I had wondered what the testing criteria were about, and Lo! Tom's to the rescue. I have 4 primary devices that I use to watch Netflix's streaming service. Each is radically different in terms of hardware. They all look pretty good. But they all work differently. Using my 47" LG LED TV, I did an informal comparison of each.

    My desktop, which uses a 460, really suffers from the lack of noise reduction options.
    My Samsung BD player looks less spectacular than the others.
    My Xbox looks a little better than the BD player.
    My PS3 actually looks the best to me, no matter what display I use.

    I'm not sure why, but it's the only one I could pick out just based on its image quality. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but doesn't come close. I don't know how the PS3 stacks up, but I'm thinking about giving the test suite a spin.

    Thanks for the awesome article.
  • cleeve, February 2, 2011 4:17 AM
    Quote (rootheday): Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?

    That's definitely on our to-do list!

    Trying to organize that one now.
  • lucuis, February 2, 2011 4:58 AM
    Too bad this stuff usually makes things look worse. I tried out the full array of settings on my GTX 470 in multiple BD Rips of varying quality, most very good.

    Noise reduction did next to nothing, and in many cases caused blockiness.

    Dynamic Contrast in many cases does make things look better, but in some it revealed tons of noise in the greyscale which the noise reduction doesn't remove...not even a little.

    Color correction seemed to make anything blueish bluer, even purples.

    Edge correction seems to sharpen some details, but introduces noise after about 20%.

    All in all, bunch of worthless settings.
  • killerclick, February 2, 2011 5:13 AM
    Quote (jimmysmitty): Even when their GPUs might not perform the same as nVidia, ATI has always had great image quality enhancements


    ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
  • alidan, February 2, 2011 5:34 AM
    Quote (killerclick): ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.


    i thought they were losing, not by enough to call it a loss, but not as good as the latest nvidia refreshes. but i got a 5770 due to its power consumption, i didn't have to swap out my psu to put it in and that was the deciding factor for me.
  • haplo602, February 2, 2011 5:40 AM
    this made me lol ...

    1. cadence tests ... why do you marginalise the 2:2 cadence? these cards are not US exclusive. The rest of the world has the same requirements for picture quality.

    2. skin tone correction: I see this as an error on the part of the card to even include this. why are you correcting something that the video creator wanted to be as it is? I mean the movie is checked by video professionals for anything they don't want there. not completely correct skin tones are part of the product by design. this test should not even exist.

    3. dynamic contrast: cannot help it, but the example scene with the cats had blown highlights on my laptop LCD in the "correct" part. how can you judge that if the constraint is the display device and not the GPU itself? after all you can output on a 6-bit LCD or on a 10-bit LCD. the card does not have to know that ...
  • mitch074, February 2, 2011 5:43 AM
    "obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies, and that it's frustrating to those few people who watch movies outside of the Pacific zone... As in, Europe, Africa, and parts of Asia up to and including mainland China.

    That's only more than half the world's population, after all.
  • cleeve, February 2, 2011 5:54 AM
    mitch074"obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies...


    You misunderstand the text, I think.

    To clear it up: I wasn't talking about 2:2 when I said that, I was talking about the Multi-Cadence Tests: 8 FPS animation, etc.
  • caeden, February 2, 2011 5:58 AM
    Now I am not sure how the Radeons stack up, but using these settings on my 9800GT made my video quality noticeably worse than normal, especially around text (white over a black background was painful), where noise was added. After messing with it a little, I found that using the video defaults under the image settings was best. Checking the dynamic contrast option under color settings helped the contrast a little, but not much, and I think it is adding noise, so I am going back to defaults on this as well.

    More important than your driver would be calibrating your monitor. PC monitors are very different from HDTVs (higher brightness and lower contrast, square pixels to optimize text over video, and most of us have TN screens, which are very limited in color range), but in spite of their shortcomings, if properly adjusted they can still look quite good.

    A note to those who are putting your DVDs on your computer: there is a very powerful program called AVS which can do much of this cleanup for you during the ripping process. This allows for smaller file sizes with good compressors (h264 for storage, Lagarith for editing), as well as making upscaling less of a problem. It is not a particularly intuitive tool, but there are many good tutorials out there on it. This is especially handy with old DVDs that were not encoded well in the first place, or super long DVDs (Lord of the Rings) where the content was encoded poorly in order to fit it on the disc.
    As for Blu-ray content, the quality you get will depend mostly on the software you are using. The driver options (at least for my card) hinder more than help.
    For HD content in general you will notice a huge difference between that nicely encoded music video you download vs. Netflix HD streaming. Partly this is because Netflix is (for the most part) only 720p, and secondly because they have an automated process which will work better for some movies than others. It would seem their biggest flaw is color banding. This is still worlds better than cable and other web services, but you are just not going to get that BD quality.
  • cleeve, February 2, 2011 6:03 AM
    Quote (haplo602): cadence tests ... why do you marginalise the 2:2 cadence? these cards are not US exclusive. The rest of the world has the same requirements for picture quality.


    True enough, although I'm a US writer primarily tasked to write for the US audience. I do think readers from PAL countries can understand the implications, however.

    Quote (haplo602): 2. skin tone correction: I see this as an error on the part of the card to even include this. why are you correcting something that the video creator wanted to be as it is? I mean the movie is checked by video professionals for anything they don't want there. not completely correct skin tones are part of the product by design. this test should not even exist.


    I understand the argument, and frankly the fact that the user is free to turn it off is good enough for me. Regardless of whether or not we agree with its inclusion, it is something that's included in some high-end video processors and is therefore a point of comparison.

    Quote (haplo602): 3. dynamic contrast: cannot help it, but the example scene with the cats had blown highlights on my laptop LCD in the "correct" part. how can you judge that if the constraint is the display device and not the GPU itself? after all you can output on a 6-bit LCD or on a 10-bit LCD. the card does not have to know that ...


    For me, it was sufficient to use a good-quality display for testing the different graphics hardware. Frankly, the HQV test can be used for displays, too. But with our test display capable of differentiating contrast enhancement without overexposure, I think that's the best way to test the video processor. Whether other folks have worse displays is kind of outside the scope of the test.
  • spac18, February 2, 2011 6:42 AM
    @shinobi
    where did you get that stat from? your all american wet dream?
  • intelx, February 2, 2011 6:47 AM
    Quote (shin0bi272): but 85% of the world's computers are in the US and so your countries don't really matter.


    can you prove your statement?
  • andrewcutter, February 2, 2011 7:45 AM
    Quote (shin0bi272): but 85% of the world's computers are in the US and so your countries don't really matter.


    why am i not surprised by this statement. i hope you are not representative of your whole country
  • aldaia, February 2, 2011 8:46 AM
    Quote (shin0bi272): but 85% of the world's computers are in the US and so your countries don't really matter.


    I'm not sure if he is a troll or simply a dumb ass (probably both). The opposite is actually closer to reality; that is, 85% of personal computers are outside of the US. In 2004-2005, only 28.4% of the world's computers were in the US. Since then, countries like China or India have grown significantly faster than western countries, where the market was already saturated 5 years ago. By the way, the countries with the most computers per capita are Switzerland (864,584 per 1 million people), San Marino (857,143 per 1 million people), and Sweden (763,012 per 1 million people).
  • haplo602, February 2, 2011 9:32 AM
    Quote (Cleeve): True enough, although I'm a US writer primarily tasked to write for the US audience. I do think readers from PAL countries can understand the implications, however.


    I do not dispute the results as they speak clearly. I do have a problem with you downplaying an Nvidia disadvantage just because you are a US resident. The tone of the comment was not neutral (i.e. just to highlight that one cadence is used for 60Hz and the other for 50Hz).

    Nvidia does not issue non-US driver versions or non-US cards, so they should be blamed for the lack of features in this case.

    I just want to make you aware that some of your comments might not come across as correct to non-US residents. After all, Internet knows no borders :-)
  • marraco, February 2, 2011 9:52 AM
    It's useless without a video player capable of applying all those enhancements to different video formats.
  • hardcore_gamer, February 2, 2011 10:24 AM
    Quote (Cleeve): That's definitely on our to-do list! Trying to organize that one now.

    Please add GTX 5xx-series cards too.