Video Quality Tested: GeForce Vs. Radeon In HQV 2.0
- Page 1: How Do We Scrutinize High-Definition Video Quality?
- Page 2: AMD Video Quality Driver Settings
- Page 3: Nvidia Video Quality Driver Settings
- Page 4: Test Setup And Benchmarks
- Page 5: Test Class 1: Video Conversion
- Page 6: Test Class 1: Video Conversion, Cont’d.
- Page 7: Test Class 2: Noise And Artifact Reduction
- Page 8: Test Class 3: Image Scaling And Enhancements
- Page 9: Test Class 4: Adaptive Processing
- Page 10: Conclusion
Test Class 4: Adaptive Processing
The adaptive processing tests reveal the video processor’s ability to optimize contrast and color correction.
Chapter 1: Contrast Enhancement Tests
This chapter consists of four tests, each built around a different video clip that presents its own challenge for contrast enhancement. The first is a theme park during the day, the second is an overcast beach scene with driftwood, the third is a tropical beach at dusk, and the final scene shows black and white cats lying beside each other. A perfect score of five is awarded for each scene in which contrast is expanded without any loss of detail in the darkest and lightest regions. The score drops to two if there is slight loss of detail in those regions, and to zero if the loss of detail is moderate or severe.
This is one of the easier tests to score, since it’s not hard to tell whether contrast has improved without a loss of detail. Dynamic contrast turns out to be somewhat resource-intensive, and the low-end graphics hardware can’t handle the feature without choppy playback. But every card able to enable the dynamic contrast option earns a perfect five points in each test. Note that we have to disable the brighter whites option on the Radeon cards so that the lightest regions don’t overexpose and lose detail.
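For readers curious what a dynamic contrast pass is actually doing, the sketch below stretches a frame’s luma histogram out to the full range, which is the basic idea behind the feature. This is our own simplified illustration (the percentile thresholds and the hard clipping are assumptions for the demo), not AMD’s or Nvidia’s actual algorithm; a real video processor adapts its limits from frame to frame and rolls off gently near black and white, which is exactly what separates a five from a two in this test.

```python
import numpy as np

def dynamic_contrast(luma, low_pct=1.0, high_pct=99.0):
    """Stretch one frame's luma (2D uint8 array) to the full 0-255 range.

    The percentile cut-offs decide how aggressively shadows and highlights
    are clipped; everything outside them gets crushed to 0 or 255, which is
    precisely the "loss of detail" the HQV rubric penalizes.
    """
    lo = np.percentile(luma, low_pct)
    hi = np.percentile(luma, high_pct)
    if hi <= lo:                 # essentially flat frame; nothing to stretch
        return luma.copy()
    stretched = (luma.astype(np.float32) - lo) * 255.0 / (hi - lo)
    return np.clip(stretched, 0, 255).astype(np.uint8)
```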
Contrast Enhancement Test Results (out of 5)

| Test | Radeon HD 6850 | Radeon HD 5750 | Radeon HD 5670 | Radeon HD 5550 | Radeon HD 5450 |
|---|---|---|---|---|---|
| Scrolling Text | 5 | 5 | 5 | 0 | 0 |
| Roller Coaster | 5 | 5 | 5 | 0 | 0 |
| Ferris Wheel | 5 | 5 | 5 | 0 | 0 |
| Bridge Traffic | 5 | 5 | 5 | 0 | 0 |

| Test | GeForce GTX 470 | GeForce GTX 460 | GeForce 9800 GT | GeForce GT 240 | GeForce GT 430 | GeForce 210 |
|---|---|---|---|---|---|---|
| Scrolling Text | 5 | 5 | 5 | 5 | 5 | 0 |
| Roller Coaster | 5 | 5 | 5 | 5 | 5 | 0 |
| Ferris Wheel | 5 | 5 | 5 | 5 | 5 | 0 |
| Bridge Traffic | 5 | 5 | 5 | 5 | 5 | 0 |
Chapter 2: Skin Tone Correction Tests
Human beings tend to be sensitive to unrealistic skin tones, so this test examines the video processor’s ability to detect and correct skin tones that look wrong. The test clip shows a group of people with various skin colors, and the skin tone hues are shifted away from their true colors over time. The maximum 10 points are awarded if off-hue skin tones are corrected to appear substantially closer to the original without affecting other colors in the scene. That drops to seven points if the skin tones are somewhat corrected but problems in hue are still discernible, and to three points if the skin tones are somewhat corrected but non-skin colors are affected. If no improvement is observed, no points are awarded.
The Radeons have the advantage here, as theirs is the only driver with a dedicated flesh-tone correction setting. Even so, the test is admittedly difficult to judge. We could not detect any change on the GeForce cards when toggling the color enhancement feature during our assessment.
[EDIT: We've received some comments about skin tone correction pointing out that it can potentially interfere with how the director intended colors to look. That's certainly a valid argument, and some of us will prefer to leave this setting off. However, it remains a valid comparative test, since some video processors are equipped with this feature.]
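For those wondering what a flesh-tone correction actually does under the hood, the rough sketch below captures the general idea: pixels whose hue falls inside a "skin" window get nudged toward a reference hue, while everything else passes through untouched (which is what the HQV rubric rewards). The hue window, target hue, and HSV-based approach here are our own illustrative assumptions, not AMD's actual implementation.

```python
import colorsys
import numpy as np

# Hue window (degrees) treated as "skin" and the reference hue we pull those
# pixels toward. These values are illustrative guesses for the demo, not the
# thresholds any real video processor uses.
SKIN_HUE_MIN, SKIN_HUE_MAX = 5.0, 50.0
TARGET_HUE = 25.0
STRENGTH = 0.5   # 0 = off, 1 = snap fully to the target hue

def correct_skin_tones(frame):
    """Nudge skin-toned pixels toward a reference hue; leave the rest alone.

    frame: H x W x 3 uint8 RGB array. Returns a corrected copy.
    (A per-pixel Python loop is fine for a demo frame, far too slow for video.)
    """
    out = frame.copy()
    height, width, _ = frame.shape
    for y in range(height):
        for x in range(width):
            r, g, b = frame[y, x] / 255.0
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            hue_deg = h * 360.0
            # Only touch moderately saturated pixels inside the skin window,
            # so other colors in the scene are left unaffected.
            if SKIN_HUE_MIN <= hue_deg <= SKIN_HUE_MAX and 0.1 < s < 0.7:
                hue_deg += (TARGET_HUE - hue_deg) * STRENGTH
                r2, g2, b2 = colorsys.hsv_to_rgb(hue_deg / 360.0, s, v)
                out[y, x] = [int(r2 * 255), int(g2 * 255), int(b2 * 255)]
    return out
```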
Skin Tone Correction Test Results (out of 10)

| Radeon HD 6850 | Radeon HD 5750 | Radeon HD 5670 | Radeon HD 5550 | Radeon HD 5450 |
|---|---|---|---|---|
| 7 | 7 | 7 | 7 | 7 |

| GeForce GTX 470 | GeForce GTX 460 | GeForce 9800 GT | GeForce GT 240 | GeForce GT 430 | GeForce 210 |
|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 0 |
But as for the results, I am not that surprised. Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that nVidia might not see as important when it is. I want my Blu-rays and DVDs to look great, not just OK.
My desktop, which uses a 460, really suffers from the lack of noise reduction options.
My Samsung BD player looks less spectacular than the others.
My Xbox looks a little better than the BD player.
My PS3 actually looks the best to me, no matter what display I use.
I'm not sure why, but it's the only one I could pick out just based on its image quality. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but it doesn't come close. I don't know how the PS3 stacks up, but I'm thinking about giving the test suite a spin.
Thanks for the awesome article.
That's definitely on our to-do list!
Trying to organize that one now.
Noise reduction did next to nothing, and in many cases it caused blockiness.
Dynamic contrast in many cases does make things look better, but in some it reveals tons of noise in the greyscale that the noise reduction doesn't remove... not even a little.
Color correction seemed to make anything bluish bluer, even purples.
Edge correction seems to sharpen some details, but introduces noise after about 20%.
All in all, a bunch of worthless settings.
ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
i thought they were losing; not by enough to call it a loss, but not as good as the latest nvidia refreshes. but i got a 5770 due to its power consumption; i didn't have to swap out my psu to put it in, and that was the deciding factor for me.
1. Cadence tests: why do you marginalise the 2:2 cadence? These cards are not US-exclusive. The rest of the world has the same requirements for picture quality.
2. Skin tone correction: I see it as an error on the card's part to even include this. Why correct something the video creator wanted to look that way? The movie is checked by video professionals for anything they don't want there; not-quite-correct skin tones are part of the product by design. This test should not even exist.
3. Dynamic contrast: I can't help it, but the example scene with the cats had blown highlights on my laptop LCD in the "correct" part. How can you judge that if the constraint is the display device and not the GPU itself? After all, you can output to a 6-bit LCD or a 10-bit LCD; the card does not have to know that...
It's only worth more than half the world's population, after all.
You misunderstand the text, I think.
To clear it up: I wasn't talking about 2:2 when I said that; I was talking about the Multi-Cadence Tests: 8 FPS animation, and so on.
More important than your driver would be calibrating your monitor. PC monitors are very different from HDTVs (higher brightness and lower contrast, square pixels to optimize text over video, and most of us have TN panels, which are very limited in color range), but in spite of their shortcomings, properly adjusted they can still look quite good.
A note to those who are putting their DVDs on the computer: there is a very powerful program called AVS that can do much of this cleanup for you during the ripping process. It allows for smaller file sizes with good compressors (H.264 for storage, Lagarith for editing), as well as making upscaling less of a problem. It is not a particularly intuitive tool, but there are many good tutorials out there for it. It is especially handy with old DVDs that were not encoded well in the first place, or with very long DVDs (Lord of the Rings) that were encoded poorly in order to fit the content on the disc.
As for Blu-ray content, the quality you get will depend mostly on the playback software you are using. The driver options (at least for my card) hinder more than help, so it really comes down to the player software.
For HD content in general, you will notice a huge difference between a nicely encoded music video you download and Netflix HD streaming. Part of this is because Netflix is (for the most part) only 720p, and part is because they use an automated process that works better for some movies than others. Their biggest flaw seems to be color banding. It is still worlds better than cable and other web services, but you are just not going to get BD quality.
True enough, although I'm a US writer primarily tasked to write for the US audience. I do think readers from PAL countries can understand the implications, however.
I understand the argument, and frankly, the fact that the user is free to turn it off is good enough for me. Regardless of whether or not we agree with its inclusion, it is something that's included in some high-end video processors and is therefore a point of comparison.
For me it was sufficient to use a good-quality display for testing the different graphics hardware. Frankly, the HQV test can be used for displays, too. But since our test display can show contrast enhancement without overexposing, I think that's the best way to test the video processor. Whether other folks have worse displays is kind of outside the scope of the test.
but 85% of the world's computers are in the US, and so your countries don't really matter.
where did you get that stat from? your all american wet dream?
can you prove your statement?
why am i not surprised by this statement. i hope you are not representative of your whole country
I'm not sure if he is a troll or simply a dumbass (probably both). The opposite is actually closer to reality: roughly 85% of personal computers are outside of the US. In 2004-2005, only 28.4% of the world's computers were in the US. Since then, countries like China and India have grown significantly faster than western countries, where the market was already saturated five years ago. By the way, the countries with the most computers per capita are Switzerland (864,584 per million people), San Marino (857,143 per million), and Sweden (763,012 per million).
I do not dispute the results, as they speak clearly. I do have a problem with you downplaying an Nvidia disadvantage just because you are a US resident. The tone of the comment was not neutral (that is, it didn't simply note that one cadence is used for 60 Hz content and the other for 50 Hz).
Nvidia does not issue non-US driver versions or non-US cards, so they should be blamed for the lack of features in this case.
I just want to make you aware that some of your comments might not come across as correct to non-US residents. After all, the Internet knows no borders :-)
Please add GTX 5xx-series cards, too.