How Do We Scrutinize High-Definition Video Quality?
We’ve used the HQV benchmark version 1.0 for a few years now, and have only recently adopted the newer 2.0 version for some of our reviews. The latest build is far more complex and demanding than the original benchmark, but the raw scores don’t mean much unless you understand how to interpret them. Because of this, we’re taking our readers through a step-by-step explanation of the HD HQV Benchmark, version 2.0.
We don’t just explain the significance of the individual tests; we also test a broad cross-section of relevant graphics cards available today. By the end of this article, you will understand the significance of the individual HQV 2.0 scores and know which graphics cards excel when it comes to HD video-playback quality.
We’re not going to waste a lot of time with preliminaries, so let’s jump straight into driver settings.
Driver Settings Test Methodology
The graphics card manufacturers prefer that we change the driver settings for each test to achieve the best result and the highest overall score. We don’t feel this method produces a realistic result because users wouldn’t tweak their driver settings each time a movie scene changes. We prefer to lock the driver settings for each graphics card across the entire benchmark for a real-world score.
Even the lowest-end graphics cards can have the same driver quality options as their higher-end counterparts. But in some cases, bottom-rung hardware will stutter when tasked with more demanding video enhancements, such as de-noise or dynamic contrast. Our goal is to find the best settings that each graphics card can handle without dropping frames. This should provide the realistic overall score for which we’re looking.
Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?
I second the test using SB HD graphics. It might be just an IGP, but I would like to see the quality in case I want to build an HTPC, and since SB has amazing encoding/decoding results compared to anything else out there (even $500+ GPUs), it would be nice to see if it can give decent picture quality.
But as for the results, I am not that surprised. Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements, even before CCC. That's an area of focus that nVidia might not see as important when it is. I want my Blu-rays and DVDs to look great, not just OK.
Great article. I had wondered what the testing criteria were about, and lo! Tom's to the rescue. I have four primary devices that I use to watch Netflix's streaming service. Each is radically different in terms of hardware. They all look pretty good, but they all work differently. Using my 47" LG LED TV, I did an informal comparison of each.
My desktop, which uses a 460, really suffers from the lack of noise reduction options.
My Samsung BD player looks less spectacular than the others.
My Xbox looks a little better than the BD player.
My PS3 actually looks the best to me, no matter what display I use.
I'm not sure why, but it's the only one I could pick out just based on its image quality. Netflix streaming is basically all I use my PS3 for. Compared to it, my desktop looks good and has several options to tweak, but doesn't come close. I don't know how the PS3 stacks up, but I'm thinking about giving the test suite a spin.
Thanks for the awesome article.
9508697 said: Could you give the same evaluation to Sandy Bridge's Intel HD Graphics?
That's definitely on our to-do list!
Trying to organize that one now.
Too bad this stuff usually makes things look worse. I tried out the full array of settings on my GTX 470 with multiple BD rips of varying (mostly very good) quality.
Noise reduction did next to nothing, and in many cases caused blockiness.
Dynamic contrast in many cases does make things look better, but in some it revealed tons of noise in the greyscale that the noise reduction doesn't remove... not even a little.
Color correction seemed to make anything bluish bluer, even purples.
Edge correction seems to sharpen some details, but introduces noise past about 20%.
All in all, a bunch of worthless settings.
jimmysmitty said: Even when their GPUs might not perform the same as nVidia's, ATI has always had great image quality enhancements.
ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
killerclick said: ATI/AMD is demolishing nVidia in all price segments on performance and power efficiency... and image quality.
I thought they were losing; not by enough to call it a loss, but not as good as the latest nVidia refreshes. But I got a 5770 due to its power consumption. I didn't have to swap out my PSU to put it in, and that was the deciding factor for me.
This made me lol...
1. Cadence tests: why do you marginalise the 2:2 cadence? These cards are not US-exclusive; the rest of the world has the same requirements for picture quality.
2. Skin tone correction: I see it as an error on the part of the card to even include this. Why correct something the video creator wanted to look that way? The movie is checked by video professionals for anything they don't want there; skin tones that aren't completely accurate are part of the product by design. This test shouldn't even exist.
3. Dynamic contrast: I can't help it, but the example scene with the cats had blown highlights on my laptop's LCD in the "correct" part. How can you judge that if the constraint is the display device and not the GPU itself? After all, you can output on a 6-bit LCD or a 10-bit LCD; the card doesn't have to know that...
"obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies, and that it's frustrating to those few people who watch movies outside of the Pacific zone... As in, Europe, Africa, and parts of Asia up to and including mainland China.Reply
It's only worth more than half the world population, after all.
mitch074"obscure" cadence detection? Oh, of course... Nevermind that a few countries do use PAL and its 50Hz cadence on movies...Reply
You misunderstand the text, I think.
To clear it up: I wasn't talking about 2:2 when I said that, I was talking about the Multi-Cadence Tests: 8 FPS animation, etc.
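For readers wondering what a cadence test actually exercises: a video processor watches the stream for a repeating pattern of duplicated fields or frames (3:2 for NTSC telecined film, 2:2 for PAL film, 3:3 for 8 FPS animation carried in a 24 FPS stream, and so on) and locks onto that pattern to reconstruct the original frames. Below is a minimal, hypothetical sketch of that idea in Python; the function names and cadence table are illustrative assumptions, not HQV's or any vendor's actual algorithm.

```python
# Hypothetical sketch of cadence detection: match an observed sequence of
# "new content" flags (True = this field/frame differs from the previous
# one) against known repeating cadence signatures at every phase offset.

CADENCES = {
    "2:2 (PAL film)":        [True, False],
    "3:2 (NTSC telecine)":   [True, False, False, True, False],
    "3:3 (8 fps animation)": [True, False, False],
}

def match_score(flags, pattern, phase):
    """Fraction of observed flags that agree with the candidate
    pattern when the pattern is started at the given phase offset."""
    hits = sum(
        1 for i, flag in enumerate(flags)
        if flag == pattern[(i + phase) % len(pattern)]
    )
    return hits / len(flags)

def detect_cadence(flags):
    """Return (name, score) of the best-matching known cadence."""
    best = ("unknown", 0.0)
    for name, pattern in CADENCES.items():
        for phase in range(len(pattern)):
            score = match_score(flags, pattern, phase)
            if score > best[1]:
                best = (name, score)
    return best
```

Real hardware faces the harder version of this problem: noise means the flags are never clean booleans, and the cadence can change mid-stream (e.g. a film-sourced scene cutting to video-sourced credits), which is why the benchmark scores how quickly a card locks and re-locks.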