Part 4: Avivo HD vs. PureVideo HD

HQV’s High-Definition Video Quality Benchmark, Cont'd

Film Resolution Loss Test: Out Of 25 Points

This test displays a pattern of fine lines and color bars. If the player can resolve the smallest lines without flickering, it is successfully de-interlacing the image.

Graphics Processor        | Score
Integrated Radeon HD 3200 | 0
Integrated GeForce 8200   | 0
Radeon 2400 PRO           | 0
GeForce 8400 GS           | 25

Again, all of the integrated solutions fail to score, while the lone discrete card, the GeForce 8400 GS, provides a glimmer of hope by successfully de-interlacing the image. Unfortunately, video playback still demonstrates some stuttering.
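
As a rough illustration of what this test checks, here's a minimal Python sketch (our own simplification, not HQV's actual methodology). Film frames arrive as pairs of interlaced fields; a player that "weaves" matching fields back together preserves single-pixel lines, while one that falls back to "bob" de-interlacing line-doubles each field, so the finest lines vanish and reappear from field to field, which is the flicker the test penalizes.

```python
# Minimal sketch (not HQV's methodology): why a single-pixel line pattern
# flickers when a player "bobs" fields instead of weaving film fields.

def make_frame(height=8):
    """Progressive test frame: alternating dark (0) and bright (1) lines."""
    return [row % 2 for row in range(height)]

def split_fields(frame):
    """Interlace the frame: even lines form one field, odd lines the other."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Film-mode de-interlacing: re-interleave the two fields into one frame."""
    woven = []
    for t, b in zip(top, bottom):
        woven.extend([t, b])
    return woven

def bob(field):
    """Video-mode fallback: line-double a single field (halves vertical detail)."""
    doubled = []
    for line in field:
        doubled.extend([line, line])
    return doubled

original = make_frame()
top, bottom = split_fields(original)

print(weave(top, bottom) == original)  # True: full detail, no flicker
print(bob(top))     # [0, 0, 0, 0, 0, 0, 0, 0] -> the fine lines vanish...
print(bob(bottom))  # [1, 1, 1, 1, 1, 1, 1, 1] -> ...then flash back: flicker
```

The weave path reproduces the original frame exactly, while bobbed output swings between all-dark and all-bright lines on alternating fields; that field-rate swing is what appears on screen as flicker in the finest lines.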

Film Resolution Loss Test—STADIUM: Out Of 10 Points

This test simply shows video of a stadium captured on film. If there are no visual artifacts, then there is no resolution loss.

Graphics Processor        | Score
Integrated Radeon HD 3200 | 0
Integrated GeForce 8200   | 0
Radeon 2400 PRO           | 0
GeForce 8400 GS           | 0

Here we see that every contestant fails once again. Only the GeForce 8400 GS appeared to limit resolution loss, but it wasn't perfect and still showed some moiré patterns. HQV's scoring guidelines suggest that awarding the 10 points is an all-or-nothing affair, so we withheld them even though the card appeared to offer an improvement.
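
To see why lost vertical resolution shows up as moiré rather than simple softness, here's a small, purely illustrative Python sketch (our own construction, not derived from HQV's test material). Detail near the line-pitch limit, once only half the lines survive, doesn't just blur; it aliases into much coarser bands.

```python
import math

# Illustrative only (not HQV's test material): fine vertical detail that loses
# half its lines aliases into coarse, moire-like bands.

FINE_DETAIL = 0.45  # cycles per line: detail near the vertical resolution limit
PHASE = 0.4         # arbitrary phase offset so no sample lands exactly on zero

def brightness(line):
    """Brightness of one horizontal line in a progressive frame."""
    return math.sin(2 * math.pi * FINE_DETAIL * line + PHASE)

full_res = [brightness(y) for y in range(32)]        # every line present
half_res = [brightness(y) for y in range(0, 32, 2)]  # only one field survives

def apparent_frequency(signal):
    """Crude frequency estimate: zero crossings per sample, halved."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    return crossings / (2 * (len(signal) - 1))

print(f"all lines kept:  ~{apparent_frequency(full_res):.2f} cycles per line")          # ~0.45
print(f"half lines kept: ~{apparent_frequency(half_res):.2f} cycles per retained line") # ~0.10
```

The 0.45-cycle detail reappears as slow 0.10-cycle banding, which is roughly what visible moiré on fine, regular detail like stadium seats represents.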

Now that we’ve examined the specific tests, let’s have a look at the totals:

Graphics Processor        | Score
Integrated Radeon HD 3200 | 0
Integrated GeForce 8200   | 0
Radeon 2400 PRO           | 0
GeForce 8400 GS           | 45

It's easy math to add up the integrated graphics solutions' scores, and the results are definitely sobering. Clearly, the integrated solutions provide only the most basic Blu-ray playback. Having said that, cheap standalone Blu-ray players will likely provide a similar experience, without the other benefits a personal computer offers.

Although we left them out of the charts for clarity's sake, we once again need to mention the Radeon 2600/3650/4650 series and GeForce 8600/9500 series. For under $100, these cards can provide massive image-quality improvements when playing back HD video. The Radeon 2600 XT and GeForce 8600 GTS achieved excellent scores of 105 and 95, respectively, in our last Avivo HD vs. PureVideo HD review. We haven't yet tested the Radeon 4000 series for Blu-ray playback, but its eight-channel audio support is a nice addition.

  • abzillah
    Don't the 780G chipsets have hybrid graphics technology? It would have been great to see what kind of performance difference adding a discrete card to a 780G system would make. Motherboards with integrated graphics cost about the same as those without, so I would choose integrated graphics plus a discrete graphics card for hybrid performance.
  • liemfukliang
    Wow, you should update this article with a Part 5 on Tuesday, when the 9300 NDA lifts: 9300 vs. 790GX. Is this Nvidia graphics part also defective?
  • TheGreatGrapeApe
    Nice job, Don!
    It's interesting to see the theoretical HQV difference come to practically nil due to playability (does image enhancement of a skipping image matter?).

    I'll be linking to this one again.

    Next round HD4K vs GTX vs GF9 integrated, complete with dual view decoding. >B~)
  • kingraven
    Great article; I especially liked the decrypted video benchmarks, as I was indeed expecting a much bigger difference.

    I was also expecting the single core to handle it better, since I use an old laptop with a 1.5 GHz Pentium M and an ATI 9600 as an HTPC, and it plays nearly all the HD media I throw at it smoothly (including 1080p) through ffdshow. Note that the files are usually Matroska or AVI and the codecs vary, but they're usually H.264.

    I admit that since it's an old PC without Blu-ray or HD DVD, I have no idea how the "real deal" would perform; probably as badly as, or worse than, the article says :P
  • modtech
    A refreshingly informative article. Well done.
  • azrael
    I have a Gigabyte GA-MA78GM-S2H m/b (780G).
    I just bought a Samsung LE46A656 TV and I have the following problem:

    When I connect the TV with a standard VGA (D-Sub) cable,
    I can use Full HD (1920 x 1080) correctly.

    If I use HDMI or DVI (with a DVI-to-HDMI adapter), I cannot use 1920 x 1080 correctly.
    The screen has black borders on all sides (about 3 cm) and the picture looks wrong, as if the monitor were not being driven at its native resolution and the 1920 x 1080 signal were being compressed into the area visible on my TV.

    I also tried my old laptop (also ATI, an X700) and had the same problem.
    I thought my TV was defective, but then I tried an old NVIDIA card I had and everything worked perfectly!!!
    Full 1920 x 1080 over my HDMI input (with a DVI-to-HDMI adapter).

    I don't know if this is an ATI driver problem or a general ATI hardware limitation,
    but I WILL NEVER BUY ATI AGAIN.
    They claim HDMI with full HD support. Well, they are lying!
  • That's funny, bit-tech had some rather different numbers for the HQV tests on the 780G board.

    http://www.bit-tech.net/hardware/2008/03/04/amd_780g_integrated_graphics_chipset/10

    What's going on here? I assume bit-tech tweaked player settings to improve results, and you guys left everything at default?
  • puet
    What about the image enhancements possible in the HQV test with a 780G and a Phenom processor? Would that combination stand up to the chosen discrete solution?
    It could make for an interesting Part V in the article series.
  • genored
    azrael said: "I have a gigabyte GA-MA78GM-S2H m/b (780G) ... I don't know if this is a ATI driver problem or a general ATI hardware limitation, but I WILL NEVER BUY ATI AGAIN. They claim HDMI with full HD support. Well they are lying!"
    LEARN TO DOWNLOAD DRIVERS
  • Guys... I own this Gigabyte board. HDCP works over DVI, because that's what I use at home, albeit I go from DVI on the motherboard to HDMI on the TV (don't ask why; it's just the cable I had). I don't have AnyDVD, so I know that it works.

    As for the guy having issues with HDMI on the onboard ATI 3200: dude, there were some problems with the initial BIOS. Update it, update your drivers, and you won't have a problem. My brother has the same board and uses HDMI, and it works just fine. Noob...