Part 4: Avivo HD vs. PureVideo HD

Decryption Benchmarks: Hardware vs. Software Decryption

We were curious to see whether using SlySoft’s AnyDVD HD to defeat the AACS encryption on the Blu-ray discs would make any difference in CPU usage. We have noticed that graphics cards seem to have a much easier time playing back unencrypted high-definition video files from the hard drive, so we ran this test to see whether removing the encryption with AnyDVD HD would give us a performance boost.

We tried all three codecs with and without using AnyDVD HD on both the Radeon and GeForce motherboards with the following results:

Clearly, defeating the AACS encryption with AnyDVD HD provides no CPU usage benefit or detriment on either platform, but it was a worthwhile test to perform. Now we know for sure.
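
If you want to reproduce this kind of comparison at home, the idea is simple: play back the same segment twice, once from the encrypted disc and once with AnyDVD HD active, and log overall CPU utilization during each run. The sketch below is purely illustrative and is not the tooling used for the results above; it assumes Python with the third-party psutil package, and the 60-second duration and one-second sampling interval are arbitrary choices.

# Illustrative sketch only, not the article's benchmark tooling.
# Samples system-wide CPU utilization at a fixed interval while playback runs,
# then reports the average and peak. Requires the third-party psutil package.
import time
import psutil

def sample_cpu_usage(duration_s=60.0, interval_s=1.0):
    """Collect CPU-utilization samples (in percent) over duration_s seconds."""
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        # cpu_percent(interval=...) blocks for the interval and returns the
        # average CPU usage across all cores for that window.
        samples.append(psutil.cpu_percent(interval=interval_s))
    return samples

if __name__ == "__main__":
    # Start playback first (encrypted disc, or with AnyDVD HD running), then:
    data = sample_cpu_usage(duration_s=60.0, interval_s=1.0)
    print("average CPU: %.1f%%  peak: %.1f%%" % (sum(data) / len(data), max(data)))

Comparing the averages from the two runs is enough to show whether decryption shifts any measurable load onto the CPU; in our testing, it did not.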

  • abzillah
    Don't the 780G chips have hybrid technology? It would have been great to see what kind of performance difference adding a discrete card to a 780G board would make. Motherboards with integrated graphics cost about the same as those without, so I would choose integrated graphics plus a discrete graphics card for hybrid performance.
  • liemfukliang
    Wow, you should update this article with Part 5 on Tuesday when the 9300 NDA lifts. 9300 vs. 790GX. Is this Nvidia chipset also defective?
  • TheGreatGrapeApe
    Nice job, Don!
    Interesting to see the theoretical HQV difference turn out to be realistically nil due to playability (does image enhancement of a skipping image matter?)

    I'll be linking to this one again.

    Next round HD4K vs GTX vs GF9 integrated, complete with dual view decoding. >B~)
  • kingraven
    Great article; I especially liked the decrypted video benchmarks, as I was indeed expecting a much bigger difference.

    I was also expecting the single core to handle it better, as I use an old laptop with a 1500 MHz Pentium M and an ATI 9600 as an HTPC, and it plays nearly all the HD media I throw at it smoothly (including 1080p) through ffdshow. Note that the files are usually Matroska or AVI, and the codecs vary but are usually H.264.

    I admit that since it's an old PC without Blu-ray or HD DVD, I have no idea how the "real deal" would perform; probably as badly as, or worse than, the article says :P
  • modtech
    A refreshingly informative article. Well done.
  • azrael
    I have a Gigabyte GA-MA78GM-S2H m/b (780G)
    I just bought a Samsung LE46A656 TV and I have the following problem:

    When I connect the TV with a standard VGA (D-Sub) cable,
    I can use Full HD (1920 X 1080) correctly.

    If I use HDMI or DVI (with a DVI-to-HDMI adaptor), I cannot use 1920 x 1080 correctly.
    The screen has black borders on all sides (about 3 cm), and the picture looks wrong, as if the monitor were not being driven at its native resolution and the 1920 x 1080 signal had been compressed into the visible area of my TV.

    I also tried my old laptop (also ATI, an X700) and had the same problem.
    I thought my TV was defective, but then I tried an old NVIDIA card I had and everything worked perfectly!!!
    Full 1920 x 1080 over my TV's HDMI input (with a DVI-to-HDMI adaptor).

    I don't know if this is an ATI driver problem or a general ATI hardware limitation,
    but I WILL NEVER BUY ATI AGAIN.
    They claim HDMI with full HD support. Well they are lying!
  • That's funny, bit-tech had some rather different numbers in their HQV tests for the 780G board.

    http://www.bit-tech.net/hardware/2008/03/04/amd_780g_integrated_graphics_chipset/10

    What's going on here? I assume bit-tech tweaked player settings to improve results, and you guys left everything at default?
  • puet
    What about the image enhancements possible in the HQV test with a 780G and a Phenom processor? Would that combination stand up against the discrete solution chosen?
    That could be an interesting Part V in the article series.
  • genored
    azraelI have a gigabyte GA-MA78GM-S2H m/b (780G)I just bought a Samsung LE46A656 TV and I have the following problem:When I connect the TV with standard VGA (D-SUB) cable, I can use Full HD (1920 X 1080) correctly.If I use the HDMI or DVI (with DVI-> HDMI adaptor) I can not use 1920 X 1080 correctly. The screen has black borders on all sides (about 3cm) and the picture is weird, like the monitor was not driven in its native resolution, but the 1920 X 1080 signal was compressed to the resolution that was visible on my TV.I also tried my old laptop (also ATI, x700) and had the same problem.I thought that my TV was defective but then I tried an old NVIDIA card I had and everything worked perfect!!!Full 1920 X 1080 with my HDMI input (with DVI-> HDMI adaptor).I don't know if this is a ATI driver problem or a general ATI hardware limitation, but I WILL NEVER BUY ATI AGAIN.They claim HDMI with full HD support. Well they are lying!
    LEARN TO DOWNLOAD DRIVERS
  • Guys... I own this Gigabyte board. HDCP works over DVI, because that's what I use at home, albeit going from DVI on the motherboard to HDMI on the TV (don't ask why; it's just the cable I had). I don't have AnyDVD, so I know that it works.

    As for the guy having issues with HDMI on the onboard ATI 3200: dude, there were some problems with the initial BIOS. Update it, update your drivers, and you won't have a problem. My brother has the same board too, and he uses HDMI and it works just fine. Noob...