
Part 4: Avivo HD vs. PureVideo HD

Introduction

In part one of our Avivo vs. PureVideo comparison, we concentrated on DVD playback. In our second article in this series, we focused on high-definition video formats—HD DVD in particular—and we dug into what it takes to get them to work on the PC. In the third article, we evaluated the end-user experience with low-end video accelerators.

In this article, we will dig deep into the performance of the first integrated high-def video accelerators—the Radeon HD 3200 (used in AMD’s 780G chipset) and the GeForce 8200 (used in Nvidia’s MCP78S chipset). Both chipsets are for AMD socket AM2+ CPUs, so we can perform a real apples-to-apples comparison here.

Will these solutions perform similarly, or will a true leader emerge? How well will they work with slower single-core budget CPUs? How much memory needs to be allocated to the on-board graphics for Blu-ray playback? How do they compare to discrete video cards? We will answer all of these questions and more.

The Competitors

Let’s begin by having a closer look at the technical differences between the integrated Radeon HD 3200 and GeForce 8200 graphics processors.

Radeon HD 3200
Codename: 780G
Process: 55 nm
Universal Shaders: 40
Texture Units: 4
ROPs: 4
Memory Bus: 64-bit
Core Speed: 500 MHz
Memory Speed: 400 MHz (800 MHz effective)
DirectX / Shader Model: DX 10 / SM 4.0

In terms of hardware, the integrated Radeon HD 3200 is a carbon copy of the discrete Radeon HD 3450 (and the previous 2400 series as well), with the same number of shaders, texture units, and ROPs. I can’t recall another instance where a contemporary discrete desktop video card shared exact specifications with an integrated part; usually integrated motherboard graphics are a generation behind even the low-end discrete cards. In this respect, the 780G is special. Of course, it still suffers from the plague of sharing system memory over a 64-bit bus, but out of the gate, the 3200 shows promise.
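To put that shared 64-bit memory bus in perspective, here is a minimal back-of-the-envelope sketch in Python that turns the spec-table figures into peak bandwidth; the helper name and structure are just for illustration, not anything from the article. Both integrated parts end up with roughly 6.4 GB/s, and that pool is shared with the CPU.

# Peak-bandwidth arithmetic from the spec table above (64-bit bus, 800 MT/s effective).
# The function name and layout are illustrative, not from the article.

def peak_bandwidth_gbs(bus_width_bits: int, effective_mts: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_mts * 1e6 / 1e9

if __name__ == "__main__":
    bw = peak_bandwidth_gbs(bus_width_bits=64, effective_mts=800)
    print(f"Peak shared-memory bandwidth: {bw:.1f} GB/s")  # ~6.4 GB/s, shared with the CPU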

GeForce 8200
Codename: MCP78S
Process: 80 nm
Universal Shaders: 16
Texture Units: 4
ROPs: 4
Memory Bus: 64-bit
Core Speed: 500 MHz
Memory Speed: 400 MHz (800 MHz effective)
DirectX / Shader Model: DX 10 / SM 4.0

The GeForce 8200 is a very close relative of the discrete 8400 GS, based on the older 80 nm process and sporting the same 16 universal shaders. However, the number of texture units and ROPs has been cut in half compared to the 8400.

On paper, the specs look incredibly close, with near-identical clock speeds, texture units, and ROPs, though the GeForce 8200's 16 universal shaders look a bit weak next to the Radeon HD 3200's 40. Consider, however, that the GeForce and Radeon architectures are vastly different, so raw shader count is not an accurate measure of relative performance; GeForce cards have done more with fewer shaders ever since the GeForce 8 series. It's still a good match-up.
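As a quick sanity check on that match-up, the sketch below (Python; the dictionary layout is purely illustrative) restates the spec-table figures as theoretical fill rates. Both chips land at the same 2.0 Gpixel/s and 2.0 Gtexel/s, which leaves the shader arrays and the drivers as the real differentiators. Shader counts are printed but deliberately not converted to FLOPS, since the two architectures issue work per shader very differently.

# Back-of-the-envelope throughput figures derived only from the spec tables above.
# The dictionary keys and script layout are illustrative, not from the article.

SPECS = {
    "Radeon HD 3200": {"shaders": 40, "tmus": 4, "rops": 4, "core_mhz": 500},
    "GeForce 8200":   {"shaders": 16, "tmus": 4, "rops": 4, "core_mhz": 500},
}

for name, s in SPECS.items():
    pixel_rate = s["rops"] * s["core_mhz"] / 1000   # Gpixels/s (ROPs x core clock)
    texel_rate = s["tmus"] * s["core_mhz"] / 1000   # Gtexels/s (TMUs x core clock)
    print(f"{name}: {s['shaders']} shaders, "
          f"{pixel_rate:.1f} Gpixel/s fill, {texel_rate:.1f} Gtexel/s texture")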

  • abzillah
    Don't the 780G chips have hybrid technology? It would have been great to see what kind of performance difference adding a discrete card to a 780G board makes. Motherboards with integrated graphics cost about the same as those without, so I would choose integrated graphics plus a discrete graphics card for hybrid performance.
  • liemfukliang
    Wow, you should post part 5 of this article on Tuesday when the 9300 NDA lifts: 9300 vs. 790GX. Does that Nvidia chip also have the defect?
  • TheGreatGrapeApe
    Nice job, Don!
    Interesting to see the theoretical HQV difference amount to a realistic nil due to playability (does image enhancement of a skipping image matter?)

    I'll be linking to this one again.

    Next round HD4K vs GTX vs GF9 integrated, complete with dual view decoding. >B~)
  • kingraven
    Great article; I especially liked the decrypted video benchmarks, as I was expecting a much bigger difference.

    I was also expecting the single core to handle it better, since I use an old laptop with a Pentium M at 1500 MHz and an ATI 9600 as an HTPC, and it plays nearly all the HD media I throw at it smoothly (including 1080p) through ffdshow. Note that the files are usually Matroska or AVI and the codecs vary, but they are usually H.264.

    I admit that since it's an old PC without Blu-ray or HD DVD, I have no idea how the "real deal" would perform; probably as bad as, or worse than, the article says :P
  • modtech
    A refreshingly informative article. Well done.
  • azrael
    I have a Gigabyte GA-MA78GM-S2H motherboard (780G). I just bought a Samsung LE46A656 TV and I have the following problem:

    When I connect the TV with a standard VGA (D-SUB) cable, I can use Full HD (1920 x 1080) correctly.

    If I use HDMI or DVI (with a DVI-to-HDMI adapter), I cannot get 1920 x 1080 to work correctly. The screen has black borders on all sides (about 3 cm) and the picture looks wrong, as if the monitor were not being driven at its native resolution but the 1920 x 1080 signal were compressed to fit the area visible on my TV.

    I also tried my old laptop (also ATI, an X700) and had the same problem. I thought my TV was defective, but then I tried an old Nvidia card I had and everything worked perfectly! Full 1920 x 1080 over the HDMI input (with a DVI-to-HDMI adapter).

    I don't know if this is an ATI driver problem or a general ATI hardware limitation, but I WILL NEVER BUY ATI AGAIN. They claim HDMI with Full HD support. Well, they are lying!
  • That's funny, bit-tech had some rather different numbers for the HQV tests on the 780G board.

    http://www.bit-tech.net/hardware/2008/03/04/amd_780g_integrated_graphics_chipset/10

    What's going on here? I assume bit-tech tweaked player settings to improve results, and you guys left everything at default?
  • puet
    What about the image enhancements in the HQV test that are possible with a 780G and a Phenom processor? Would that combination stand up to the discrete solution chosen?
    That could make for an interesting part V in this article series.
  • genored
    (quoting azrael's post above)
    LEARN TO DOWNLOAD DRIVERS
  • Guys... I own this Gigabyte board. HDCP works over DVI, because that's what I use at home, albeit going from DVI on the motherboard to HDMI on the TV (don't ask why; it's just the cable I had). I don't have AnyDVD, so I know that it works.

    As for the guy having issues with HDMI on the onboard ATI 3200: dude, there were some problems with the initial BIOS. Update it, update your drivers, and you won't have a problem. My brother has the same board and he uses HDMI, and it works just fine. Noob...