In part one of our Avivo vs. PureVideo comparison, we concentrated on DVD playback. In our second article in this series, we focused on high-definition video formats—HD DVD in particular—and we dug into what it takes to get them to work on the PC. In the third article, we evaluated the end-user experience with low-end video accelerators.
In this article, we will dig deep into the performance of the first integrated high-def video accelerators—the Radeon HD 3200 (used in AMD’s 780G chipset) and the GeForce 8200 (used in Nvidia’s MCP78S chipset). Both chipsets are for AMD socket AM2+ CPUs, so we can perform a real apples-to-apples comparison here.
Will these solutions perform similarly, or will a true leader emerge? How well will these solutions work with slower single-core budget CPUs? What amount of memory needs to be allocated to the on-board graphics for Blu-ray playback? How do they compare to discrete video cards? We will answer all of these questions and more.
The Competitors
Let’s begin by having a closer look at the technical differences between the integrated Radeon HD 3200 and GeForce 8200 graphics processors.
| Radeon HD 3200 | |
| --- | --- |
| Codename: | 780G |
| Process: | 55 nm |
| Universal Shaders: | 40 |
| Texture Units: | 4 |
| ROPs: | 4 |
| Memory Bus: | 64-bit |
| Core Speed (MHz): | 500 |
| Memory Speed (MHz): | 400 (800 effective) |
| DirectX / Shader Model: | DX 10 / SM 4.0 |
In terms of hardware, the integrated Radeon HD 3200 is a carbon copy of the discrete Radeon HD 3450 (and the previous 2400 series as well), with the same number of shaders, texture units, and ROPs. I can’t recall another instance where a contemporary discrete desktop video card shared exact specifications with an integrated part; usually integrated motherboard graphics are a generation behind even the low-end discrete cards. In this respect, the 780G is special. Of course, it still suffers from the plague of sharing system memory over a 64-bit bus, but out of the gate, the 3200 shows promise.
| GeForce 8200 | |
| --- | --- |
| Codename: | MCP78S |
| Process: | 80 nm |
| Universal Shaders: | 16 |
| Texture Units: | 4 |
| ROPs: | 4 |
| Memory Bus: | 64-bit |
| Core Speed (MHz): | 500 |
| Memory Speed (MHz): | 400 (800 effective) |
| DirectX / Shader Model: | DX 10 / SM 4.0 |
The GeForce 8200 is a very close relative of the discrete 8400 GS, based on the older 80 nm process and sporting the same 16 universal shaders. However, the number of texture units and ROPs has been cut in half compared to the 8400.
The specs look incredibly close between these two competitors, with identical clock speeds, texture units, and ROPs, but the GeForce 8200's 16 universal shaders look a bit weak next to the Radeon 3200's 40. Consider, however, that the GeForce and Radeon architectures are vastly different, and that shader count is not an accurate measure of relative performance; GeForce cards have done more with fewer shaders since the GeForce 8000 series. So it's still a good match-up.
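To put those raw numbers in perspective, here is a back-of-the-envelope sketch (in Python, purely for illustration) that multiplies shader count by core clock for each part. It deliberately ignores everything just mentioned, including the fact that GeForce shaders run in a separate, faster clock domain that isn't listed in the tables above, which is exactly why a figure like this should not be read as a performance prediction.

```python
# Naive "shader throughput" figure taken straight from the spec tables above.
# It is architecture-blind: it ignores the Radeon's VLIW shader arrangement
# and the GeForce's separate (faster) shader clock domain, so it only shows
# why counting shaders alone is not a fair comparison.

parts = {
    "Radeon HD 3200": {"shaders": 40, "core_mhz": 500},
    "GeForce 8200":   {"shaders": 16, "core_mhz": 500},
}

for name, spec in parts.items():
    shader_gcycles = spec["shaders"] * spec["core_mhz"] * 1e6 / 1e9
    print(f"{name}: {shader_gcycles:.0f} billion shader-cycles per second")
```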
- Introduction
- HDCP, HDMI, DVI, 1080p, And Other Definitions
- Test System Components And Software
- Quirks, Frustrations, And Compliance Woes
- HQV’s High Definition Video Quality Benchmark
- HQV’s High-Definition Video Quality Benchmark, Cont'd
- CPU Usage Benchmarks: Radeon 3200 vs. GeForce 8200
- Resolution Benchmarks: 1080p vs. 720p
- CPU Benchmarks: Dual-Core Athlon 4800+ vs. Single-Core Sempron 3200+
- Graphics Memory Benchmarks: 256MB vs. 128MB
- Decryption Benchmarks: Hardware vs. Software Decryption
- Discrete vs. Integrated Graphics Benchmarks
- Conclusion

Interesting to see the theoretical HQV difference end up being a realistic nil due to playability (does image enhancement of a skipping image matter?)
I'll be linking to this one again.
Next round: HD4K vs. GTX vs. GF9 integrated, complete with dual-view decoding. >B~)
Also, I was expecting the single core to handle it better, as I use an old laptop with a 1500 MHz Pentium M and an ATI 9600 as an HTPC, and it plays nearly all the HD media I throw at it smoothly (including 1080p) through ffdshow. Note that the files are usually Matroska or AVI and the codecs vary, but are usually H.264.
I admit that since it's an old PC without Blu-ray or HD DVD, I have no idea how the "real deal" would perform; probably as badly as, or worse than, the article says.
I just bought a Samsung LE46A656 TV and I have the following problem:
When I connect the TV with a standard VGA (D-SUB) cable, I can use Full HD (1920 x 1080) correctly.
If I use HDMI or DVI (with a DVI-to-HDMI adapter), I cannot use 1920 x 1080 correctly.
The screen has black borders on all sides (about 3 cm) and the picture looks wrong, as if the monitor were not being driven at its native resolution and the 1920 x 1080 signal had been compressed into the area visible on my TV.
I also tried my old laptop (also ATI, x700) and had the same problem.
I thought that my TV was defective, but then I tried an old NVIDIA card I had and everything worked perfectly!!!
Full 1920 x 1080 through my HDMI input (with a DVI-to-HDMI adapter).
I don't know if this is an ATI driver problem or a general ATI hardware limitation,
but I WILL NEVER BUY ATI AGAIN.
They claim HDMI with full HD support. Well they are lying!
http://www.bit-tech.net/hardware/2008/03/04/amd_780g_integrated_graphics_chipset/10
What's going on here? I assume bit-tech tweaked player settings to improve results, and you guys left everything at default?
This one could be an interesting part V in the article series.
LEARN TO DOWNLOAD DRIVERS
As for the guy having issues with HDMI on the ATI 3200 onboard: dude, there were some problems with the initial BIOS. Update the BIOS, update your drivers, and you won't have a problem. My brother has the same board too, and he uses HDMI and it works just fine. Noob...
HQV is, unfortunately, somewhat subjective - but I don't know how they could have gotten these scores. ATI told me directly that their low-end cards won't provide any HD enhancements. That was some time ago, and bit-tech aren't a bunch of incompetents, so it's hard to say exactly what's causing the difference here.
I don't believe the quality will change with faster processors - I tested a Phenom 9500 and didn't see any difference. But I'll run a proper test and let you know.
Wow. You seem to be REALLY missing the point here. The whole point of this is for HTPCs, and the nice thing about HDMI is that it can send both video and audio to your TV over one cable. If they had a DVI port on the motherboard and you had to use an HDMI adapter, you would have no sound. Doing it the other way around, though, ensures that people who use it for an HTPC as intended get their sound, and those who want to use a monitor can buy a DVI adapter. You didn't even seem to mention or test the capabilities of sending audio over HDMI!
I don't complain about my laptop not having a DVI out, and when I want to hook it up to my friend's 32" HDTV all I have to do is plug in a simple HDMI cable. If I wanted to run an LCD at home I'd buy an adapter.
I have found that even slow CPUs can play back H.264 AVI files; something's going on when playing back a Blu-ray, probably the encryption. I was hoping the AnyDVD HD benchmarks would expose this, but that's not what we saw. I'll be digging into it further in a future review for sure.
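For anyone curious whether raw decryption throughput could plausibly be the culprit, here is a minimal sketch (Python, assuming the PyCryptodome library is installed) of the kind of quick test that can give a ballpark: AACS-protected Blu-ray video is AES-128 encrypted, so timing bulk AES-128 decryption on a given CPU shows roughly what that part of the workload costs. A real player also does key handling and other work that this ignores entirely.

```python
# Rough check of bulk AES-128 decryption throughput on a given CPU.
# AACS-protected Blu-ray video is AES-128 encrypted; a real software player
# also does key derivation and other work that this sketch ignores.
# Requires PyCryptodome: pip install pycryptodome
import os
import time
from Crypto.Cipher import AES

key = os.urandom(16)                   # dummy AES-128 key
iv = os.urandom(16)
data = os.urandom(32 * 1024 * 1024)    # 32 MB of dummy "encrypted" stream

cipher = AES.new(key, AES.MODE_CBC, iv)
start = time.perf_counter()
cipher.decrypt(data)
elapsed = time.perf_counter() - start

mbit_per_s = len(data) * 8 / 1_000_000 / elapsed
print(f"AES-128-CBC decryption: {mbit_per_s:.0f} Mbit/s")
# Blu-ray video peaks around 40-48 Mbit/s; a result far above that suggests
# decryption alone should not cripple playback on this CPU.
```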
That is really bizarre... are you sure it works? Have you played back protected Blu-ray or HD DVD discs over the DVI cable?
During my testing it didn't work at all, and when I checked around I was led to understand that HDCP wasn't supported on the DVI output of that board.
Well, I'll disagree with you that these boards will ONLY be used for HTPCs; as I've stated, I believe these value-priced boards would be attractive to a whole lot of people with decent monitors who would like HD playback.
As for sound, you can certainly play it back over the integrated sound chip instead of HDMI, so I don't think "you would have no sound" is an accurate description.