Resolution Benchmarks: 1080p vs. 720p
First off, since we noticed smoother playback on the GeForce 8200 at lower resolutions, let’s examine the CPU utilization difference between 720p (1280x720) and 1080p (1920x1080):
Now we’re getting somewhere. While overall CPU utilization didn’t drop all that much, note that it is far more consistent at the lower 720p resolution. As a result, we’re no longer seeing the dropped frames and skipping.
Let’s compare this to what happens to the Radeon 3200 when we lower the resolution:
There doesn’t appear to be any CPU spiking at the higher resolution with the Radeon 3200. When the resolution is dropped, CPU utilization falls a bit, but playback appeared skip-free at either resolution.
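For readers who want to reproduce this kind of measurement at home, here's a minimal sketch (our assumption, not the harness used for these charts) that samples system-wide CPU utilization once per second while a clip plays, using Python's psutil library. Reporting the peak and standard deviation alongside the average is one simple way to tell "consistent" utilization from the spiky behavior described above.

```python
# Minimal CPU-utilization logger: run this while a test clip is playing back.
# Assumes the psutil package is installed (pip install psutil).
import statistics
import psutil

SAMPLE_SECONDS = 60  # length of the playback window to measure

samples = []
for _ in range(SAMPLE_SECONDS):
    # cpu_percent(interval=1) blocks for one second and returns the
    # system-wide CPU utilization over that second.
    samples.append(psutil.cpu_percent(interval=1))

print(f"average: {statistics.mean(samples):.1f}%")
print(f"peak:    {max(samples):.1f}%")
print(f"stdev:   {statistics.pstdev(samples):.1f}%")
```

A low average paired with a high peak and a large standard deviation would indicate the kind of intermittent spiking that lines up with dropped frames, whereas a steady trace shows similar average and peak values.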
Based on CPU utilization, it looks like the GeForce 8200 can decode the demanding H.264 content just fine, but once the resolution is raised past 720p there appears to be a bottleneck somewhere else, and playback suffers.
Frankly, this is a serious blow to the integrated GeForce 8200’s potential as a home theater PC platform. While it was able to play back two of the three titles we tested at 1080p, potentially having to lower the resolution to 720p for demanding titles is a serious limitation, especially since the Radeon 3200 displays no such limitation.
With this in mind, we’re curious how these platforms will perform with CPUs that are slower than the 4800+. Is either of these systems a viable HTPC platform with a cheaper, single-core processor?