Both ATI and Nvidia would like you to believe that HD content will suck your CPU dry, and that you really need a card with the respective company's processor to enable smooth playback via hardware acceleration.
Using PowerDVD 6, all three cards consumed approximately 4% of available CPU cycles while playing DVD video with hardware acceleration disabled; the figure dropped to around 1% with hardware acceleration enabled. Of course, 480p MPEG2 content isn't HD.
We downloaded the "1080p" movie trailer for "The Rules of Attraction" from Microsoft's Windows Media Showcase, but were dumbfounded when all three cards again showed the same CPU consumption of approximately 14% in Windows Media Player, regardless of whether or not WMV Acceleration was enabled in the Catalyst driver. Perhaps we should try H.264?
Apple's QuickTime HD Gallery offers a wide variety of H.264 content. We chose the documentary clip "One Six Right" in true 1080p format, only to find all three cards once more using the same amount of CPU cycles, this time approximately 22%. Given the identical results, it would seem that none of the downloadable HD formats require or use hardware acceleration, and further testing will require the procurement of an HD DVD or Blu-ray Disc drive. If the DVD playback results, which showed a 75% reduction in CPU use, indicate what to expect, we might be in for a treat!
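The 75% figure quoted above follows directly from the DVD measurements (roughly 4% CPU without acceleration, roughly 1% with it). As a quick sanity check, here is a minimal sketch of that arithmetic; the function name is ours, not part of any tool used in this review:

```python
def cpu_reduction(before: float, after: float) -> float:
    """Percentage drop in average CPU utilization between two measurements."""
    return (before - after) / before * 100.0

# DVD playback: ~4% CPU without hardware acceleration, ~1% with it
print(cpu_reduction(4.0, 1.0))  # 75.0
```

Applying the same formula to a hypothetical accelerated H.264 case (22% dropping to, say, 5.5%) would yield the same 75% reduction, which is what the closing sentence is hoping for.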
|Catalyst High-Quality 3D Settings|
|Smoothvision HD AA||Application Preference|
|Smoothvision HD AF||Application Preference (High Quality AF enabled)|
|Mipmap Detail Level||Quality|
|Forceware High-Quality 3D Settings|
|Anisotropic Filtering||Application Preference|
|AF MIP Filter Optimization||Off|
|Anisotropic Sample Optimization||Off|
|Anti-Aliasing Settings||Application Preference|
|Conformant Texture Clamp||Use Hardware|
|Gamma Correct Anti-Aliasing||Off|
|Hardware Acceleration||Single Display Performance Mode|
|Ignore OpenGL Error Reporting||Off|
|Negative LOD Bias||Allow|
|Texture Filtering||High Quality|
|Vertical Sync||Application Preference (all apps set off)|
I'm not personally much of a "gamer" in the popular sense, preferring the fast action of racing games to RPGs and FPSs. As such, I was unable to reach "the perfect scenes" for screenshots, but grabbed what I could instead. Though most X1950XTX, X1900GT and 7600GT screenshots looked comparable, one scene stood out:
At maximum quality settings in FEAR, the 7600GT (above) displayed the building on the left normally, but with little detail in the right-side roof or rear parking lot. Compare the X1950XTX, which displays a strange moiré effect on the building, but with greater detail in the right-side roof and rear parking lot. This is the only place I found where this visual effect occurred, and I've never seen it mentioned before. It affects both the X1900GT and X1950XTX.
The sample above would appear to show ATI with stronger AF and Nvidia with more accurate AA results. It's difficult to make any definitive statements regarding ATI vs. Nvidia in the remaining shots, but feel free to browse the photo gallery for argument's sake.