Curbing Your GPU's Power Use: Is It Worthwhile?

A Quick Gaming Test: Crysis

After seeing the results we generated using desktop-oriented apps, we were interested to see how badly our modifications would affect gaming performance. Our APAC lab already returned its Radeon HD 5870, so we're sticking to the Radeon HD 6970 for this brief exploration.

We're using the same settings as our last article: the CPU benchmark under DirectX 9 (with the high-quality preset) at several resolutions (1024x768, 1280x720, 1680x1050, and 1920x1080).

We picked the CPU benchmark instead of the default GPU test for two reasons. First, its point of view is closer to the real gaming experience. Second, the average frame rate from this test corresponds well to the performance you see throughout the single-player campaign.

Given the results, it's clear that higher resolutions are better for testing because they place more of a burden on the GPU. Keep in mind that the base system's power consumption with integrated Radeon HD 3300 graphics is about 121 watts. So, between 170 and 180 watts of the 300-watt result we measure is attributable to the graphics card.
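The attribution above is simple subtraction: measured system draw minus the base system's draw with integrated graphics. A quick sketch of that arithmetic, using the article's round numbers (illustrative figures, not new measurements):

```python
# Estimate the graphics card's share of total system power draw.
# Figures taken from the article; actual readings vary run to run.
base_system_w = 121    # system draw with integrated Radeon HD 3300 graphics
total_system_w = 300   # measured draw while gaming on the Radeon HD 6970

card_w = total_system_w - base_system_w
print(f"Power attributable to the graphics card: ~{card_w} W")  # ~179 W
```

That lands at the top of the 170-to-180-watt range quoted above; the spread reflects measurement noise and varying GPU load across resolutions.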

If you're willing to take the 15% frame rate hit, you'll get 50 to 85 watts lower peak power consumption, and a 27 to 55 watt reduction in average power consumption. Lowering the card's operating voltage is the most effective way of achieving those savings. Reducing memory frequencies helps a bit too, particularly when it comes to dropping peak power consumption.

The numbers above show that AMD's Radeon HD 6970 consumes between 70 and 95 watts, about half of its original power consumption at full speed (170 to 180 watts). If this applies to all games, and not just Crysis, that's not a bad tradeoff. You still get higher, more playable frame rates with the underclocked Radeon HD 6970 than with a Radeon HD 5770, and power consumption is about the same.
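Putting the two figures together (a 15% frame rate penalty for roughly half the card-level power) gives a rough performance-per-watt comparison. A minimal sketch, assuming the midpoints of the article's power ranges and treating the 15% hit as uniform across resolutions:

```python
# Rough perf-per-watt comparison: stock vs. underclocked Radeon HD 6970.
# Power values are midpoints of the article's ranges (assumptions, not
# measurements); frame rates are relative, with a 15% underclock penalty.
stock_power_w = 175.0   # midpoint of 170-180 W (stock)
tuned_power_w = 82.5    # midpoint of 70-95 W (underclocked/undervolted)
fps_stock = 1.00        # relative frame rate, stock
fps_tuned = 0.85        # relative frame rate after the 15% hit

eff_stock = fps_stock / stock_power_w
eff_tuned = fps_tuned / tuned_power_w
print(f"Efficiency gain from tuning: {eff_tuned / eff_stock:.2f}x")
```

Under those assumptions the tuned card delivers roughly 1.8x the frames per watt of the stock configuration, which is why the tradeoff looks attractive despite the raw performance loss.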