In many cases, the graphics card is the most power-hungry component in a PC. The enthusiast community is no stranger to CPU tweaking, so why hasn't GPU modification caught on? We're going to see just how much you stand to gain (or lose) from tweaking.
Introduction
An increasing number of enthusiasts are becoming aware that their GPUs are the primary consumers of power in their PCs. For some, especially for our readers outside of the U.S., power consumption is an important factor in choosing a graphics card, along with performance, price, and noise levels. At the same time, as we begin to focus more intently on GPU power consumption, it is disconcerting to see the real cost of owning a high-end graphics card. If you read What Do High-End Graphics Cards Cost In Terms Of Electricity?, then you already know what we’re talking about.
Now, you're probably wondering what you can do to help alleviate the issue. Graphics vendors like AMD and Nvidia build in technologies that help cut power use during idle periods, but the only surefire way to slash consumption is using a mainstream graphics card instead of a high-end model. At the end of the day, simpler cards based on less-complex GPUs require less power than their high-end siblings.
You end up making sacrifices when you give up the displacement of a big graphics engine, though. Most mainstream cards don't offer enough performance to play the latest games at the highest resolutions using the most realistic detail settings. If you want to play games completely maxed out, a high-end card is the only option.
Is there really no alternative to using mainstream graphics cards for the power-conscious? What if there was a way to manually cut down the power consumption of faster graphics cards? These are questions we ask (and try to answer) today.
A Short Overview of GPU Power Management
Although add-in graphics cards for desktop PCs don't curb power consumption as aggressively as discrete notebook GPUs, they still employ power management. In fact, power management technologies for desktop graphics cards have been available for quite a while. They usually manifest themselves as separate clocks for 2D (desktop) mode and 3D mode. Think of these as P-states on modern processors. With the availability of hardware-accelerated video playback, vendors have also added a new mode for video playback.
What's missing on most graphics cards is an option to limit power consumption, similar to what the “Power saver” preset in the Windows Control Panel does for CPUs. Enable a setting like that and the graphics card would run at lower clocks to keep power consumption down. There are new approaches to this problem, though. AMD introduced PowerTune for its Radeon HD 6900-series cards. We’ll look into the effectiveness of this capability later in this piece.
Finding the Right Combination
Lowering operating clocks is one way to reduce power consumption. This is as true for GPUs as it is for CPUs. However, clock speed (core and memory) is only one part of the equation. As with CPUs, the graphics processor’s operating voltage also plays a role.
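The relationship between clocks, voltage, and power can be sketched with the standard CMOS dynamic-power approximation, where power scales roughly with the square of voltage and linearly with frequency. The operating points below are purely illustrative assumptions, not measurements from any card in this article:

```python
# Estimate relative dynamic power using the standard CMOS
# approximation P ~ C * V^2 * f. The clock and voltage figures in the
# example are hypothetical, chosen only to illustrate the scaling.

def relative_dynamic_power(v_new, f_new, v_stock, f_stock):
    """Return new dynamic power as a fraction of stock (P ~ V^2 * f)."""
    return (v_new / v_stock) ** 2 * (f_new / f_stock)

# Hypothetical example: lowering a core from 850 MHz at 1.16 V
# to 700 MHz at 1.00 V.
fraction = relative_dynamic_power(1.00, 700, 1.16, 850)
print(f"Dynamic power at reduced settings: {fraction:.0%} of stock")
```

Because voltage enters the equation squared, even a modest undervolt pays off more than a comparable clock reduction alone, which is why we're interested in tweaking both.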
If we wanted to limit or lower power consumption, couldn't we just manually set clocks and voltages? It’s actually really easy to modify frequencies using vendor-provided utilities and third-party software. And why not? Finding the right combination of clock and voltage can offer significant power savings.
Altering voltages is a different matter. Most graphics cards don't offer an easy way to adjust voltage. In fact, after certain individuals blew up their GeForce GTX 590s using unrealistic voltage settings, Nvidia locked out voltage manipulation on those cards altogether. It’s not clear whether that restriction will apply only to the GTX 590 or to a broader sampling of the company’s portfolio, but it demonstrates the harm that can come from too much tinkering.
How about the other cards out there that can still be modified? Unfortunately, voltage adjustments are limited to 3D mode. Most cards do not offer a way to adjust voltages at idle or in intermediate modes.
But today we’re going to perform an experiment. We're going to see just how much power we can save by lowering clocks and voltages. In the process, we’re going to measure the associated performance hit to gauge whether those changes are worthwhile. We’ll be using two cards: AMD’s Radeon HD 5870 and 6970, with a 5770 for comparison.
- Saving Power On AMD Graphics Cards
- The Tests, Test Setup, And A Side Note
- Test System
- The First Experiment: AMD’s Radeon HD 5870
- Benchmark Results: Cinebench R11 And PowerDirector 9
- Benchmark Results: PowerDVD 9 And Desktop Idle
- AMD’s Radeon HD 6970
- Benchmark Results: Cinebench R11 And PowerDirector 9
- A Quick Gaming Test: Crysis
- A Quick Gaming Test: Medal Of Honor
- PowerTune: Taming Cayman
- Conclusion: Doable, But No Walk In The Park



This is neat though
I have two graphics cards pushing three displays, but I'm all for saving watts wherever I can. Our society has advanced to the point where sustainability is a very important buzzword, yet it is widely ignored by mainstream media and many corporations, and this ignorance trickles down to the mainstream like Reaganomics. Minuscule reductions, such as a 30 W saving across hundreds of thousands if not millions of users, add up to a significant reduction in carcinogenic emissions and save valuable resources for future consumption.
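The aggregate-savings arithmetic in the comment above is easy to check. Both figures (30 W per machine, one million users) are the commenter's hypotheticals, not measured data:

```python
# Scale a per-machine saving up to a large user base. Both inputs are
# the commenter's hypothetical figures, not measurements.

watts_saved = 30
users = 1_000_000
total_megawatts = watts_saved * users / 1_000_000
print(f"{total_megawatts:.0f} MW of continuous demand avoided")
```

Thirty megawatts is on the order of a small power plant's output, which is the commenter's point: tiny per-user savings scale.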
I want to know because, for instance, in a raid I'd sometimes watch video content on another screen while waiting around for whatever there is to wait for. I already lose the CrossFire performance because of windowed mode. I don't want to lose even more.
Does my ancient 4870 X2 support UVD?
With proper engine mapping, good throttle control, weight reduction from the materials used, losing the carpets, spare tyre, radio, etc., and the aerodynamic design of a Ferrari, it isn't as stupid as you think.
At a full eight hours per day, every day for a month, you're talking $2.40, or basically the cost of one 12 oz. latte at Starbucks.
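As a sanity check on that figure, here is the arithmetic spelled out. The 100 W draw and $0.10/kWh rate are assumptions chosen so the numbers work out to the commenter's total, not measurements from any test rig:

```python
# Back-of-envelope check of the monthly cost figure above. The wattage
# and electricity rate are illustrative assumptions.

watts = 100            # assumed extra draw of a high-end card
hours_per_day = 8
days = 30
rate_per_kwh = 0.10    # assumed electricity rate in dollars

kwh = watts * hours_per_day * days / 1000   # energy used per month
cost = kwh * rate_per_kwh                   # monthly cost in dollars
print(f"{kwh:.1f} kWh -> ${cost:.2f} per month")
```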
I typically use CCC's Overdrive to manually reduce memory speed while in 2D mode, which works fine. Despite having been manually set, whenever the card senses UVD playback (be it Flash, DivX, etc.), the memory frequency changes to its default speed (1,000 MHz on the 4850, 975 MHz on the 4890). Now I know why: once UVD mode is active, its clocks reign supreme. It seems the only way to change that is to mod the BIOS.
When it comes to general users, the end result of all this tweaking unfortunately seems to save only rather minuscule amounts of power.
Those are performance modifications; less gas consumption might be a side effect, but it is not the goal.
The 590 is stupid, the 430 is godly.