Many reviews analyze the minimum and maximum power consumption of a given graphics card. But just how much power does a high-end graphics card really need over the course of normal, everyday use? This long-term test sheds some light on that question.
We wanted to take a less conventional approach to a question that comes up in just about every graphics card review, and actually measure the power consumed from the moment you turn your computer on (say, for a gaming and email session) until you turn it off again.

Our feeling was that the usual extrapolations and estimates using minimum and maximum power readings don’t do justice to everyday operation. Therefore, we decided to measure the actual power consumption over a certain period of time and with different usage models, because most people do not just turn on their computers and play games without ever doing something else.
Defining that "something else" is actually rather important. As we measure and monitor power consumption over a longer testing period, we also add frequently-used programs and services to check whether or not they increase power consumption compared to true idle operation. In addition to games in windowed and full-screen modes, other hardware-accelerated tasks include video playback and D2D/D3D content in windowed mode.
Obviously, we are mostly interested in finding out the true, total power consumption in real life, rather than the peak load or idle values. This brings us to the core of today’s examination: it is no secret that powerful graphics cards are expensive, but do they really use that much more power? Will a gaming PC with a powerful 3D graphics board really bring your electricity bill up over time? If so, by how much?
We set out to do some testing, and it turned out to be a lot like mixed fuel consumption and mileage testing on a car. The difference is that we're representing our results in watt-hours instead of miles per gallon or kilometers per liter.
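Since the article reports energy in watt-hours rather than peak wattage, the arithmetic behind the cost figures can be sketched in a few lines. The numbers below (280 W while gaming, 110 W at idle, 13 ct/kWh) are purely illustrative placeholders, not measured values from this test:

```python
# Hypothetical sketch of the watt-hour cost calculation used in this
# article's profiles. Power draws and hours are made-up example values.

def energy_wh(avg_power_w: float, hours: float) -> float:
    """Energy in watt-hours: average power (W) times duration (h)."""
    return avg_power_w * hours

def yearly_cost_eur(daily_wh: float, price_per_kwh: float = 0.13) -> float:
    """Yearly electricity cost at a given rate (default: 13 ct/kWh)."""
    return daily_wh / 1000 * 365 * price_per_kwh

# An example day: 2 h of gaming at 280 W plus 6 h of light use at 110 W.
daily = energy_wh(280, 2) + energy_wh(110, 6)  # 560 + 660 = 1220 Wh
print(round(daily))                      # 1220
print(round(yearly_cost_eur(daily), 2))  # 57.89
```

The same structure scales to any of the usage profiles in the pages that follow: sum the per-application watt-hours over a representative day, then multiply out to a yearly bill.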
- The Cost Of High-End Graphics: Truly Expensive Or Just Exaggerated?
- Initial Idea And Power Consumption Definition
- Explanation Of The Calculation Method
- Creating The Application Usage Profiles
- Measuring Specific Power Consumption Per Application
- Test System And Measured Applications
- Base Configuration And Tested Video Cards
- Maximum/Minimum Power Measurements
- Power Analysis: The Gamer
- Power Analysis: The Average User
- Power Analysis: The Enthusiast
- Power Analysis: Average Energy Consumption
- Conclusion And Summary
Typo on the enthusiast graph. Calculations are correct, but it should be 13ct/kWh, not 22ct/kWh.
Also I can't imagine having 8 hours of gaming time every day. 5 hours even seems extreme. Sometimes, you just can't game AT ALL in a day, or a week.
Some people do have lives...
who cares....if you have the money to buy them you can pay for the electricity...it's just like SUVs, you have the money to buy them you can keep them running
i run my 480 sli rig to fold almost 24/7...do i care about my bill...HELL NO
By the way, space heater ftw!
It would be really useful to know what a folding setup running 24/7 costs. Perhaps one day you could use it to get a "Folding for the Future" tax credit on the books. Maybe Toms can lead the lobbying effort in Washington.
Compared to the 4000w, 240v industrial space heater I was using over Christmas, my computer will have to work all year to match the utility cost.
I second "space heater ftw!"
I am able to lower the heat in my Minnesota corner room thanks to the PC on the floor and the screens on the desk!
What I got from this article is that it really pays to have a power profile schedule and to put your computer in sleep mode when you're not using it. Use the Windows "balanced" power profile, and only switch to the high performance profile when you're gaming, number crunching, rendering, or video editing.