The Cost Of High-End Graphics: Truly Expensive Or Just Exaggerated?
We wanted to take a less conventional approach to a question that comes up in just about every graphics card review, and actually measure the power consumed from the moment you turn your computer on (say, for a gaming and email session) until the moment it is turned off again.
Our feeling was that the usual extrapolations and estimates based on minimum and maximum power readings don't do justice to everyday operation. We therefore decided to measure actual power consumption over a longer period of time and across different usage models, because most people do not just turn on their computers and play games without ever doing anything else.
Defining that "something else" is actually rather important. While measuring and monitoring power consumption over a longer testing period, we also run frequently-used programs and services to check whether they increase power consumption compared to true idle operation. In addition to games in windowed and full-screen modes, other hardware-accelerated tasks include video playback and D2D/D3D content in windowed mode.
Obviously, we are mostly interested in finding out the true, total energy consumption in real life, rather than the peak load or idle values. This brings us to the core of today’s examination: it is no secret that powerful graphics cards are expensive, but do they really use that much more power? Will a gaming PC with a powerful 3D graphics board really drive your electricity bill up over time? If so, by how much?
We set out to do some testing, and it turned out to be a lot like mixed fuel consumption and mileage testing on a car. The difference is that we're representing our results in watt-hours instead of miles per gallon or kilometers per liter.
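To make the watt-hour analogy concrete, here is a minimal sketch of the underlying arithmetic: energy is power integrated over time, and the electricity cost follows from the kilowatt-hour price. All of the numbers below (the power readings, the sampling interval, and the price per kWh) are hypothetical placeholders for illustration, not measurements from this article.

```python
# Illustrative sketch: deriving session energy in watt-hours from periodic
# wall-power readings, then estimating what that means for an electricity bill.
# All values here are hypothetical placeholders, not measured results.

power_samples_w = [65, 70, 250, 245, 240, 80, 70]  # hypothetical wall-power readings (watts)
sample_interval_s = 600                             # assumed: one reading every 10 minutes

# Energy is power integrated over time; with evenly spaced samples this is
# the sum of (power * interval), converted from watt-seconds to watt-hours.
energy_wh = sum(p * sample_interval_s for p in power_samples_w) / 3600.0

# Convert to kilowatt-hours and multiply by a hypothetical electricity rate.
price_per_kwh = 0.25  # currency units per kWh (assumption)
session_cost = (energy_wh / 1000.0) * price_per_kwh

print(f"Session energy: {energy_wh:.1f} Wh")
print(f"Estimated cost per session: {session_cost:.3f}")
```

Summed over hundreds of gaming sessions a year, even small per-session differences in watt-hours are what ultimately show up on the bill, which is exactly what the measurements in this article set out to capture.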