What Do High-End Graphics Cards Cost In Terms Of Electricity?

The Cost Of High-End Graphics: Truly Expensive Or Just Exaggerated?

We wanted to take a less conventional approach to a question that comes up in just about every graphics card review, and actually measure the power consumed between the moment you turn your computer on (say, for a gaming and email session) and the moment you turn it off again.

Our feeling was that the usual extrapolations and estimates using minimum and maximum power readings don’t do justice to everyday operation. Therefore, we decided to measure the actual power consumption over a certain period of time and with different usage models, because most people do not just turn on their computers and play games without ever doing something else.

Defining that "something else" is actually rather important. As we measure and monitor power consumption over a longer testing period, we also add frequently-used programs and services to check whether these increase power consumption compared to true idle operation. In addition to games in windowed and full-screen modes, other hardware-accelerated tasks include video playback and D2D/D3D content in windowed mode.

Obviously, we are mostly interested in finding out the true, total power consumption in real life, rather than the peak load or idle values. This brings us to the core of today’s examination: it is no secret that powerful graphics cards are expensive, but do they really use that much more power? Will a gaming PC with a powerful 3D graphics board really bring your electricity bill up over time? If so, by how much?

We set out to do some testing, and it turned out to be a lot like mixed fuel consumption and mileage testing on a car. The difference is that we present our results in watt-hours instead of miles per gallon or kilometers per liter.
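The arithmetic behind converting such measurements into a yearly bill is straightforward. Here is a minimal sketch; the wattages, daily hours, and usage profile below are hypothetical placeholders for illustration, not our measured values (only the 13 ct/kWh rate comes up later in the discussion):

```python
# Sketch: turn a daily usage profile into a yearly electricity cost.
# All wattages and hours below are hypothetical examples, not measurements.

def yearly_cost(profile, rate_per_kwh):
    """profile: list of (watts, hours_per_day) tuples; rate in $/kWh."""
    daily_wh = sum(watts * hours for watts, hours in profile)
    yearly_kwh = daily_wh * 365 / 1000.0   # watt-hours -> kilowatt-hours
    return yearly_kwh * rate_per_kwh

# Example: 4 h of gaming at 300 W plus 4 h of desktop work at 120 W,
# billed at 13 ct/kWh.
profile = [(300, 4), (120, 4)]
print(round(yearly_cost(profile, 0.13), 2))  # prints 79.72
```

The same function works for any profile, which is why measuring real watt-hours per session matters more than quoting peak or idle wattage alone.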

  • alikum
    Nvidia cards consume power like crazy
  • damric
    I don't get it. Are they saying that a GTX 480 will cost a hard core gamer $90/year in electricity? Seems like a drop in the bucket considering my power bills are over $90/month in the winter and over $250/month in the summer. Just think of all the money the hard core gamer saves from not having a girlfriend :D
  • scook9
    They are also neglecting the positive side effects like not needing a space heater in the winter....you recoup a lot of energy right there :D
  • porksmuggler
    ^Tell me about it, warmest room in the house right here. Turn the thermostat down, and boot the rig up.

    Typo on the enthusiast graph. Calculations are correct, but it should be 13ct/kWh, not 22ct/kWh.
  • jimslaid2
    Glad I bought the 6870 over the gtx 460 1g
  • aznshinobi
    The fact that you mentioned a porsche. no matter what the context. I love that you mentioned it :D
  • AMW1011
    So at worst, my GTX 480 is costing me $90 a year? Sorry if I'm not alarmed...

    Also I can't imagine having 8 hours of gaming time every day. 5 hours even seems extreme. Sometimes, you just can't game AT ALL in a day, or a week.

    Some people do have lives...
  • nebun
    alikum: "Nvidia cards consume power like crazy"
    who cares....if you have the money to buy them you can pay for the electricity...it's just like SUVs, you have the money to buy them you can keep them running
  • nebun
    AMW1011: "So at worst, my GTX 480 is costing me $90 a year? Sorry if I'm not alarmed... Also I can't imagine having 8 hours of gaming time every day. 5 hours even seems extreme. Sometimes, you just can't game AT ALL in a day, or a week. Some people do have lives..."
    i run my 480 sli rig to fold almost 24/7...do i care about my bill...HELL NO
  • Darkerson
    Very nice article! Keep it up!