Why are today's video cards power hogs?


ulillillia

Distinguished
Jul 10, 2011
551
0
19,010
Compared to video cards from only 4 years ago, today's cards use far, far more power. Why is that? Take my GeForce 7600 GT, which was mid-range at the time: I'm seeing that it uses only about 36 watts. Today's mid-range cards use not 36, but 200 to 300+ watts, way more. What in the manufacturing or design causes today's video cards to require so much more electrical power than cards from only a few generations ago? Are manufacturers making design changes to make video cards more energy efficient (like getting back to the 50-to-100-watt realm for mid-range cards)?
 

joytech22

Distinguished
Jun 4, 2008
1,687
0
19,810
Today's video cards are significantly more complex than older cards.

They have hundreds of millions, even billions, more transistors than cards from years ago, plus more memory, more complexity, more features, etc.

There is always going to be a trade-off.
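Not something from joytech22's post, but a rough way to see why billions more transistors translate into more watts: the standard CMOS dynamic-power approximation P ≈ α·C·V²·f grows roughly with the number of switching transistors and with clock speed. The Python sketch below uses made-up ballpark figures purely for illustration, not measured values for any real GPU.

# Rough CMOS dynamic-power illustration: P ~ alpha * C * V^2 * f,
# where total capacitance C scales roughly with transistor count.
# Every number here is an invented ballpark, not real GPU data.

def dynamic_power(transistors, cap_per_transistor_f, voltage_v, freq_hz, activity=0.1):
    """Approximate switching power in watts."""
    total_capacitance = transistors * cap_per_transistor_f  # farads
    return activity * total_capacitance * voltage_v ** 2 * freq_hz

# Hypothetical "old" mid-range GPU: ~180 million transistors, 560 MHz
old = dynamic_power(180e6, 1e-15, 1.2, 560e6)
# Hypothetical "new" high-end GPU: ~3 billion transistors, 800 MHz
new = dynamic_power(3e9, 1e-15, 1.0, 800e6)
print(f"old-style GPU: ~{old:.0f} W, new-style GPU: ~{new:.0f} W")

With these invented inputs the transistor count alone moves the result from the tens of watts into the hundreds, which is the same order-of-magnitude jump being discussed in this thread.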
 

ulillillia

Distinguished
Jul 10, 2011
551
0
19,010
If that's the case, then why don't CPUs use any significant amount of power? My Core i7-2600K probably uses less power (it's not overclocked... yet) than my GeForce 7600 GT. I don't know how much power each individual component uses, so it's hard to say.

Edit: I'm seeing that my current card uses 39.6 watts under full load and a puny 19.8 watts when idling. My processor uses 76 watts when idling, so it's using 4 times as much as my video card. A GeForce GTX 550 Ti, the kind of card I'm looking into, appears to use a rather crazy 139 watts at idle, 7 times as much as my current card, and, from what I'm seeing on Wikipedia, it's only twice as powerful as far as pixels and texels go.
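Taking the post's numbers at face value (they turn out further down the thread to be whole-system readings), the quoted ratios check out as rough arithmetic; a quick Python sanity check:

# Ratios from the figures quoted above; these later turn out to be
# whole-system idle readings rather than per-component ones.
card_idle = 19.8        # W, 7600 GT at idle
cpu_idle = 76.0         # W, figure quoted for the i7-2600K at idle
gtx550ti_idle = 139.0   # W, figure quoted for the GTX 550 Ti at idle

print(cpu_idle / card_idle)       # ~3.8, i.e. roughly 4x
print(gtx550ti_idle / card_idle)  # ~7.0, i.e. roughly 7x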
 

Something is amiss; my 2600K uses about 5 W when idling according to HWMonitor, and the graphics cards use a bit more, but it's nowhere near three figures.
 

ulillillia

Distinguished
Jul 10, 2011
551
0
19,010
I didn't catch that when I viewed AnandTech and bit-tech - those figures are for the entire system when idling:

http://www.anandtech.com/show/4221/nvidias-gtx-550-ti-coming-up-short-at-150/16
http://www.bit-tech.net/hardware/graphics/2011/03/15/nvidia-geforce-gtx-550-ti-1gb-review/10

I'm only considering the 550 Ti because, with the 7600 GT that I currently have, I cannot play back H.264-encoded 1920x1080 video at 29.97 fps in VirtualDub (no editing being done either, just open and play; I can record faster than I can play back, which is rather ironic since it's normally the other way around). I don't do gaming on my computer, although I'm making a game of my own (one that could run fine on cards even 3 generations before my current one, though only at minimum settings).
 
There is a simple reason why GPUs have become more power hungry...because they can.

The AGP and PCI slots did not provide much power to work with. The PCI-E slot provides 75 W, and auxiliary power cables were designed so a card could draw up to 300 W, which gave designers a lot more headroom to work with. GPUs are vastly more efficient per watt than older generations were, but they are also much larger and more powerful, so total power consumption has increased.
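To put a number on that headroom, here is a quick tally in Python of where the 300 W ceiling comes from. The 75 W slot figure is from the post above; the 75 W per 6-pin and 150 W per 8-pin values are the commonly cited PCI-E auxiliary connector ratings, added here for illustration.

# Rough power budget available to a PCI-E graphics card, in watts.
# Slot figure is from the post; connector figures are the commonly
# cited PCI-E auxiliary ratings.
budget = {
    "PCI-E x16 slot": 75,
    "6-pin auxiliary connector": 75,
    "8-pin auxiliary connector": 150,
}
print(sum(budget.values()))  # 300 W of headroom for the card designer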


CPUs ran into a similar increase in power consumption with the switch to powering them from the 12 V rail instead of the 5 V rail; the 8-pin CPU power connector allows for CPUs up to 150 W. The Pentium 4 led to the introduction of the CPU power connector (similar to the PCI-E power connector): the most powerful Pentium III used only 38 W, the same as the least powerful desktop Pentium 4, while the most power-hungry Pentium 4 used up to 115 W.
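A quick sketch of why the move to the 12 V rail mattered: for a fixed wattage, a higher supply voltage means less current through the connector pins (P = V × I). The 150 W figure comes from the post above; the rest is only illustrative arithmetic.

# P = V * I: current needed to deliver 150 W from each rail.
# 150 W comes from the post; the comparison itself is only illustrative.
for rail_voltage in (5.0, 12.0):
    current = 150.0 / rail_voltage
    print(f"{rail_voltage:>4.1f} V rail: {current:.1f} A for 150 W")
#  5.0 V rail: 30.0 A -- a lot of current for small connector pins
# 12.0 V rail: 12.5 A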


When power limits get raised, manufacturers can create a much more powerful component, even if they don't manage to raise performance per watt significantly.
 
Solution