
AMD & Intel power consumption: Do I pay more with AMD?

March 4, 2012 8:15:45 PM

I've been looking at an AMD FX-6100 build for quite some time now.

As it's a cost-driven project, will I be paying more on the electricity meter over time with the Bulldozer chip than with, say, an Intel i3 or i5? Will it be significant?
March 4, 2012 8:21:50 PM

It draws around 2-3x the amount of power.
March 4, 2012 8:47:31 PM

amuffin said:
It draws around 2-3x the amount of power.


Ah OK.

I've been running http://extreme.outervision.com/psucalculator.jsp for each build I've been considering, taking my current PSU and so forth into account.

Both the i5-2300 and the FX-6100 builds have a recommended PSU wattage of 353W. The i3-2120 build has a recommended PSU wattage of 323W.

But then, perhaps power draw and PSU wattage are two different things entirely. Even if the Bulldozer needs more power out of the socket, will it be significant on my utility bills?
March 4, 2012 8:48:51 PM

The AMD FX-6100 and a comparable i3 or i5 CPU consume about the same amount of power when idling; the FX-6100 consumes marginally less. This represents most typical situations, such as just surfing the web or typing up a Word document.

[Chart: total system power consumption at idle]

With one core in use, power consumption on the FX-6100 starts ramping up. Encoding audio typically uses one core.

[Chart: total system power consumption under single-core load]

If you are doing video encoding or anything else that pushes the CPU to 100%, power consumption really goes up for both the i5 and the FX-6100, with the FX-6100 using the most. The i3-2120, being just a dual-core CPU, does not consume as much power as the quad-core CPUs.

[Chart: total system power consumption at 100% load]

The above is for total system power consumption. Of course, if the test included playing games, total power consumption would increase because of the video card, and how much power a video card uses when playing a game depends on the card. A Radeon HD 5450 consumes about 18 W under load, though it is a poor choice for playing video games. A monster Radeon HD 6990, for those people who want one of the best-performing cards and can afford the price tag, consumes up to 404 W by itself.


Comparing the i3-2120 and FX-6100 CPUs at 100% load (about a 50 W difference), for 40 hours a week, 52 weeks per year, the FX-6100 will consume 104 kWh more than the Core i3-2120. Assuming you pay $0.10 per kWh, that works out to an extra $10.40 per year.
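The arithmetic above generalizes easily, so here is a minimal sketch (using the thread's own assumptions: a 50 W gap, 40 hours/week, 52 weeks, $0.10 per kWh) that you can rerun with your own local electricity rate:

```python
def yearly_cost(watt_delta, hours_per_week, price_per_kwh, weeks=52):
    """Extra energy (kWh) and cost ($) per year for a given wattage gap."""
    kwh = watt_delta * hours_per_week * weeks / 1000  # Wh -> kWh
    return kwh, kwh * price_per_kwh

# FX-6100 vs i3-2120 at full load, per the numbers in this thread
kwh, cost = yearly_cost(50, 40, 0.10)
print(f"{kwh:.0f} kWh extra, ${cost:.2f} per year")  # 104 kWh extra, $10.40 per year
```

Plugging in a higher rate (say $0.25 per kWh) still only yields about $26 a year, which gives a sense of how small the difference is for a desktop under typical use.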

All the above charts are from the following review, which also includes performance benchmarks for various applications and games.

http://www.xbitlabs.com/articles/cpu/display/amd-fx-812...
March 4, 2012 9:05:41 PM

Thanks for your help Jag.

I was curious to note how the efficiency would translate into utility bills. I don't think I'm that cheap.
March 4, 2012 10:48:36 PM

If you are concerned with power usage, don't upgrade to an FX-8150! It draws twice as much power as an i7-2600K!
Anonymous
March 5, 2012 5:29:24 AM

jaguarskx said:

All the above charts are from the following review which also has performance benchmarks of various applications and games as well.

http://www.xbitlabs.com/articles/cpu/display/amd-fx-812...


Thanks for that post; bookmarked the article.