Power consumption and electrical costs

March 1, 2011 11:58:42 AM

I'm building a new machine for myself, but strangely, I'm concerned about power efficiency versus electrical costs. It's weird, I know. For example, one of my problems is deciding on a CPU; I had my mind set on the Phenom II X4 965 BE. It's only $160 or so, but then I see it's rated at 125W while some lower Phenom X4s are rated at just 95W. To me, the price of the 965 is well worth it for its speed, but since it's rated 30 watts higher, what is the extra cost on the electricity bill after a year's worth of usage? $20-$200 over a 95W chip? Ballpark? What about after several years? Suddenly, it's not just a flat $60 difference between the CPUs. (A rough sketch of this kind of math is at the end of this post.)

This also got me thinking about other components, like motherboard and RAM power consumption. Of course lower consumption is better, and a lot of component reviews do state how many watts a device uses. I would just like to know an average figure for what a person saves for a given wattage difference. This would help me decide what is most cost-effective to buy, in regard to how much power/speed I actually need.

If anyone knows about this kind of stuff, please drop your opinion/advice. Or if you have a link that goes into detail about it, I'd appreciate it.
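Here is a minimal back-of-the-envelope sketch of the CPU question, not a measurement: it assumes the full 30W TDP difference were drawn around the clock, and an assumed electricity rate of $0.10 per kWh (both are guesses, not figures from the thread).

# Worst-case sketch: extra yearly cost of a 30W higher TDP,
# assuming continuous full load and an assumed $0.10/kWh rate.
extra_watts = 30            # 125W TDP minus 95W TDP
hours_per_day = 24          # running flat out all day, every day
rate_per_kwh = 0.10         # assumed utility rate in dollars

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"Extra cost per year: ${kwh_per_year * rate_per_kwh:.2f}")
# -> Extra cost per year: $26.28

In practice the CPU sits well below its TDP most of the time, so the real-world difference would be some fraction of that worst case.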
March 1, 2011 12:06:28 PM

Hello allen200;
That 125W is the thermal dissipation load, not the actual wattage the CPU uses at all times.
Is this a gaming machine? What video card(s) will you be using?
March 1, 2011 12:33:58 PM

WR2 said:
Hello allen200;
That 125W is the thermal dissipation load, not the actual wattage the CPU uses at all times.
Is this a gaming machine? What video card(s) will you be using?


A Radeon HD 5670; it seems to be the strongest card that doesn't require external power. Right up my alley.

CopaMundial said:
Right here on Tom's Hardware they have a pretty interesting article on this topic.
What Do High-End Graphics Cards Cost In Terms Of Electricity?


OK, I will have a look.

Do either of you know how many more watts 2x 4GB uses versus 2x 2GB of DDR3 memory?
March 1, 2011 12:40:52 PM

Have you switched over from incandescent to fluorescent lighting?
Just swapping out one 150W incandescent bulb that's used 8 hours a day will more than make up for any extra energy needs of an upgraded quad-core/8GB PC being used 4-6 hours a day.

How Much Power Does Low-Voltage DDR3 Memory Really Save?
http://www.tomshardware.com/reviews/lovo-ddr3-power,265...
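To put a rough number on that bulb comparison, here is a quick sketch with assumed figures (a 150W incandescent running 8 hours a day versus an assumed 40W increase in PC draw for 6 hours a day; the PC figure is a guess, not something measured in the thread):

# Sketch of the lighting comparison, with assumed figures.
bulb_wh_per_day = 150 * 8        # 1200 Wh/day saved by replacing the bulb
pc_extra_wh_per_day = 40 * 6     # 240 Wh/day added by the assumed PC upgrade
print(bulb_wh_per_day, "Wh/day saved vs", pc_extra_wh_per_day, "Wh/day added")
# The bulb swap saves roughly five times what the upgrade adds.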
March 1, 2011 2:00:49 PM

WR2 said:
Have you switched over from incandescent to fluorescent lighting?
Just swapping over 1x 150w incandescent that's used 8hrs a day will more than make up any energy needs for an upgraded quad core/8GB PC being used 4-6hrs a day.


That's very true, and I switched my lights over years ago :)

I heard that each GB of RAM takes about 10 watts. If that's true (which I have no clue about), the extra 4GB would be a 40-watt difference in memory alone, on top of the rest of the system's consumption. That would be like having a 40W light bulb on continuously while your PC is powered on? I'm not an electrician, but let's do some fuzzy math. I guess this will help me answer some of my own questions too.

I don't know what my power company charges per kWh, but I'm going to guess it's around $0.10; this might be on the high end.

That works out to $0.0001 per watt-hour.

40W = $0.004/hour (cheap).

My PC is on about 17 hours a day.

So that extra 4GB of RAM would cost $0.068 per day, or $24.82 per year.

I guess that's OK.
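The same fuzzy math, written out so the guesses are explicit and easy to swap (the 40W figure and the $0.10/kWh rate are still just assumptions):

# Parameterized version of the estimate above.
def yearly_cost(watts, hours_per_day, dollars_per_kwh=0.10):
    kwh_per_day = watts / 1000 * hours_per_day
    return kwh_per_day * dollars_per_kwh * 365

print(f"${yearly_cost(40, 17):.2f} per year")  # -> $24.82 per year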
March 1, 2011 2:26:39 PM

allen200 said:
I heard that each GB of RAM takes about 10 watts.
That would be per stick of 1GB/2GB/4GB RAM, and maybe only if it's old 2.6V DDR RAM.
DDR2 RAM uses about 1.8-2.2V and DDR3 uses about 1.5-1.65V.
40W just seems way out of line to me.
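For a sanity check on that 40W figure, here is a sketch using an assumed ballpark of roughly 3W per DDR3 DIMM (an assumption for illustration, not a measured value; the linked article has real numbers):

# Redoing the RAM estimate with an assumed ~3W per DDR3 DIMM.
watts_per_dimm = 3                 # assumed ballpark, not measured
sticks = 2                         # a 2x 4GB kit
total_watts = watts_per_dimm * sticks
yearly_kwh = total_watts / 1000 * 17 * 365   # 17 hrs/day, as above
print(total_watts, "W total,", f"${yearly_kwh * 0.10:.2f}/year at an assumed $0.10/kWh")
# -> 6 W total, about $3.72/year - nowhere near a constant 40W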