Pretty simple question that I've been really curious about: on your electric bill, how much more per year would it cost you (at your own $/kWh rate) to have a 125 W CPU in your rig compared to a rig with a 95 W CPU or a 77 W CPU?
As an example: AMD FX-4170 (125 W) vs. AMD FX-4100 (95 W) vs. an Ivy Bridge i5 (77 W).
First of all, actual CPU power consumption depends on the load on the CPU. Just because the TDP is 125 W does not mean the CPU will always consume 125 W. Most of the time your CPU and GPU will be idling; if you are simply surfing the web, your PC should not consume much power. On the other hand, if you are playing games, then at least two cores will be in use (very few games can use four cores), and your GPU will be running at full throttle.
See the following charts to get an estimate of power consumption.
If you're running at full power most of the time:
(difference in CPU power in watts) x (hours on per month) / 1000 = Z (in kWh)
Z x (cost per kilowatt-hour) = how much your bill goes up per month
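The two-step formula above can be sketched as a small function (the function name and parameter names are just mine for illustration):

```python
def monthly_cost_increase(watt_diff, hours_per_month, rate_per_kwh):
    """Extra monthly cost from the higher-power CPU.

    watt_diff: difference in CPU power draw, in watts
    hours_per_month: hours per month spent at that load
    rate_per_kwh: electricity price in dollars per kWh
    """
    kwh = watt_diff * hours_per_month / 1000  # this is Z in the formula
    return kwh * rate_per_kwh
```

For example, `monthly_cost_increase(50, 720, 0.12)` reproduces the 24/7 scenario worked out below.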
If you left the 125 W CPU running 24/7 (about 50 W more than the 77 W CPU):
24 x 30 = 720 hrs
50 W x 720 h = 36,000 Wh (watt-hours)
36,000 / 1000 = 36 kWh of electric power used.
My rate is 12 cents per kWh, so that's a whopping 4 dollar and 32 cent increase in my bill. Yes, that's right, only $4.32 for a whole month. Imagine if I shut it off 20 hrs a day, I might save $3. <sob> I couldn't even buy an extra Whopper... LoL
Now, some areas also have an electricity delivery charge on top of the per-kilowatt-hour usage charge.
Some areas have municipal power and pay really low rates. One near me only charges 3.1 cents per kWh, which would come to $1.12 in the above example.
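Plugging both rates into the arithmetic above (variable names are mine, and the 50 W gap is the rough TDP difference, not a measured draw):

```python
# 50 W of extra draw, running 24 hours a day for a 30-day month
extra_watts = 50
hours = 24 * 30                    # 720 hours
kwh = extra_watts * hours / 1000   # 36 kWh

print(f"${kwh * 0.12:.2f}")    # 12 c/kWh rate -> $4.32
print(f"${kwh * 0.031:.2f}")   # 3.1 c/kWh municipal rate -> $1.12
```

Either way, the monthly difference is lunch money, not a budget item.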