Financial Intersection of Price and Power Consumption for Intel and AMD

ionosphere

Distinguished
Jul 25, 2010
I am looking for help choosing between two CPUs: the i5-3470 or the FX-6300. I am stuck on the power consumption part.

At first I was worried about power consumption, since AMD's listed power figures are a lot higher than Intel's. But after calculating some electric bills, I realized that the i5-3470's 77 watts versus the higher wattages of several AMD CPUs don't really have much impact. AMD is simply cheaper up front but adds a bit more to the electric bill in the long run.

A better question would be: when will the cost lines for Intel and AMD intersect? If I invest a bit more in Intel right now, I will use a bit less electricity. If I go with AMD, I save some money right away, but the lines will eventually cross and I will end up paying a bit more in the long run. I would like more details.

So my questions are as follows:

1. About the wattage: suppose we pay a typical US rate per kWh, and compare the i5-3470's 77 watts with the FX-6300's 95 watts. With prices of $150 and $120 for the two CPUs respectively, when does the i5-3470's lower power consumption start to become "a profit" over the FX-6300? Assuming I use the computer 24 hours a day, how many months will it take to make up the $30 price difference between the two processors?

2. What are the AMD choices equivalent to the i5-3470 in terms of performance? (Not a heavy gamer, some modern games on medium graphics settings, no video editing, just plain casual work.)
 
TheBigTroll

You buy a chip for its performance, not so much for the power consumption aspect. There is no AMD equivalent to the 3470 in terms of gaming performance. If you were talking about video work, the 6300 is about the same as the 3470.

You will have to overclock the AMD chip in order to match the performance of the 3470 in games.
 

jackson1420

Distinguished
May 10, 2010
Agreed with TheBigTroll. You should never purchase a CPU based on power consumption unless you are building a brand-new mobile device, and in that case you wouldn't be looking at the desktop series anyway.

The difference is so minimal it isn't even worth glancing at the stats and specs. Your GPU will consume the majority of your power; your CPU won't.

I wouldn't even compare AMD to Intel at this point. I have gone pure Intel until AMD can prove itself (and they are still falling behind).
 

DSzymborski

Curmudgeon Pursuivant
Moderator


18 watts for 24 hours is an additional kWh every 2.3 days. At the average US price of $0.12 per kWh, that's about $0.05 a day. So getting to $30 takes about a year and a half.

Obviously it's quicker in areas with higher electricity costs. For example, in Germany as of 2012, the average consumer price per kWh (including taxes, etc.) was the equivalent of $0.33. With those same assumptions, the German user would save $0.14 a day and reach $30 in about 210 days, or nearly 7 months. Over 3 years (again, with your assumptions of 24-hour-a-day use and a constant 18 W savings) that comes to $156.

In other words, it's not such a big deal in most of the US -- and even marginal in, say, New York, which has the highest rate in the lower 48 at $0.18 per kWh -- but a serious consideration where power is more expensive. The EU *average* comes out to the equivalent of $0.25 per kWh, with only a few countries (Bulgaria, Estonia, Romania) having electricity rates similar to the US. Denmark's is the highest, with the average consumer paying $0.38/kWh, more than three times what the average American consumer pays.
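A quick sketch of that arithmetic in Python, using the 18 W difference and the 24/7 usage assumption from the question, with the per-kWh rates quoted above (the exact break-even point will shift with your actual rate and duty cycle):

```python
# Break-even sketch for the 18 W difference between the i5-3470 (77 W TDP)
# and the FX-6300 (95 W TDP), run 24 hours a day. The per-kWh rates are the
# ones quoted in this thread; everything else is simple arithmetic.

WATT_DIFFERENCE = 18      # W, FX-6300 TDP minus i5-3470 TDP
PRICE_DIFFERENCE = 30.0   # USD, i5-3470 ($150) minus FX-6300 ($120)
HOURS_PER_DAY = 24        # the "always on" assumption from the question

def breakeven_days(rate_per_kwh: float) -> float:
    """Days until the lower-wattage chip has saved PRICE_DIFFERENCE dollars."""
    kwh_per_day = WATT_DIFFERENCE * HOURS_PER_DAY / 1000   # 0.432 kWh/day
    savings_per_day = kwh_per_day * rate_per_kwh
    return PRICE_DIFFERENCE / savings_per_day

for region, rate in [("US average", 0.12), ("New York", 0.18),
                     ("EU average", 0.25), ("Germany", 0.33), ("Denmark", 0.38)]:
    days = breakeven_days(rate)
    print(f"{region:10s} ${rate:.2f}/kWh -> {days:.0f} days (~{days / 30:.1f} months)")
```

At $0.12/kWh this prints roughly 580 days, and at $0.33/kWh roughly 210 days, matching the figures worked out above.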



 

DSzymborski

Curmudgeon Pursuivant
Moderator


Assuming he's in the US, of course. Depending on the country, it can potentially be a significant concern.
 

USAFRet

Titan
Moderator
Many, many other factors contribute to the (tiny) difference in overall running cost between one particular CPU and another.

You have to factor in idle vs full load wattage. My 3570k idles at ~8-9 watts. Given that the thing is at idle the majority of the time, actual power consumption is minimal.

And you have to take into account performance. If the AMD takes 6 seconds (at full power) to do a job that the Intel can do in 4 seconds (at full power) before returning to idle, which has 'cost more money'?
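To put rough numbers on that race-to-idle point (all wattages and durations below are hypothetical illustrations, not measurements of these specific chips), you can compare the energy each CPU uses to finish a job and then sit idle for the rest of a fixed window:

```python
# Toy comparison of energy used to complete one job and then idle out the
# remainder of a one-minute window. The load/idle wattages and job times
# here are made-up example figures, not measured data.

def job_energy_wh(load_watts: float, job_seconds: float,
                  idle_watts: float, window_seconds: float) -> float:
    """Watt-hours consumed over the window: full load during the job, idle after."""
    idle_seconds = window_seconds - job_seconds
    return (load_watts * job_seconds + idle_watts * idle_seconds) / 3600

WINDOW = 60  # one-minute window containing the job

# Hypothetical "faster" chip: finishes in 4 s at 77 W, idles at 9 W
cpu_a = job_energy_wh(load_watts=77, job_seconds=4, idle_watts=9, window_seconds=WINDOW)
# Hypothetical "slower" chip: finishes in 6 s at 95 W, idles at 12 W
cpu_b = job_energy_wh(load_watts=95, job_seconds=6, idle_watts=12, window_seconds=WINDOW)

print(f"CPU A: {cpu_a:.3f} Wh per window, CPU B: {cpu_b:.3f} Wh per window")
```

With these made-up figures the faster chip uses less energy per window even though it draws more at idle in other scenarios; swap in different load times or idle draws and the result can flip, which is exactly why the TDP number alone doesn't settle the cost question.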

Which chip runs hotter, in the specific case that it is in? More heat pumped into the room may be good (in a cold climate) or may cause more A/C use (in a hot climate). If one causes the A/C to kick on 3 minutes earlier, is that better or worse?

Don't stress over the literal pennies of difference here between one CPU and another. Get whichever suits your usage.