Higher clock speed vs. higher load: which uses more power?

SithNinjaTyler

May 6, 2014
So, I have a question. Which of these scenarios would use more power:

A CPU at a higher clock speed and a lower load
or
A CPU with a lower clock speed and a higher load

And then, would splitting the load between more cores use more power, or less? I'm curious because I'd like to apply this to my Android tablet (sorry if this is in the wrong section). It's very slow, but it is rooted. If I set a higher clock speed, CPU usage is lower, but if I set a lower clock speed, CPU usage goes up. Also, it's a quad-core, so I'd like to know whether splitting the load between the cores would be better or worse for power consumption.
 
A higher clock requires a higher voltage, so power draw goes up roughly cubically with clock speed while the work done per second only goes up linearly (dynamic power is roughly proportional to C·V²·f, and V has to rise along with f).

More cores give a roughly linear increase in power consumption for a linear increase in work done.

A lower clockspeed, down to a point, gives a roughly cubic reduction in power draw for the work being done.
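
To make that scaling concrete, here is a minimal sketch in Python (my own illustration, not anything from this thread; all the constants are placeholder assumptions) of the usual dynamic-power model P ≈ C·V²·f with voltage assumed to rise in step with clock:

```python
# Dynamic CPU power is often modeled as P ~ C * V^2 * f, and voltage has to
# rise roughly in step with frequency, so power grows roughly with f^3 while
# throughput only grows with f. All constants are made-up placeholders.

BASE_FREQ_GHZ = 1.0      # hypothetical reference clock
BASE_VOLTAGE = 0.9       # hypothetical voltage at the reference clock
CAPACITANCE = 1.0        # arbitrary switched-capacitance constant


def dynamic_power(freq_ghz):
    """Relative dynamic power, assuming voltage scales linearly with clock."""
    voltage = BASE_VOLTAGE * (freq_ghz / BASE_FREQ_GHZ)
    return CAPACITANCE * voltage ** 2 * freq_ghz


base_power = dynamic_power(BASE_FREQ_GHZ)
for f in (1.0, 1.5, 2.0, 3.0):
    rel_power = dynamic_power(f) / base_power               # grows ~f^3
    rel_energy_per_work = rel_power / (f / BASE_FREQ_GHZ)   # grows ~f^2
    print(f"{f:.1f} GHz: power x{rel_power:.2f}, "
          f"energy per unit of work x{rel_energy_per_work:.2f}")
```

Under those assumptions, tripling the clock costs about 27x the power for only 3x the throughput, which is the cubic relationship described above.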


I made some charts of the power consumption of my Ivy Bridge i5 at various clockspeeds:

[Chart: power consumption at various clockspeeds]



Here is efficiency of the CPU at various clockspeeds - notice how much efficiency drops as clocks increase:

[Chart: CPU efficiency at various clockspeeds]



Computers have a lot of fixed power draws, such as fans, hard drives, the motherboard, etc., and when you factor those into the efficiency equation, below a certain clockspeed you stop saving energy, because the extra time you take to get something done is offset by those fixed draws running the whole time:

[Chart: whole-system efficiency at various clockspeeds, including fixed power draws]



^ Looking at this, it's no surprise that Intel set the stock clockspeed of most 22nm CPUs around 2.7-3.3 GHz, with turbo no higher than 3.9 GHz.
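
A toy model of why that last curve bottoms out where it does, with wattages and a job length I've assumed purely for illustration (they are not taken from the i5 measurements above):

```python
# Toy model: total energy per task when the rest of the system draws a fixed
# amount of power for as long as the task runs. Task time ~ 1/f, CPU power ~ f^3.

FIXED_SYSTEM_POWER_W = 20.0   # hypothetical fans, drives, motherboard, etc.
CPU_POWER_AT_1GHZ_W = 5.0     # hypothetical CPU draw at 1 GHz


def energy_per_task(freq_ghz, seconds_at_1ghz=60.0):
    cpu_power = CPU_POWER_AT_1GHZ_W * freq_ghz ** 3   # rough cubic scaling
    runtime = seconds_at_1ghz / freq_ghz              # faster clock finishes sooner
    return (cpu_power + FIXED_SYSTEM_POWER_W) * runtime  # joules


for f in (0.5, 1.0, 1.5, 2.0, 3.0, 4.0):
    print(f"{f:.1f} GHz: {energy_per_task(f):.0f} J per task")
# The minimum lands at a middling clock: go too low and the fixed draws
# dominate, go too high and the cubic CPU power dominates.
```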
 
The easiest way to determine that is probably testing battery life in both situations with everything else that can affect battery life disabled. Trying to figure it out mathematically would be more difficult because there are other factors involved as well, like the exact load/frequency differences, the specific tablet in question, and more.

Generally speaking, a higher clock speed will increase power consumption. If the load is constant, then a higher clock speed with a lower load (as a percentage of CPU time) will usually result in higher power consumption.

Spreading a job across multiple cores can allow for the cores to run at a lower clock speed and this usually means lower power consumption, especially if CPU voltage can drop as well. Again though, this can depend on the specific tablet and the workload.
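
As a back-of-envelope illustration of that last point, here is a sketch under deliberately idealized assumptions (perfect parallel scaling, voltage proportional to clock, static power ignored; none of it measured from a real tablet):

```python
# One core at full clock vs. the same job split across four cores at a quarter
# of the clock, assuming perfect parallel scaling and V ~ f.

def relative_dynamic_power(num_cores, clock_fraction):
    """Dynamic power relative to one core at full clock, assuming V ~ f."""
    return num_cores * clock_fraction ** 3


print(f"1 core at full clock:     x{relative_dynamic_power(1, 1.0):.3f}")
print(f"4 cores at quarter clock: x{relative_dynamic_power(4, 0.25):.3f}")
# Same theoretical throughput at a small fraction of the dynamic power -- in
# practice static power, shared resources, and imperfect parallelism eat into
# this, which is why the real answer depends on the tablet and the workload.
```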
 
Solution
As blazorthon says, it's definitely a complicated matter. There's an idea called "race to idle", where CPUs ramp up to get things done as quickly as possible, so the rest of the system can go into a low power state as quickly as possible. Even though the CPU is far less efficient in that state, you spend less time with the motherboard, RAM, HDD, screen, etc. powered, which can result in net power savings.
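
A minimal sketch of that trade-off, using placeholder wattages and times I've assumed just to show how the accounting works:

```python
# Over the same wall-clock window: finish fast and let the platform drop into
# a deep idle state, vs. running slowly (and more efficiently) the whole time.
# All wattages and times below are placeholder assumptions.

WINDOW_S = 60.0            # wall-clock window we account energy over
PLATFORM_ACTIVE_W = 15.0   # screen/RAM/board power while the task is running
PLATFORM_IDLE_W = 2.0      # deep-idle platform power once the work is done


def window_energy(cpu_watts, busy_seconds):
    busy = (cpu_watts + PLATFORM_ACTIVE_W) * busy_seconds
    idle = PLATFORM_IDLE_W * (WINDOW_S - busy_seconds)
    return busy + idle  # joules over the whole window


print(f"race to idle (8 W CPU, done in 20 s): {window_energy(8.0, 20.0):.0f} J")
print(f"slow and steady (2 W CPU, busy 60 s): {window_energy(2.0, 60.0):.0f} J")
```

With these made-up numbers the fast-and-idle strategy wins even though the CPU itself runs less efficiently, because the screen and the rest of the platform stop burning power sooner; whether that holds on a given tablet is exactly the kind of thing the battery-life test above would tell you.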