Solved

Difference between CPU usage and clock speed?

January 11, 2014 5:37:08 AM

I don't really understand, but in theory, would a 1 GHz processor at full load process roughly the same amount of information as a 4 GHz processor at 25% load?

Best solution

ewood
January 11, 2014 5:46:47 AM

In theory, yes.

If it were an identical architecture and the throughput of everything else were also quadrupled, with latencies staying the same (in numbers of clocks), then it would be pretty darn close.

If only the processor speed changed, the processor could still carry out four times the calculations, but it could end up waiting on data.
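
A rough back-of-the-envelope way to picture that, treating useful work as simply clock rate times utilization (a toy model that ignores memory stalls, turbo, and scheduling):

    # Toy model: useful work per second ~ clock rate x utilization.
    # Ignores memory latency, turbo, and scheduler behaviour.
    def effective_cycles_per_sec(clock_hz, utilization):
        return clock_hz * utilization

    slow_full = effective_cycles_per_sec(1e9, 1.00)     # 1 GHz at 100% load
    fast_quarter = effective_cycles_per_sec(4e9, 0.25)  # 4 GHz at 25% load
    print(slow_full, fast_quarter)  # both 1e9 "useful" cycles per second
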
Xanderrhs
January 11, 2014 6:04:16 AM

Alright, so it's sort of like this: if a quad-core 4 GHz processor were only using one core because the program told it to, it would be doing 4,000,000,000 cycles per second. But if I increased the clock speed to, say, 5 GHz at the same load, it would process 5,000,000,000 cycles per second, as opposed to the program using two cores, which would output 8,000,000,000 cycles per second? And if the clock speed at one-core usage were 5 GHz, would the CPU lower its usage from 25% to 20% to process the same amount of data as it would at 4 GHz with one core in use?
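
For the numbers in that question, a quick sketch under the same simplification (cycles per second counted as clock rate times busy cores, which real programs rarely achieve):

    # Hypothetical scenarios from the question above.
    clock_4ghz, clock_5ghz = 4e9, 5e9

    one_core_at_4ghz = clock_4ghz * 1   # 4,000,000,000 cycles per second
    one_core_at_5ghz = clock_5ghz * 1   # 5,000,000,000 cycles per second
    two_cores_at_4ghz = clock_4ghz * 2  # 8,000,000,000 cycles per second

    print(one_core_at_4ghz, one_core_at_5ghz, two_cores_at_4ghz)
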

Bonecrushrr
January 11, 2014 1:33:05 PM

In theory, again yes: 25% usage at 4 GHz should work out to 20% at 5 GHz... in theory.
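
In that same toy model the equivalent usage just scales with the clock ratio (a sketch, assuming perfectly linear scaling, which real workloads only approximate):

    # usage_new = usage_old * clock_old / clock_new, assuming linear scaling.
    def equivalent_usage(usage_old, clock_old_hz, clock_new_hz):
        return usage_old * clock_old_hz / clock_new_hz

    print(equivalent_usage(0.25, 4e9, 5e9))  # 0.2, i.e. 20% at 5 GHz
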
January 11, 2014 3:00:41 PM

Awesome, thanks.