
What is CPU clock rate in layman's terms?

January 19, 2014 10:27:41 PM

I used to think a CPU's clock rate was associated with the speed at which the processor can do "calculations/work".
Now, I know WEI is "garbage", but I wanted to test my rig's score anyway. My FX-6300 got a CPU score of 7.1 at its stock 3.5GHz. I then overclocked it to 4.1GHz (when I say 4.1GHz, does that mean all six cores run at 4.1GHz?), yet the score only increased by 0.2, to 7.3.
I have seen people with 3.xGHz processors get a 7.5+ subscore. How is that possible if it's all about speed?
Also, why do game makers list a higher clock rate requirement for AMD than for Intel? For example, Watch Dogs has a recommended processor of "Core i7 3770 @ 3.5Ghz or AMD FX-8350 @ 4.0Ghz". If the AMD processor were an FX-6300 @ 3.5GHz, shouldn't it be just as fast as a Core i7 3770 @ 3.5GHz?
Would appreciate help understanding clock rates.
January 20, 2014 8:44:28 AM

You should see a CPU's clock rate as the frequency at which it is working; people think of it as speed, but it is not.

CPUs with a better architecture can do more work at a lower frequency.

CPUs with the same architecture are, of course, faster at a higher frequency.
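A rough way to picture this: work done per second is roughly clock rate times instructions per cycle (IPC), and IPC is where "better technology" shows up. This sketch uses made-up IPC numbers purely for illustration, not measured values for any real CPU:

```python
# Illustrative sketch only: the IPC (instructions per cycle) values below
# are invented assumptions, not benchmarks of any real processor.

def effective_throughput(clock_ghz, ipc):
    """Very rough instructions-per-second estimate: clock rate x IPC."""
    return clock_ghz * 1e9 * ipc

# Hypothetical CPU B runs at a lower clock but does more work per cycle.
cpu_a = effective_throughput(4.0, 1.0)   # 4.0 GHz, IPC 1.0
cpu_b = effective_throughput(3.5, 1.5)   # 3.5 GHz, IPC 1.5

print(cpu_b > cpu_a)  # the lower-clocked chip still gets more done
```

This is why a 3.xGHz chip can out-score a 4.1GHz one: its higher IPC more than makes up for the lower frequency.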

Quote:
The clock rate of a CPU is most useful for providing comparisons between CPUs in the same family. The clock rate is only one of several factors that can influence performance when comparing processors in different families. For example, an IBM PC with an Intel 80486 CPU running at 50 MHz will be about twice as fast (internally only) as one with the same CPU and memory running at 25 MHz, while the same will not be true for MIPS R4000 running at the same clock rate as the two are different processors that implement different architectures and microarchitectures. There are many other factors to consider when comparing the performance of CPUs, like the clock rate and width of the CPU's data bus, the latency of the memory, and the cache architecture.
The clock rate alone is generally considered to be an inaccurate measure of performance when comparing different CPU families. Software benchmarks are more useful. Clock rates can sometimes be misleading since the amount of work different CPUs can do in one cycle varies. For example, superscalar processors can execute more than one instruction per cycle (on average), yet it is not uncommon for them to do "less" in a clock cycle. In addition, subscalar CPUs or use of parallelism can also affect the performance of the computer regardless of clock rate.
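The 80486 example in the quote can be sketched numerically: with the same chip (same IPC), doubling the clock roughly halves the run time, while across architectures the clock tells you nothing on its own. The IPC values and workload size here are illustrative assumptions:

```python
# Sketch of the quote's point. IPC values and the workload size are
# made up for illustration, not real measurements.

def run_time_s(instructions, clock_hz, ipc):
    """Time to finish a fixed workload at a given clock and IPC."""
    return instructions / (clock_hz * ipc)

work = 100_000_000  # instructions in some fixed workload

t_25mhz = run_time_s(work, 25e6, 1.0)   # same chip at 25 MHz
t_50mhz = run_time_s(work, 50e6, 1.0)   # same chip at 50 MHz
t_other = run_time_s(work, 50e6, 2.0)   # different architecture, higher IPC

print(t_25mhz / t_50mhz)  # ~2.0: twice as fast within the same family
```

The third line is the cross-family trap: two chips at the same 50 MHz finish the same workload in very different times, which is exactly why Watch Dogs can ask for 4.0GHz from one vendor and 3.5GHz from the other.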