I used to think a CPU's clock rate determined the speed at which the processor can do "calculations/work".
Now, I know WEI (the Windows Experience Index) is "garbage", but I wanted to test my rig's score. The CPU subscore was 7.1 with my FX-6300 at the stock 3.5 GHz, so I overclocked the CPU to 4.1 GHz. When I say 4.1 GHz, does that mean all six cores run at 4.1 GHz? (A quick way to check is sketched below.) Despite the clock rising by roughly 17% (4.1/3.5 ≈ 1.17), the score increased by just 0.2, to 7.3.
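To see whether the overclock actually applies to every core, here is a minimal sketch, assuming Python with the third-party psutil package installed. Note that on some platforms psutil reports a single aggregate frequency instead of one entry per core, and it may show the base clock rather than a boosted one:

    # Print the current clock of each core (assumes `pip install psutil`).
    # On some platforms this returns one aggregate entry rather than
    # one entry per core, and may report base rather than boost clocks.
    import psutil

    freqs = psutil.cpu_freq(percpu=True)
    for core, f in enumerate(freqs):
        print(f"core {core}: current={f.current:.0f} MHz, max={f.max:.0f} MHz")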
I have seen people with 3.x GHz processors get a 7.5+ subscore. How is that possible if the clock rate is what determines speed?
Also, why do game makers list a higher clock rate requirement for AMD than for Intel? For example, Watch Dogs recommends a "Core i7 3770 @ 3.5 GHz or AMD FX-8350 @ 4.0 GHz". If the AMD processor were an FX-6300 @ 3.5 GHz, shouldn't it be just as fast as a Core i7 3770 @ 3.5 GHz?
Would appreciate help understanding clock rates.