I have what might, to some, seem like a simple question; however, I can't seem to get a straight answer about it.
How do FSB and multiplier relate to overall speed? Let me explain...
When I run Flight Simulator (a CPU-intensive game, as opposed to GPU-intensive) with my system clocked at 3.4 GHz (9x378), the frame rate is slightly lower than when I run it at 3.2 GHz (8x400). Why is this? I would expect the first scenario to be faster.
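To make the numbers concrete, here is the arithmetic I'm working from: core clock is simply FSB times multiplier, while the FSB value itself also sets the speed of the CPU's link to memory and chipset. A quick sketch (the MHz figures are just my two settings above):

```python
# Core clock = FSB (MHz) x multiplier.
# Note the FSB also governs memory/chipset bandwidth, so the two
# settings differ in more than core clock alone.
settings = [
    {"label": "9x378", "multiplier": 9, "fsb_mhz": 378},
    {"label": "8x400", "multiplier": 8, "fsb_mhz": 400},
]

for s in settings:
    core_mhz = s["multiplier"] * s["fsb_mhz"]
    print(f'{s["label"]}: core = {core_mhz} MHz ({core_mhz / 1000:.2f} GHz), '
          f'FSB = {s["fsb_mhz"]} MHz')

# Output:
# 9x378: core = 3402 MHz (3.40 GHz), FSB = 378 MHz
# 8x400: core = 3200 MHz (3.20 GHz), FSB = 400 MHz
```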
Hence, I am wondering if someone could explain why this happens, and whether it is to be expected.
Thanks.