OK, so I'm having a discussion with my cousin, and we were wondering: if you take a computer that is (hypothetically) 8.0 GHz and compare it to a 4.0 GHz machine, and they each have the same configuration of memory and video card, how will they compare when playing a game such as F.E.A.R., Far Cry, or BF2? Is there a visible difference in the game's FPS, or would they perform exactly the same?
On a side note, how does the MHz or GHz of a system affect a computer's performance? Meaning, how is a 2.0 GHz machine different from a 4.0 GHz one? Is there a marked difference in speed? Can you feel the difference in normal day-to-day use (i.e., normal processes)?
Most games aren't that CPU-dependent. I would think the difference between a 4 GHz and an 8 GHz CPU in a game would not be very noticeable. However, in other workloads such as encoding, compressing, and rendering, there would be a pretty big difference.
Exactly. There would not be a huge difference, per se, in a game's FPS, because processor speed, as stated above, doesn't affect games in a major way. Dual-core processors with all the bells and whistles are often compared through various tests, and one of the most common is 3D benchmarking. This, among other video tests, is run to see how well the processor handles applications such as games. Go check Google or other search sites. Higher clock speeds mainly help your computer in general applications. AMD's new line of dual-core processors shows, in a nutshell, how clock speed and dual cores perform when multitasking across different applications. This can also be found on search sites. Overall, GHz means nothing major in games, unless you have a very weak processor running at a low speed with very little RAM in your system and video card; then high-end games such as F.E.A.R. will be very hard to play, more like a strobe-light room. If you want faster gameplay, look into a better graphics card and more RAM. Plenty of upgrade options are out there, good luck.
At lower resolutions, modern GPUs have enough power to crank out frames at maximum speed, so the limitation is the CPU, which has to keep up with the physics calculations. The turning point is around 1024x768. Beyond that, the textures and resolution get so large that the GPU is bogged down with work and becomes the limiting factor. Of course, enabling antialiasing and filtering techniques would skew things further against the GPU.
People would say that a 3200+ would bottleneck a 7800GTX because it's quite possible that, even at high resolutions, the sheer power of the GPU can render frames faster than the CPU can calculate physics. In addition, slower CPUs have to deal with background processes, while faster CPUs have more spare processing power to devote to the game.
But that all depends on which game you're playing, since the physics load of each game differs, correct? Also, even if the CPU couldn't keep up with the GPU, it would still be going pretty fast, so it really wouldn't matter that much. For example, if you were playing HL2 and your GPU could pump out 200 FPS but your CPU could only manage 150 FPS, it ultimately wouldn't matter in practice.
As everyone has already said, the CPU isn't that important anymore when it comes to gaming. At lower resolutions there will be a difference in FPS, but you won't see any difference between 200 FPS and 300 FPS, and you won't be playing at those low resolutions anyway. At higher resolutions the GPU becomes the bottleneck and CPU power no longer matters.
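The whole thread boils down to a simple rule of thumb: the frame rate is capped by whichever component is slower at that resolution. Here is a minimal Python sketch of that model; all the numbers are illustrative assumptions, not benchmark results.

```python
# Toy model of the CPU/GPU bottleneck discussed above.
# The fps figures below are made up for illustration only.

def effective_fps(cpu_fps, gpu_fps):
    """The slower of the two components caps the frame rate."""
    return min(cpu_fps, gpu_fps)

# Low resolution: the GPU renders frames faster than the CPU can
# feed it game logic and physics, so the CPU is the bottleneck.
low_res = effective_fps(cpu_fps=150, gpu_fps=300)    # -> 150

# High resolution: per-frame GPU work grows with pixel count, so the
# GPU becomes the bottleneck and a faster CPU would change nothing.
high_res = effective_fps(cpu_fps=150, gpu_fps=90)    # -> 90

print(low_res, high_res)
```

This is why doubling the CPU clock from 4 GHz to 8 GHz mostly raises the first argument: at high resolutions the `min()` is still decided by the GPU, so the extra clock speed never shows up on screen.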