So I'm trying to figure out how to build a really cheap system that will get me through CS:S (HL2) at low settings with high FPS, even if I have to switch the game to a lower DirectX version. I looked at the CPU benchmarks and noticed that the games I want to play are receiving high FPS ratings.
So for example, a 7750 X2 receives 101 FPS for Crysis at 1680x1050 (quality setting unspecified), so that would lead me to believe that, hey, I can get 101 FPS running this game off the CPU alone. I'm pretty sure that's wrong, but hey, who knows. If it is wrong, then the benchmark is completely useless. Anyone care to lend me a helping hand?
CPU power affects gaming performance, but not nearly as much as the GPU in most cases. The purpose of CPU gaming benchmarks is to show the performance difference in games between different CPUs using the same graphics card; Tom's should have listed somewhere which GPU they use for those tests.
One word: Practicality
Hence the relic-ness of those tests.
Since no one uses low-res for gaming anymore, and the common 1680x1050 or 1920x1200 with max eye candy produces minimal FPS difference once the CPU clock passes a certain speed on modern games, do you see the point now?
You want a raw CPU processing power comparison that's going to save you time? Go read multithreaded Cinebench or x264 encoding results for that. Again: practicality.
A proper CPU test for games would be something like testing games that, under typical/practical settings, use the CPU more than others, e.g. Supreme Commander.
There you see, a first-hand example of a reader who didn't want to engage their brain in thinking about what they needed to extract from the pile of info in front of them. I rest my case.
Incorrect. You cannot test any component without removing the others as a factor. Even in a game like Supreme Commander, the GPU still has an effect if the settings are raised, and the net effect is that the results come closer together than they actually are. To prove this: lower all settings as low as they can go, then bump all the CPU-independent settings up one notch. The FPS drop (usually at least 20%) is all GPU overhead, and it's the reason you can't compare CPUs for gaming without low-res testing; the result set will always show a smaller difference between CPUs if you don't reduce settings.
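To make the arithmetic concrete, here's a quick sketch of that test in Python. The FPS numbers are hypothetical, made up purely for illustration, not measurements from any real run:

```python
# Hypothetical FPS numbers for illustration (not real measurements):
fps_all_min = 150.0   # every setting at absolute minimum: CPU-limited FPS
fps_gpu_up = 120.0    # same run with GPU-only settings raised one notch

# The drop between the two runs is GPU overhead leaking into a "CPU" test.
gpu_overhead = (fps_all_min - fps_gpu_up) / fps_all_min
print(f"GPU overhead: {gpu_overhead:.0%}")  # 20% in this made-up case
```

If that drop were zero, the GPU would not be a factor and the benchmark really would be measuring the CPU alone; any sizable drop means the GPU is compressing the gaps between CPUs.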
It's the same reason why after a certain speed you get no increase; that's the GPU not being able to render enough frames.
FPS has two parts: how quickly the CPU can send the GPU the information needed to draw a frame, and how many frames the GPU can draw, per second.
If the CPU is too slow, the GPU sits partly idle and just draws whatever frames the CPU sends it. Increasing CPU speed will cause a jump in FPS.
If the GPU is too slow, the CPU will send more frames than the GPU is capable of drawing. Increasing CPU speed will cause no jump in FPS.
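The two cases above boil down to a simple bottleneck model. Here's a toy sketch (the function name and the numbers are made up for illustration; real pipelines overlap work, but the bottleneck logic holds):

```python
def effective_fps(cpu_fps, gpu_fps):
    """The frame rate you actually see is the slower of the two stages:
    how fast the CPU can prepare frames vs. how fast the GPU can draw them.
    (A simplification, but it captures the bottleneck behavior.)"""
    return min(cpu_fps, gpu_fps)

# GPU-limited: upgrading the CPU changes nothing.
print(effective_fps(200, 60), effective_fps(300, 60))  # both stuck at 60

# CPU-limited: a faster CPU raises FPS directly.
print(effective_fps(40, 120), effective_fps(80, 120))  # 40 vs. 80
```

This is why a CPU benchmark only tells you something when `gpu_fps` is pushed far above `cpu_fps`, i.e. when settings are at minimum.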
As such, there are two main components in getting FPS, and testing one while the other is still a factor will lead to incorrect results. Lowering all settings to minimum removes most of the GPU's effect (in some games, such as Crysis, the GPU can still be a factor even at lowest settings). The FPS you get during testing will then be how many frames the CPU is capable of producing per second, before the GPU is taken into account. You can then increase settings and see how much the GPU is affecting performance.
*Remember to mark which settings involve the GPU, the CPU, and both. Avoid changing settings that affect both components, as that can also skew testing.