This is sort of a technical question, but I think if I'm asking it others are probably curious as well.
Which elements of games are more GPU dependent and which are more CPU dependent? Obviously effects like lens flare and glow are more GPU dependent, but what about increasing texture detail? Increasing polygon counts? Increasing maximum draw distance?
It is also my impression that a game with a back end that tracks a lot of state, or renders a lot of independently moving objects, is an example of something CPU dependent. What are some other in-game variables that are CPU dependent?
Or is it more of a question of how efficiently the programmers have coded the game, and therefore some games are faster and some are just hogs?
The question behind this question is when and what to upgrade. (BTW, this question is also purely out of nerdy curiosity and reflects my observations of many, many games over the years, not any pressing need to upgrade.)
He that but looketh on a plate of ham and eggs to lust after it, hath already committed breakfast with it in his heart. -C.S. Lewis
Some games are more CPU dependent than others (Unreal engines usually use more CPU power than, say, Quake engines), but generally the GPU is king of the processing where games are concerned. The CPU handles calculations such as physics, in-game AI, etc., and as such does affect game performance, but not in the same way the GPU does.
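To make that split concrete, here is a toy sketch of one frame of a game loop (every function and field name here is made up for illustration, not from any real engine): the AI and physics steps are pure CPU work, while the draw-call step is what gets handed off to the GPU.

```python
def update_ai(entities):
    # CPU work: each entity "decides" its next move (toy logic)
    for e in entities:
        e["x"] += e["vx"]

def simulate_physics(entities, dt):
    # CPU work: integrate simple gravity on every entity
    for e in entities:
        e["vy"] -= 9.8 * dt
        e["y"] += e["vy"] * dt

def submit_draw_calls(entities):
    # In a real engine this hands rendering work to the GPU;
    # here we just count the draw calls the CPU would issue.
    return len(entities)

def run_frame(entities, dt=1.0 / 60):
    update_ai(entities)
    simulate_physics(entities, dt)
    return submit_draw_calls(entities)
```

The point of the sketch: the more entities you track, the more the AI and physics loops cost per frame, and that cost lands on the CPU no matter how fast the video card is.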
Texture detail is dependent on memory (larger textures = more memory required), and if the engine makes full use of the AGP port to utilise system RAM for textures as well as the memory on the video card, then the CPU and mobo have much more work to do.
To give you an idea, I would say that most of today's games would work just as well on a top-end Pentium III with the best GPU you could buy as they would on a P4 with the same card. Before I get flamed, I do accept that the frame rates would be higher on the P4, but not more playable. After all, 300 fps in Quake 3 is meaningless - once you have got past 60 fps you cannot get any smoother for gameplay than that.
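The 60 fps point is easy to see with a bit of arithmetic: frame rate is just the inverse of the time budget the CPU and GPU share each frame, so the gap between 60 and 300 fps is only a few milliseconds. A quick sketch:

```python
def frame_time_ms(fps):
    """Milliseconds the whole CPU+GPU pipeline gets per frame."""
    return 1000.0 / fps

for fps in (30, 60, 300):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

At 60 fps each frame gets about 16.7 ms; at 300 fps it is about 3.3 ms, a difference far below what a 60 Hz monitor can even display.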
In terms of your system performance, you would notice a bigger difference in Windows / Desktop mode by upgrading the CPU, because that normally goes hand in hand with a new mobo and faster memory - i.e. the whole subsystem is faster at every level.
For gaming, though, the GPU will give you the best gains from an upgrade. There are some situations where your CPU is so old that it limits the game no matter what video card you have, but this is unlikely, as most newer games have a higher minimum CPU requirement anyway.
For overall system performance, an average CPU and video card are a better bet than a great CPU and a poor GPU, or vice versa. So for example, an FX5600 coupled with a P4 2.4 would be better than a 600 MHz Celeron coupled with a Radeon 9800XT.
In the end it all comes down to what you will mostly use your PC for.
4.77MHz to 4.0GHz in 10 years. Imagine the space year 2020
AA, AF, hard-coded DirectX effects, shader effects, and texture sizes that can be held in video card memory are GPU dependent.
Artificial intelligence is always CPU dependent. Texture size also becomes CPU/memory/hard disk dependent if it can't all fit into video card memory. The size of a game level and the memory it takes are also CPU/memory/hard disk dependent.
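As a rough illustration of why texture size eats video memory so quickly, here is some back-of-the-envelope arithmetic for an uncompressed RGBA texture (the helper name and the one-third mipmap rule of thumb are my own shorthand, not from any particular engine):

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate memory footprint of one uncompressed RGBA texture."""
    base = width * height * bytes_per_pixel
    # A full mipmap chain adds roughly one third on top of the base level
    return base * 4 // 3 if mipmaps else base

MIB = 1024 * 1024
# A single 1024x1024 RGBA texture is 4 MiB before mipmaps,
# so a 128 MiB card fills up after only a few dozen of them.
print(texture_bytes(1024, 1024, mipmaps=False) // MIB, "MiB base level")
```

Doubling the resolution of a texture quadruples its footprint, which is why high texture settings spill over into system RAM (and the CPU/memory subsystem) so fast.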
It also varies a lot depending on the game in question and how it is coded, but these generalizations usually hold true.
All I know is that when I upgraded my CPU from a 1.5 GHz Willamette with PC133 SDRAM to a 2.66 GHz Northwood with dual-channel memory, while keeping the same Radeon 9600, it made all my games much more playable. BF1942 went from around 30 fps to 100 fps. I don't know, but I still think having the best GPU with an obsolete CPU is a waste.