As the above user says, things like frame limiters and vsync can cause low GPU usage. With vsync, a system that cannot hold 60 fps may drop to 30 fps to keep frames in sync with the screen's refresh. Triple buffering avoids that halving, but at the cost of some input delay.
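To show why vsync snaps the frame rate to 30 instead of, say, 50, here is a minimal sketch, assuming classic double-buffered vsync on a 60 Hz display: a frame that misses a refresh deadline must wait for the next vblank, so the effective rate is quantized to 60/1, 60/2, 60/3, and so on. The function name and numbers are my own illustration, not from any benchmark.

```python
import math

def vsync_fps(render_ms, refresh_hz=60.0):
    """Effective fps when every finished frame must wait for a vblank
    (double-buffered vsync): the frame occupies a whole number of
    refresh intervals, so fps = refresh_hz / that interval count."""
    interval_ms = 1000.0 / refresh_hz              # ~16.67 ms at 60 Hz
    intervals = max(1, math.ceil(render_ms / interval_ms))
    return refresh_hz / intervals

# A GPU that renders a frame in 20 ms (50 fps uncapped) gets locked to 30:
print(vsync_fps(20))   # 30.0
# One that renders in 15 ms can hold the full 60:
print(vsync_fps(15))   # 60.0
```

This is why triple buffering helps: with a third buffer the GPU can start the next frame instead of idling until the vblank, trading that stall for a bit of extra latency.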
Here is the problem. Not all games are made the same.
For some reason people around here like to tell users they can run ANYTHING at ultra with XXX CPU, but the problem is that some games simply do not use the CPU to its potential, or push it harder than it can handle. MMO-type games in particular have this issue, and I do not think that will change any time soon.
Getting a new CPU seems very drastic, and most stock-clocked units will not be much better.
Very few games perform noticeably better on an i7, because Hyper-Threading is (a) rarely used by games and (b) not a real extra core, just a second thread sharing one core's resources.
This is one reason so many users like to overclock: pushing a CPU 1 GHz faster can make a noticeable difference in a CPU-bound game.
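As a rough back-of-the-envelope sketch of why that 1 GHz matters, assume a fully CPU-bound game whose frame time scales inversely with clock speed (real scaling is usually somewhat less because of memory and cache bottlenecks; the function and numbers here are my own illustration):

```python
def overclocked_fps(base_fps, base_ghz, oc_ghz):
    """Best-case fps after an overclock, assuming the whole frame time
    is CPU work that scales 1:1 with clock speed."""
    frame_ms = 1000.0 / base_fps             # current frame time
    new_ms = frame_ms * (base_ghz / oc_ghz)  # faster clock -> shorter frame
    return 1000.0 / new_ms

# e.g. a game stuck at 40 fps on a 3.0 GHz chip, pushed to 4.0 GHz:
print(round(overclocked_fps(40, 3.0, 4.0), 1))   # ~53.3 fps, best case
```

In a GPU-bound game the same overclock would do almost nothing, which is exactly the distinction the benchmarks below show.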
EDIT
If you look here, you will see that some games (those that are not CPU bound) will not get better performance out of a new CPU at all, so you had better make sure this is what you want/need. You will also notice that i5 vs i7 gives you nothing in games (though some games may still take advantage of it).
Only certain games are that CPU hungry. Look at Starcraft II at 1920 x XXXX: all these cards perform nearly the same, so the test is limited by the CPU used (an i7 920 @ 3.8 GHz). I do not think these tests were done with a huge battle, or the fps would be lower.