Not really. It depends on the game and what resolution you run at, since some games use more CPU power than others.
In general, I consider 2.0 GHz (or 2000+) the baseline for solid gaming. If your CPU is at least that good, you should see a framerate increase with each tier of card, assuming you're playing at LEAST 1024x768.
At lower resolutions than that, the CPU becomes the bottleneck much sooner. At higher resolutions, the videocard tends to be the limiting factor.
On a side note, a CPU better than 2.0 GHz (or 2000+) will also show you framerate increases. An Athlon64 3000+ will probably get 25 fps more in a demanding game than a Pentium 4 2.0 GHz, assuming the videocard isn't bottlenecking...
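For what it's worth, you can think of the whole thing as a min() function: your framerate is whichever limit you hit first, the CPU's or the videocard's. Here's a rough sketch of that idea in Python; all the numbers are made up purely for illustration, not benchmark results:

```python
# Toy bottleneck model: the framerate you actually see is capped by
# whichever component is slower at its share of the frame.
# Every number below is a made-up illustration, NOT real benchmark data.

def effective_fps(cpu_fps, gpu_fps_at_res):
    """The slower of the two limits determines the final framerate."""
    return min(cpu_fps, gpu_fps_at_res)

# Hypothetical GPU ceilings: higher resolution = more pixels to push,
# so the videocard's limit drops as resolution goes up.
gpu_limits = {"800x600": 160, "1024x768": 110, "1280x1024": 70}

# Two hypothetical CPU ceilings (how fast each CPU can feed the card).
for cpu_name, cpu_fps in [("2.0 GHz class", 75), ("Athlon64 3000+ class", 100)]:
    for res, gpu_fps in gpu_limits.items():
        print(f"{cpu_name} @ {res}: {effective_fps(cpu_fps, gpu_fps)} fps")
```

Notice how at 800x600 the faster CPU gains fps (both are CPU-limited), but at 1280x1024 they both land on the same number because the videocard has become the bottleneck.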
________________
<b>Radeon <font color=red>9700 PRO</font></b> <i>(o/c 332/345)</i>
<b>AthlonXP <font color=red>3200+</font></b> <i>(Barton 2500+ o/c 400 FSB)</i>
<b>3dMark03: <font color=red>5,354</font></b>