You will struggle to see improvement beyond (for most people) 30fps, although some can detect higher. For Nvidia 3D, though, you need a bare minimum of 60fps, and ideally 120fps, or it will feel nasty and cause headaches.
I think most of the desire for high fps is about pushing the average up. Most 3d applications don't sit at a single frame rate while running; changing camera views, etc., changes the frame rate. A more powerful GPU gives you more breathing room while running the program.
Another reason to aim for 60 fps is vertical sync (vsync), where the frame rate is capped at the monitor's refresh rate (if the monitor has a different refresh rate, the cap changes accordingly). This reduces tearing, lines, and other glitches.
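One wrinkle with vsync worth knowing: a frame that misses the refresh deadline has to wait for the next refresh, so effective FPS drops in whole divisors of the monitor's refresh rate rather than gradually. A rough sketch of the arithmetic (my own illustration, not something from the posts above, and it assumes no triple buffering):

```python
def vsync_fps(refresh_hz, render_ms):
    """Effective frame rate with vsync on, when every frame
    takes render_ms milliseconds to draw (double buffering)."""
    interval_ms = 1000.0 / refresh_hz          # time between refreshes
    # number of refresh intervals each frame occupies (ceiling division)
    intervals = -(-render_ms // interval_ms)
    return refresh_hz / intervals

# A frame that fits inside one 16.7 ms interval holds 60fps...
print(vsync_fps(60, 15))
# ...but just barely missing it halves the rate to 30fps:
print(vsync_fps(60, 17))
```

This is why a game can feel like it "snaps" from 60 to 30 with vsync enabled instead of sliding smoothly down through the 40s.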
Mi1ez is right as well. Some people swear that anything below 60fps is "unplayable".
Plus there are those of us who are benching junkies and like to see how high we can get our fps.
While technically the human eye won't notice whether you show something at 30FPS or 300FPS if the images are done properly, this doesn't quite apply to games because of how the images are produced. In movies, anything moving across the screen is blurred between frames; this helps convince your eyes that it is moving, and it looks a bit more realistic. In games, however, items do not blur from one location to another: each has a specific position in every frame, and moving objects are generally rendered without blur. Higher FPS helps smooth this out, since your eyes are blurring the frames together anyway; each object jumps less between frames, so motion looks smoother and more realistic. This is also why some games come with an option to enable motion blur.
Most people are only affected once it drops below about 40FPS, which is why Tom's uses that as their "playable" threshold.