Hi knobcreekman,
Depends on your genes mate... no seriously... and on what TYPE of monitor you use. Most people perceive somewhere around 45-70 fps with their eyes. Newer LCD screens (not HD TVs) usually run at 60 or 59 Hz, while more advanced and professional screens can reach 75 Hz and higher.
If a friend still has an old CRT tube TV, look slightly above the TV and you'll notice a flicker effect in your peripheral vision. Tiny cells toward the edges of your retina are very sensitive to motion, so as to draw your attention to an oncoming object, for example ^^ (thanks, evolution).
But if you look straight at the TV, the flicker seems to disappear. The TV rapidly draws the image starting at the top-left pixel, pixel by pixel, left to right, then the row below, left to right again, until it has covered the whole screen (roughly 320 pixels across by 240 down on an old set), around 60 times a second, onto the phosphor coating, creating the illusion of continuous motion. (For the generations that missed the cathode ray tube era.)
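To get a feel for how fast that beam is moving, here's a rough sketch of the scan order and the pixel rate, using the 320x240 / 60 Hz numbers from above (these are illustrative figures; real CRT TVs used interlaced scanning, which I'm skipping here for simplicity):

```python
# Raster scan: the beam visits pixels row by row, left to right,
# top to bottom, then jumps back up and repeats ~60 times a second.
WIDTH, HEIGHT, REFRESH_HZ = 320, 240, 60

pixels_per_frame = WIDTH * HEIGHT               # 76,800 pixels per pass
pixels_per_second = pixels_per_frame * REFRESH_HZ  # 4,608,000 pixels/sec

def scan_order(width, height):
    """Yield (row, col) in the order the electron beam draws them."""
    for row in range(height):
        for col in range(width):
            yield (row, col)

print(pixels_per_frame)   # 76800
print(pixels_per_second)  # 4608000
```

So the beam is lighting up over four and a half million phosphor dots every second, which is why your slow central vision fuses it into one steady picture.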
You don't see this on LCDs, as they normally have around a 5 ms response time, meaning a pixel takes about 5 ms to switch from one shade to another. A 2 ms LCD is better and less straining on your eyes. I remember when the very first bulky LCDs came out with 16 ms response times; they were awful for fast-paced games, as the image literally left a smear trail, almost like a motion-blur effect.
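The quick arithmetic behind that smear: at 60 Hz a new frame arrives every ~16.7 ms, so a 16 ms panel is still mid-transition when the next frame shows up, while a 5 ms or 2 ms panel finishes with time to spare (the "still transitioning" cutoff below is my own rough rule of thumb, not a spec):

```python
# Frame time vs. pixel response time at a 60 Hz refresh rate.
refresh_hz = 60
frame_time_ms = 1000 / refresh_hz  # ~16.67 ms between frames

for response_ms in (16, 5, 2):
    # Rough assumption: smearing is visible when the pixel transition
    # eats up nearly the entire frame interval.
    smears = response_ms >= frame_time_ms * 0.9
    verdict = "smear likely" if smears else "transition completes in time"
    print(f"{response_ms:>2} ms panel vs {frame_time_ms:.1f} ms frame: {verdict}")
```

Which matches what those early 16 ms panels looked like in practice: every fast-moving object dragged a ghost of the previous frame behind it.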
On the other hand, if you are aiming for prolonged gaming on a nice big 40" or larger HDTV, get a 200 Hz model or better, like the Bravias from Sony. There are even 600 Hz sets available, because the flicker effect is more noticeable on larger screens.
It's not such an easy question to answer.
But now you have a broader perspective on the factors, at least a few of them anyway ^^