I'm on lunch, so I will try to educate you one last time on reality versus perception. I am not trying to redefine the word "seeing"; I used the words "capable of seeing." You seem to be the one who does not know the definition.
Let me show you something from earlier in this thread where you proved my point. I have changed a few words, but they still work.
"Your eye (monitor) refreshes itself at 24 (60) Hz (24 (60) times per second), which is totally independent of what framerate the monitor (GPU) is feeding it. Your eye (monitor) refreshes 24 (60) times per second rather your Monitor (GPUs)"
You proved yourself wrong right there. You already knew the difference between what is actually drawn and what the end user perceives; you just forgot to put the eyes into the equation.
The problem is that we cannot control the "refresh" rate of our eyes the way a computer or monitor can. So we must compensate by having the displaying data source run many times faster than what we can actually perceive.
Watch this video of a 10,000 FPS camera recording an LCD.
http://www.youtube.com/watch?v=lRidfW_l4vs
If your eyes capture data faster than what is being displayed, then eventually what your eyes perceive will fall out of sync with what you are looking at, just like in the video above.
Now, if your eyes were capable of seeing faster than 24 FPS, then every movie you have ever gone to would look something like that video (just not as drastically). So once again, I assure you that the human eye cannot detect more than 24 FPS.
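The out-of-sync drift I'm describing is easy to sketch numerically. This is just an illustration of the idea, not a claim about actual eye or camera rates: if a sampler runs faster than a display refreshes, each successive sample lands at a different point in the refresh cycle, so the two slowly drift apart.

```python
# Sketch of the "out of sync" drift: a sampler (camera/eye) running
# at a different rate than a display lands at a different point of
# the refresh cycle on every sample. Both rates are illustrative.

display_hz = 60.0    # assumed display refresh rate
sample_hz = 100.0    # assumed sampler rate, faster than the display

refresh_period = 1.0 / display_hz
phases = []
for n in range(6):
    t = n / sample_hz                              # time of the n-th sample
    phase = (t % refresh_period) / refresh_period  # fraction into a refresh
    phases.append(phase)
    print(f"sample {n}: {phase:.1f} of the way through a refresh")
```

The printed phases walk through the refresh cycle (0.0, 0.6, 0.2, 0.8, ...) instead of staying put, which is exactly the partial-refresh tearing the high-speed camera captures in the video.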
I hope this makes sense, but if not, that's OK. I understand that some people "must be right because they said so." Fortunately for the rest of the world, these people usually get their asses kicked enough in real life that they don't exhibit this behaviour in public; unfortunately for the internet, no one is there to slap some sense into you.