Why is it that games running at 120fps or higher look smoother and faster than games running at 60fps, even though I have a 60Hz LCD that can only display 60fps? I have read a lot of posts where people say that anything over 60fps is useless, but I fully disagree with that. I play a lot of iRacing, and when I cap it at 60fps I can see a noticeable difference in smoothness, which in turn makes the racing feel slower. And it is not a matter of fluctuating frame rates: it stays at exactly 60fps, or I can set it to 120fps and it stays there consistently, with no fluctuation whatsoever.

If I set no limit it runs at about 250-300fps, but I see no difference between 120fps and letting it max out. I read that you should cap at a multiple of 60 if possible for better quality, so I leave it at 120fps, since I can run all the tracks at 120 or higher without it ever dipping under. Capping at 120fps also means my SLI setup pulls 50W less power and makes much less fan noise than letting it work as hard as it can, and I see no visual benefit above 120fps.

I just want to know why it does look better. Does it have something to do with the video card itself doing some kind of smoothing or averaging before it sends the frame to the monitor?
I forget the exact number off the top of my head, but the human visual pathways can only process information at something like 72fps. That number may not be quite right, but the specific value isn't really important. Even though the pathways become saturated at that point, we can still perceive a difference between, say, 72fps and 150fps; it just won't be a roughly two-fold increase in perceived quality.
Probably the best way to think about this is that the higher the FPS, the more finely the animation is broken down. The change between any two consecutive frames is smaller, so at any given moment you are looking at the result of that more finely grained animation.
So if you put these two things together, I would say that 60fps on a 60Hz display marks the point of diminishing returns, to borrow an economic term. Every frame per second beyond 60 nets you a return that keeps shrinking. Going from 50fps to 60fps might give a rather large benefit in overall quality (say 25%, for the sake of this discussion), but going from 60fps to 70fps might only net you 15%, going from 70fps to 80fps only 6%, and so on until the gain is effectively zero.
Thank you for all the responses.
I understand there is a threshold to what we can perceive. My point is: why does it look better above 60fps when my 60Hz monitor can only display 60fps? Logically it makes no sense, unless something more is going on before the frames are sent to the monitor when the game runs at a higher fps. I was hoping someone could explain the process that is going on in the hardware.
There's more to it than just 60fps.
Look at almost all reviews: they give max, average, and minimum fps, which is where the speed-ups and slowdowns show, as well as delays caused by various things, such as the game engine, a driver, or the hardware itself, even if that hardware is very good.
Look at Tech Report's GPU reviews, as they go through this very well.
Smoothness isn't changed at all, but you may notice a latency difference, which you perceive as the game feeling more responsive.
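To make that latency point concrete, here is a toy model of my own (a simplification that ignores driver buffering and scanout time, not anything a real driver or iRacing does): with v-sync off, a 60Hz monitor refreshes 60 times a second and shows whatever frame finished rendering most recently. Because the render loop and the monitor clock drift relative to each other, each refresh lands at a uniformly random point inside a frame interval, so the frame on screen is, on average, half a frame time old.

```python
import random

def avg_frame_age_ms(render_fps, samples=200_000, seed=42):
    """Average age (ms) of the newest finished frame at refresh time,
    assuming v-sync off and a refresh phase uncorrelated with rendering."""
    rng = random.Random(seed)
    frame_dt_ms = 1000.0 / render_fps  # time between finished frames
    # Each refresh samples a random point within the current frame interval.
    total = sum(rng.uniform(0.0, frame_dt_ms) for _ in range(samples))
    return total / samples

for fps in (60, 120, 300):
    print(f"{fps:3d} fps -> frame on screen is ~{avg_frame_age_ms(fps):.1f} ms old")
```

So even though the monitor still only shows 60 frames per second, rendering at 120fps roughly halves how stale each displayed frame is (about 4ms old on average instead of about 8ms), which you feel as responsiveness rather than see as extra smoothness.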
However, you also have to turn off v-sync to gain that responsiveness, which means you'll have to live with tearing.
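Why tearing follows from that is easy to see with another toy model of my own (assuming scanout is linear top-to-bottom across the refresh interval and buffer swaps are instantaneous, which real hardware only approximates): with v-sync off, every buffer swap that happens mid-scanout leaves a tear at whatever scanline the monitor was drawing at that instant, so the faster you render, the more tears per refresh.

```python
def tear_lines(render_fps, refresh_hz=60, lines=1080):
    """Scanlines where tears appear during one refresh, given an
    arbitrary phase of the first swap (half a frame in, here)."""
    refresh_dt = 1.0 / refresh_hz  # seconds per full top-to-bottom scanout
    frame_dt = 1.0 / render_fps    # seconds between buffer swaps
    tears = []
    t = frame_dt * 0.5             # assumed phase of the first swap
    while t < refresh_dt:
        # The monitor has scanned fraction t/refresh_dt of the screen.
        tears.append(round(lines * t / refresh_dt))
        t += frame_dt
    return tears

print(tear_lines(300))  # five swaps land inside one refresh: five tears
print(tear_lines(60))   # one swap per refresh: at most one tear
```

At 300fps this phase gives tears at scanlines 108, 324, 540, 756, and 972 within a single 60Hz refresh, while at 60fps there is at most one; this is the visible cost of the latency win above 60fps.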
Watch in-game cut-scenes/movies, which usually run at 30 or 60 FPS: they look a lot smoother than the same FPS does when you control the action. That is the latency, or responsiveness, giving you that feel.