Why is it that games running at 120 fps or higher look smoother and faster than games running at 60 fps, even though I have a 60 Hz LCD that can only display 60 fps? I have read a lot of posts where people say that anything over 60 fps is useless, but I completely disagree. I play a lot of iRacing, and when I cap it at 60 fps I can see a noticeable difference in the smoothness of the game, which in turn makes the racing feel slower. And it is not a matter of fluctuating frame rates: it stays at exactly 60 fps, or I can set it to 120 fps and it stays there consistently, no fluctuation whatsoever.

If I set no limit, it runs at about 250-300 fps, but I see no difference between 120 fps and letting it max out, and I have read that you should set the cap to a multiple of 60 if possible for better quality. That is why I leave it at 120 fps, since I can run all the tracks at 120 or higher without it ever dipping below that. Capping it also means my SLI setup draws about 50 W less power and makes much less fan noise than letting it work as hard as it can, and I see no visual benefit above 120 fps.

I just want to know why it looks better. Does it have something to do with the video card itself doing some kind of smoothing or averaging before it sends frames to the monitor?
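To put rough numbers on the one explanation I could think of myself, here is a back-of-the-envelope sketch (purely my own simplification, not how the driver or iRacing actually works): if the monitor refreshes at 60 Hz and shows the most recently completed frame each time, then the higher the render rate, the "fresher" that frame is on average. The `average_frame_age_ms` function and the uncorrelated-timing assumption are mine, just for illustration.

```python
import random

REFRESH_HZ = 60  # display refresh rate of my 60 Hz LCD

def average_frame_age_ms(render_fps, trials=20000):
    """Average age of the newest completed frame at the instant the
    display refreshes, assuming frame completion times are uncorrelated
    with the refresh clock (a simplifying assumption on my part)."""
    frame_interval = 1.0 / render_fps
    # The refresh instant lands uniformly at random inside a frame
    # interval, so the newest frame's age is uniform on [0, interval).
    total = sum(random.random() * frame_interval for _ in range(trials))
    return 1000.0 * total / trials

random.seed(0)
for fps in (60, 120, 300):
    age = average_frame_age_ms(fps)
    print(f"{fps:3d} fps -> newest frame is ~{age:.2f} ms old at refresh")
```

Under that assumption the average age works out to half a frame interval, so roughly 8.3 ms at 60 fps, 4.2 ms at 120 fps, and 1.7 ms at 300 fps. If something like that is really going on, it would explain why the game feels more responsive at 120 fps even though the screen only shows 60 images per second, but I would like someone who knows the actual pipeline to confirm or correct it.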