romulus1:
First of all, I'm not talking about overclockers or people who want to get hundreds of frames per second in the latest game. For me, I just want the game to be playable, because I don't have the money to compete with overclockers or buy really expensive hardware. I would love to be able to afford a high-end system and show off to my friends, but alas, I can't.
So anyway, I have noticed that in benchmarks they suggest a game should run at a minimum of 30fps (it seems like 40fps now) to be considered playable. Why?
I live in the US and for most of my life I have been fine with TV at around 30fps. I have heard that most people can't tell the difference for anything faster than 17fps or something like that.
So why do games require so much more to be considered "playable"? I have played many games at around 25 frames per second and they seemed perfectly fine to me. I could blame my lousy play on choppiness (at around 25fps), but I would be lying my ass off.
I know some people have super vision and can probably tell the difference between 17fps, 25, 30 and so on, but that's probably a very small percentage of the population. So why all the fuss about not getting 30 or 40fps?
Whilst not many people are sensitive enough to put a definite label on 25fps versus 30fps, there is a big and easily perceived difference in the smoothness of animation between a game running at 30fps and a game running at 60fps. It's easiest to spot in running characters, where you watch limbs moving: at 30fps you are more likely to see limbs "jumping" from one point to another rather than moving smoothly through the space between.
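If you want to put a rough number on that "jumping", here's a minimal Python sketch. The screen width and object speed are made-up values purely for illustration; the point is how the per-frame jump shrinks as the frame rate climbs.

```python
# How far does a moving object jump between consecutive frames?
# Assumption: a hypothetical object crossing a 1920-pixel-wide screen
# in exactly one second (illustrative numbers, not measurements).
SPEED_PX_PER_S = 1920.0

for fps in (17, 25, 30, 60, 120):
    jump_px = SPEED_PX_PER_S / fps  # positional jump per displayed frame
    print(f"{fps:>3} fps -> {jump_px:5.1f} px jump between frames")
```

At 30fps that object leaps 64 pixels between frames; at 60fps only 32. That halving of the jump size is the smoothness difference you can see in moving limbs.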
The reason low frame rates work for movies is the way they are shot, as a series of "long exposure" photos: each frame of a movie is blurred by the motion over a period of time, and that blur is what tricks your eyes into seeing smooth motion when in fact the frame rate isn't high enough to show it smoothly. Video game frames, however, are hard-edged, with no motion blur covering the gap from one frame to the next; each one is a new, solid picture. Human eyes are VERY sensitive to this sort of thing. Whilst a game is certainly playable at 30fps (providing the minimum fps isn't too low), the smoothness of animation at 30fps is poor.
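To illustrate the film trick, here's a small Python sketch using the same made-up object speed as above. The 0.5 shutter fraction approximates a classic 180-degree film shutter; treating a game frame as a zero-length exposure is a simplification, since some games add synthetic motion blur.

```python
# Film frames integrate motion while the shutter is open, leaving a smear
# inside each frame; a plain game frame samples a single instant instead.
SPEED_PX_PER_S = 1920.0  # same illustrative speed as the sketch above

def smear_width_px(fps: float, shutter_fraction: float) -> float:
    """Width of the motion smear captured within a single frame.

    shutter_fraction is the portion of the frame interval the shutter is
    open: 0.5 approximates a 180-degree film shutter, 0.0 an instant sample.
    """
    return SPEED_PX_PER_S * shutter_fraction / fps

print("film, 24 fps, 180-deg shutter:", smear_width_px(24, 0.5), "px smear")
print("game, 30 fps, instant sample: ", smear_width_px(30, 0.0), "px smear")
```

The 40-pixel smear inside each film frame visually bridges the gap to the next frame; the game's hard-edged frames have no such bridge, which is why a similar frame rate reads as choppier.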
The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment, a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. That's the equivalent of 220fps the human eye was able to see! Our vision is NOT limited to 30fps; we don't see in frames at all.
There is an easily noticed difference in smoothness of animation in video games between 30 and 60fps. Many people may not care about this difference, but it's very easy to see even with average vision. If your frame rate were a solid 30 with no deviation lower or higher, your gameplay would be fine, but many people would find the animation quality annoying. It's not so much a gameplay issue as an image quality issue. Minimum frame rates are the gameplay issue, and in most cases a high average frame rate reduces the chance of bad minimum frame rates.
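Here's why the average can mislead you, as a minimal Python sketch with made-up frame times: a handful of long frames barely dents the average but produces exactly the stutter you feel.

```python
# Average fps can hide the stutter that actually ruins gameplay.
# Hypothetical frame times in milliseconds: mostly smooth 16.7 ms frames
# (60 fps pace) interrupted by two nasty 100 ms hitches.
frame_times_ms = [16.7] * 50 + [100.0] * 2 + [16.7] * 8

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds
minimum_fps = 1000.0 / max(frame_times_ms)

print(f"average fps: {average_fps:.1f}")  # looks fine on a benchmark chart
print(f"minimum fps: {minimum_fps:.1f}")  # the hitches you actually feel
```

The average works out to roughly 51fps, which sounds perfectly playable, while the worst frames momentarily drop the game to 10fps.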
This question has been asked so many times it's unreal. However, TVs use prerecorded images and show them in a way that, although quite a low fps, causes the eye to blend them together to form fluid motion.
Games, however, are not prerendered, and so require a minimum of around 30fps, depending on the speed of the game's action, to maintain what appears as fluidity.
A big problem, and one which a lot of people on this site are guilty of, is claiming you need such-and-such a processor to get decent fps or avoid "bottlenecking". It is complete nonsense of course, as monitors are for the most part limited to 60fps by their electronics, and usually to less than that by the response times of the actual panels. Even with today's advertised fast response times, they are usually still not capable of displaying that many fps, unless anyone has evidence otherwise.
True, modern LCD displays are often limited to 60Hz and at times by response times. Generally the response time issue is reduced by one frame being similar to the preceding one, meaning pixels don't need to change as drastically as they would in a complete redraw. Note that the quoted 2ms grey-to-grey response time of a modern gaming LCD works out to 500 transitions per second on paper, but grey-to-grey is the best case; transitions are considerably slower when every pixel on the screen is redrawn with a completely contrasting new colour EVERY frame, and that worst case is where a ceiling around 50fps can come from.
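To make those ceilings concrete, here's a minimal Python sketch. The 20ms full-contrast figure is an assumed worst case for illustration, not a quoted panel spec.

```python
# Two separate ceilings on what a panel can actually show:
#  1) the refresh rate (a hard electronic limit, e.g. 60 Hz), and
#  2) the pixel response time (how fast a pixel finishes changing colour).

def response_fps_ceiling(response_ms: float) -> float:
    """Max frames/sec at which pixels can fully settle between frames."""
    return 1000.0 / response_ms

REFRESH_HZ = 60  # typical LCD refresh limit

for label, ms in [("2 ms grey-to-grey (best case)", 2.0),
                  ("20 ms full contrast (assumed worst case)", 20.0)]:
    ceiling = response_fps_ceiling(ms)
    shown = min(ceiling, REFRESH_HZ)
    print(f"{label}: {ceiling:.0f} fps pixel ceiling, "
          f"{shown:.0f} fps effective on a {REFRESH_HZ} Hz panel")
```

In other words, the advertised grey-to-grey number is rarely the binding limit; the refresh rate usually is.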
However, dinosaur CRTs have an advantage over newer LCD panels in frame rates: many CRTs can happily display 130 frames per second or more (normally depending on resolution), and with no response times to worry about.