
Why the demand for very high FPS in games?

Last response: in Video Games
July 28, 2008 8:26:40 AM

First of all, I'm not talking about overclockers or people who want to get hundreds of frames per second on the latest game. For me, I just want the game to be playable, because I don't have the money to compete with overclockers or buy really expensive hardware. I would love to be able to afford a high-end system and show off to my friends, but alas, I can't do that.

So anyway, I have noticed that in the benchmarks they suggest a game should have at least 30 fps (seems like 40 fps now) to be considered playable. Why?

I live in the US and for most of my life I have been fine with TV at roughly 30 fps (NTSC). I have heard that most people can't tell the difference for anything faster than 17 fps or something like that.

So why do games require so much more to be considered "playable"? I have played many games at around 25 frames per second and it seemed perfectly fine to me. I could blame my suckiness on choppiness (at around 25 fps), but I would be lying my ass off.

I know some people have super vision and can probably tell the difference between 17 fps, 25, 30 and so on but that's probably a very small percentage of the population. So why all the fuss about not getting 30 or 40fps?


July 28, 2008 8:57:25 AM

40 fps in a normal scene leaves some (normally just enough) headroom for when heavy scenes pop up. You wouldn't want to play Half-Life 2's first few stages at 25 fps (kinda playable), get halfway into the game, hit a graphically intense scene, and drop to 15-20 fps (definitely not playable).
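To put numbers on that headroom argument: frame rate is just the inverse of frame time, so running near the playable floor leaves no slack for heavy scenes. A quick illustrative sketch in Python (the numbers are examples, not measurements from the thread):

```python
# Frame time (ms) is 1000 / fps; headroom is how much extra per-frame
# rendering cost a heavy scene can add before fps drops below a floor.
def frame_time_ms(fps):
    return 1000.0 / fps

def headroom_ms(current_fps, floor_fps):
    """Extra per-frame cost absorbable before dropping below floor_fps."""
    return frame_time_ms(floor_fps) - frame_time_ms(current_fps)

# At 40 fps there are 15 ms of slack per frame before hitting a 25 fps
# floor; at 25 fps there is none, so any heavy scene pushes you under it.
print(headroom_ms(40, 25))  # 15.0
print(headroom_ms(25, 25))  # 0.0
```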
July 28, 2008 9:45:55 AM

romulus1 said:
So why do games require so much more to be considered "playable"? [...] So why all the fuss about not getting 30 or 40 fps?

I dunno. I've never really seen much more than 35 fps with my 690G... I reckon it's only noticeable below 12 fps... Except most people tend to mean games where speed is really critical. I normally play RTS and RPG games, where it doesn't seem to be as important. However, World in Conflict tends to drag my computer down to an unacceptable 7 fps =D
July 28, 2008 10:40:16 AM

romulus1 said:
So why do games require so much more to be considered "playable"? [...] So why all the fuss about not getting 30 or 40 fps?

Whilst not many people are sensitive enough to put a definite label on 25 fps versus 30 fps, there is a big and easily perceived difference in the smoothness of animation between a game running at 30 fps and one running at 60 fps. It's most easily spotted in running characters: at 30 fps you are more likely to see limbs "jumping" from one point to another rather than moving smoothly through the space between.

The reason low frame rates work for movies is the way they are shot: each frame is a "long exposure" photo, blurred by the motion over that slice of time. That blur is what tricks your eyes into seeing smooth motion even though the frame rate alone isn't high enough. Video game frames, however, are hard-edged, with no motion blur bridging one frame to the next; each is a new, solid picture. Human eyes are very sensitive to this sort of thing. Whilst a game is certainly playable at 30 fps (provided the minimum fps isn't too low), the smoothness of animation at 30 fps is poor.
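The film-versus-game difference can be illustrated numerically: a film camera effectively averages the scene over the exposure window, while a game renders one instantaneous sample per frame. A toy sketch, using a 1-D object position instead of real pixels (purely illustrative, not from the thread):

```python
# A film frame integrates motion over the exposure window (approximated
# here by averaging sub-samples); a game frame is a single hard sample.
def film_frame(position, t_start, t_end, subsamples=8):
    """Average object position over the exposure window (motion blur)."""
    dt = (t_end - t_start) / subsamples
    samples = [position(t_start + i * dt) for i in range(subsamples)]
    return sum(samples) / subsamples

def game_frame(position, t):
    """Single instantaneous sample: hard-edged, no blur."""
    return position(t)

pos = lambda t: 100.0 * t  # object moving at 100 units/sec

# Over a 1/24 s exposure the film frame smears the motion across the
# window, while the game frame sits exactly at the frame instant.
print(film_frame(pos, 0.0, 1 / 24))  # blurred: an average inside the window
print(game_frame(pos, 0.0))          # 0.0: crisp single instant
```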

The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In the experiment, a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft! That's the equivalent of 220 fps! Our vision is NOT limited to 30 fps; we don't see in frames at all.

There is an easily noticed difference in the smoothness of animation between 30 and 60 fps. Many people may not care about this difference, but it is very easy to see even with average vision. If your frame rate were a solid 30 with no deviation up or down, your gameplay would be fine, though many people would find the animation quality annoying. It's not so much a gameplay issue as an image quality issue. Minimum frame rates are the gameplay issue, and in most cases a high average frame rate reduces the chance of bad minimums.

Quote:
this question has been asked so many times it's unreal. However, TVs use prerecorded images and show them in a way that, although quite a low fps, causes the eye to blend them together into a fluid image.

Games, however, are not prerendered, and so require a minimum of around 30 fps (depending on the speed of the game's action) to maintain what appears as fluidity.

A big problem, and one a lot of people on this site are guilty of, is claiming you need such-and-such a processor to get decent fps or avoid "bottlenecking". It is complete nonsense of course, as monitors are for the most part limited to 60 fps by their electronics, and usually to less than that by the response times of the actual panels. Even with today's fast advertised response times, they are usually still not capable of displaying that many fps, unless anyone has evidence otherwise.


True, modern LCD displays are often limited to 60 Hz and at times by response times. Generally the response-time issue is reduced by one frame being similar to the preceding one, meaning pixels don't need to change as drastically as in a complete redraw. The quoted 2 ms grey-to-grey response time of a modern gaming LCD works out to a 500 fps limit, and less in the worst case where every pixel on the screen is redrawn with a completely contrasting new colour every frame, since full transitions are slower than the grey-to-grey figure.

However, dinosaur CRTs have an advantage over newer LCD panels in frame rates: many CRTs can happily display 130 frames per second or more :D  (depending on resolution), with no response times to worry about.
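The response-time arithmetic above is simple enough to sanity-check: if a panel needs a fixed number of milliseconds per pixel transition, the ceiling on fully resolved frames is just 1000 divided by that figure. A rough sketch under that (worst-case) assumption:

```python
# Upper bound on distinct frames a panel can fully resolve if every
# pixel needs one full transition per frame. This is a deliberate
# simplification: real full-contrast transitions are slower than the
# advertised grey-to-grey figure, and the refresh rate usually binds first.
def max_fps_from_response(response_ms):
    return 1000.0 / response_ms

print(max_fps_from_response(2.0))   # 2 ms grey-to-grey -> 500.0
print(max_fps_from_response(20.0))  # a slow 20 ms panel -> 50.0
```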
July 28, 2008 10:53:53 AM

^ which is why hardcore gamers/graphic artists (not me =P) use CRTs...

true colours and insta-imaging =)

The eye can't see that fast anyways =P

Then again, RTSes do require some good frame rates to run smoothly, though not as high as FPS shooters... That's why I'm worried about Mirror's Edge... 690G + Mirror's Edge = XP
July 28, 2008 2:42:34 PM

Thanks for the responses.

I guess I do play RTS and RPG games the most, which is probably why higher frame rates are not as important to me.

July 28, 2008 6:35:17 PM

I think it is more about consistency. TV is prerecorded, so it's a constant 25/30 fps, whereas games can vary. If you played games long enough at, say, 20 fps (having never played above 20), you'd come to perceive 20 fps as smooth. However, playing for example Crysis at 50-60 fps indoors, then walking outside and dropping to around 30, it will appear choppy.

My $0.02.
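That consistency point is easy to quantify from frame times: the same average fps can hide very different worst cases. A minimal sketch with made-up frame-time traces (the numbers are invented for illustration):

```python
# Two traces with the same average fps but very different consistency.
# Frame times are in milliseconds.
steady = [40.0] * 10            # a constant 25 fps
spiky = [20.0] * 9 + [220.0]    # mostly 50 fps, plus one big hitch

def average_fps(frame_times_ms):
    """Frames rendered divided by total time elapsed."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_fps(frame_times_ms):
    """Instantaneous fps during the single slowest frame."""
    return 1000.0 / max(frame_times_ms)

# Both traces average 25 fps, but the spiky one stalls to ~4.5 fps
# during the hitch, which is what the player actually notices.
print(average_fps(steady), worst_fps(steady))  # 25.0 25.0
print(average_fps(spiky), worst_fps(spiky))    # 25.0 and roughly 4.5
```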
July 28, 2008 6:36:30 PM

I was raised on Sonic the Hedgehog on the 16-bit Sega, and I can easily distinguish between 30 and 60 fps; around 40 I start to think it's choppy.

I do, however, play twitch shooters a lot, like Unreal Tournament 3 and the like. I'm pretty sure it's a skill you can "learn", and if you never play faster games then possibly you're blind to it.

I notice flickering on CRT TV sets; in fact it's noticeable enough that I rarely to never watch TV, as it makes my eyes tired.
To me it's like staring at a disco strobe.. :p

.. and I find the Xbox 360 unplayable since it easily dips to 20 fps... GTA4 is one serious lag fest on that thing.
July 28, 2008 9:09:47 PM

Play BHD and you will know what 25 fps is, lmao, but it's not something that bothers me.

At the end of the day, as long as the game runs and doesn't go so choppy that it's unplayable, I don't care.

I worry about gameplay, not fps.
July 28, 2008 9:23:49 PM

This topic has been beaten to death, but I'll throw my 2 cents in anyway. To me the most important thing is a stable frame rate. I'll take Crysis at a steady 20 fps over a varying 20-30 fps. I play MVP Baseball 2005, which being a few years old is nothing for my rig. However, since I normally get huge frame rates with it, any drop (which happens frequently during the game due to poor programming) causes a noticeable difference. I actually cranked the AA up to 16x, far beyond what the game needs, just to get a more stable frame rate.
July 31, 2008 7:11:14 PM

People care about fps because:

1. Bragging rights
2. Smoother gameplay (but there's no noticeable difference between 160 and 210)

I only expect extremely high fps in CS:S because it is such an old game; even a 6800 GT can get decent frames (around 60) on it. My 9800 GTX could get over 250 on the stress test at 1680x1050 with HDR, all settings high, and 16xQ CSAA. 17 fps on CS:S is WAY different from 25, which is WAY different from 30, which is WAY different from 190; but 95 isn't WAY different from 190. Maybe when you turn there's just a small choppiness, but I can't tell because I wear glasses. I've gotten 9, 15, 30, 40, and 60 fps on my laptop's Intel 945GM graphics accelerator, all way different from the 190 I get on my 9800 GTX.

30-40 fps is NOT acceptable unless you're playing crysis.

If most of the population of the world (75%) can't read the letters in a book, then they can't spot the difference between 16 fps and 190 fps either, but those of us with super vision are seriously unlucky.
August 1, 2008 2:25:03 PM

If it's a title on a console, 30 fps is fine as long as there aren't many frame-rate drops. On the PC I generally prefer 60+, as PC titles usually aren't as well optimized: it may be 60 fps in the room you're currently in, but step out of it and that frame rate may get cut in half.
August 23, 2008 9:52:32 AM

60 fps is all the human eye can see...

Anything over that is simply for bragging rights, and anyone who says they can tell the difference between 60 and 80 or something needs to go visit NASA or something to get tested.

30 fps is what standard TV runs at.

August 23, 2008 11:07:51 AM

^ NTSC or PAL will determine that...