Solved

Frame rate confusion.

Graphics & Displays
January 7, 2013 4:53:17 PM

Other sites indicate that a game must display at least 25 fps to be playable. For modern games that means a fairly expensive setup, but this minimum seems too high.

For most of the twentieth century animations were projected at 24 fps, but each cel was displayed twice, for a rate of 12 different frames per second. Even a modest computer system today can run modern games at more than 12 unique fps and an overall display rate of 60 fps.

Why is the minimum for computers more than double that of films?


January 7, 2013 5:22:43 PM

It's basically down to film using motion blur, as I understand it. Film was only ever shot at 24 FPS out of convenience for editing, and because that balanced convenience with the cost of film stock.

These days some games do use a version of motion blur, but it's not the same as how it is used in film. A lot of it is down to point of view and known motion direction, which is not as fixed in a game as it is in a film, which means you need more FPS for it to look smooth.

There are no set numbers to this as far as gaming is concerned. You might need totally different settings from the next guy to get a smooth-looking experience in a game. It's game dependent as well, with some games seeming to handle playing at lower FPS much better than others.

I have tried to keep this simple.

Hope it helps.

Mactronix :) 
January 7, 2013 5:40:53 PM

Try it yourself - play a game with the framerate capped at 12 frames a second. Your monitor, of course, will still be refreshing 60 times a second - it always does. Now raise the framerate cap to 60. It makes a HUGE difference.

Here's a bit of extracurricular:
http://www.threadmeters.com/wow/v-84AQ/quoteyes_can_onl...
January 7, 2013 5:45:05 PM

My thing is people say they can't see a difference between 30 and 60 fps. I can: at 30 frames there are very small jerks in between motion, while 60 just looks smoother. Try playing Need for Speed at 30, then play it at 60, and you will see the difference.
January 7, 2013 5:52:50 PM

Keep in mind that the shutter speed on the camera filming will generally match the amount of time the frame is displayed on the screen for a movie. This produces very accurate motion blur in film, whereas in most cases for games each frame is very distinct from the next, making them much easier to distinguish. Fake motion blur techniques can sometimes mask this, but they generally look really fake...

Also, a constant frame rate is usually perceived as much smoother than a constantly changing one. For a movie, seeing 24 FPS all the time seems smooth because you get used to the way movies look. If you are in a closed-in indoor scene in a PC game that's running at 60 FPS and you run out through a door into an open vista battle where you drop to 20 FPS, you really notice the drop.

Add to this that frames on a PC are generally displayed in sync with the monitor refresh rate, so individual frames are on screen for different lengths of time whenever the frame rate being supplied doesn't evenly divide the screen's refresh rate. As an example, 20 FPS on a 60 Hz screen will just 'feel' smoother than 24 FPS on a 60 Hz screen with Vsync on, because at 20 FPS each frame is displayed for 50 ms (3 screen refreshes), whereas in the 24 FPS scenario some frames are on screen longer than others (alternating 2 and 3 refreshes). You don't consciously notice this, but it makes things seem off in some way. This is the main reason for 120 Hz TVs, which can display 24p, 30p and 60p content in a whole number of screen cycles, where a 60 Hz TV has a judder effect when displaying 24p content. Again, you don't really notice it as such, but it feels/looks wrong when you see it.

Edit: as a visual reference, try this: http://boallen.com/fps-compare.html
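The vsync arithmetic in that post can be checked with a short sketch (Python; `frame_cycles` is a made-up helper name, not from any real tool): under vsync each rendered frame is held for a whole number of refresh cycles, so a frame rate that doesn't divide the refresh rate evenly produces uneven frame times.

```python
import math

def frame_cycles(fps, refresh_hz=60, n_frames=12):
    """How many refresh cycles each frame stays on screen under vsync.

    Frame i becomes available at time i/fps and is shown from the first
    refresh at or after that moment until the next frame's first refresh.
    """
    starts = [math.ceil(i * refresh_hz / fps) for i in range(n_frames + 1)]
    return [starts[i + 1] - starts[i] for i in range(n_frames)]

print(frame_cycles(20))                  # [3, 3, 3, ...] every frame 50 ms: even pacing
print(frame_cycles(24))                  # [3, 2, 3, 2, ...] alternating ~50/33 ms: judder
print(frame_cycles(24, refresh_hz=120))  # [5, 5, 5, ...] 24p fits a 120 Hz screen evenly
```

The last line is the 120 Hz TV point: 24 divides 120 evenly, so every frame gets exactly five refresh cycles.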
January 7, 2013 5:58:32 PM

Traciatim said:
Keep in mind that the shutter speed on the camera filming will generally be equal to the amount of time the frame would be displayed on the screen for a movie...


Which is why I love PC gaming :) 
January 7, 2013 6:18:57 PM

Murray B said:
Other sites indicate that a game must display at least 25 fps to be playable... Why is the minimum for computers more than double that of films?


The Hobbit was filmed at 48 fps. Audiences at the screenings complained that it was *too* real, i.e. the props and costumes weren't able to keep up with the clarity of the picture. For example, Gandalf's staff looks like gnarled wood at 24 fps. At 48 fps, however, it looked like what it was... a plaster prop.

Also, the statement that the human eye can only perceive 30 fps is a vast oversimplification at best, focusing on just one aspect of sight.

http://www.100fps.com/how_many_frames_can_humans_see.ht...
January 7, 2013 6:28:41 PM

determinologyz said:
My thing is people say they cant see a difference between 30 and 60 fps..I can..30 frames its like very small jerks inbetween motion and 60 just looks smoother.

That's high frame latency giving you the jerkiness. The FPS may be 30, but some frames are taking longer to render than others, giving the impression of a low, jerky framerate. You must have an AMD card ;) 
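That point - same average FPS, different pacing - can be illustrated with a toy sketch (the numbers are invented, not measurements): the jerky run reports the same average as the steady one, but its worst frame times are what you actually feel.

```python
# Two synthetic runs of per-frame render times, in milliseconds.
even = [33.3] * 30          # steady ~30 FPS
jerky = [16.7, 50.0] * 15   # same average, but alternating fast/slow frames

for name, times in (("even", even), ("jerky", jerky)):
    avg_fps = 1000 * len(times) / sum(times)
    print(f"{name}: avg {avg_fps:.0f} FPS, worst frame {max(times):.1f} ms")
# Both report ~30 FPS on average, but the jerky run's worst frames last
# 50 ms each -- a 20 FPS hitch that an FPS counter averages away.
```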
January 7, 2013 6:36:59 PM

iam2thecrowe said:
thats high frame latency giving you the jerkiness. fps may be 30, but some frames are taking longer to render than others, giving the impression of low/jerky framerate. you must have an amd card ;) 



I have a GTX 660 Ti. But in general, gaming at 60 frames runs better than 30, and from what I heard ATI has problems in that area.
January 7, 2013 7:01:59 PM

determinologyz said:
I have a GTX 660 Ti. But in general, gaming at 60 frames runs better than 30, and from what I heard ATI has problems in that area.

Yeah, the problem isn't that bad, I was just stirring. But the 7950 does seem to have a more significant problem with frame latency than other cards.
January 7, 2013 7:06:21 PM

iam2thecrowe said:
yeah, the problem isnt that bad, i was just stirring. but the 7950 does seem to have a more significant problem with latency than other cards.

What about the 7970, and would you say the 8970 and the 8990 would have the same problems?
January 7, 2013 7:19:37 PM

mactronix said:
...I have tried to keep this simple

Hope it helps.

Mactronix :) 

Thank you for responding, mactronix, but that is not what I am looking for. I will try to explain in more detail why I am trying to understand the meaning of these specifications.

Last year I found out that SWTOR could be downloaded and played for free. Having played most of these games since Dark Forces for DOS, I was so excited that I forgot to check the minimum requirements. The game ran sluggishly at first, but after setting Windows to optimize for performance and choosing 1024 x 600 resolution my system managed 15 - 30 fps, averaging about 18. Since vision persists at only 12 fps, I found the game very playable and thought I was playing it just fine.

Towards the end of December I went on the forums to find out why the game stutters in certain areas and at certain times of day. The answer was server lag, and there is nothing end users can do about it. At the same time I discovered that my E-450 based machine is far below the minimum specification. Most "experts" seem to agree that my machine cannot run modern games at all.

Now I'm worried that SWTOR might be overloading my machine somehow. Gameplay is fine, but my BrazosTweaker tool indicates that the GPU temperature rises from a normal 42 degrees C to as high as 57 degrees C while playing the game. So far I have been unable to discover the maximum allowable temperature of an HD 6320.

Since I need the computer for other things and cannot afford to replace it any time soon, I will just have to stop playing SWTOR until I figure out why my machine is not supposed to be good enough to run modern games. This is hard for me because I am retired and my health prevents me from going outside right now, so playing SWTOR was a good way to pass the time.

Best solution

January 7, 2013 8:21:14 PM

The issue here is that you are gaming on a machine that is not really designed to play games properly. It helps that the minimum specs for SWTOR are very low; this is enabling you to just about manage to play the game.

Minimum specs are generally accepted to mean the game will just about play on the hardware. Recommended specs are what's required to play most games to a reasonable level.

It's testament to how well the AMD APU packages perform that the game plays at all, in my opinion.

Temperature wise you are fine; the max temperature is 90 degrees C.


Mactronix :) 
January 7, 2013 8:36:43 PM

Vision does not persist at 12 fps. A movie shown at only 12 fps would look like, well, an old movie, where you pretty much see the gaps between the frames. At 24 fps you are getting close to smooth motion, with much more blur between the frames, so the gaps are not as detectable. Showing each image on two frames at 24 fps is still a 24 fps display rate, not 12 fps.

You are confusing the number of frames an image spans with the number of frames shown per second. The more frames an image spans, the longer it's on the screen, which makes character movement appear slower or faster. The more frames per second you see, the more blurred together the individual frames are, creating a more seamless experience.
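That distinction - display rate versus unique images per second - can be made concrete with a toy sketch (Python; the names are invented): projecting twelve drawings "on twos" gives a 24 fps display rate but still only 12 unique images per second.

```python
drawings = [f"cel_{i}" for i in range(12)]         # one second of animation drawn "on twos"
projected = [d for d in drawings for _ in (0, 1)]  # each cel occupies two projected frames

print(len(projected))       # 24 frames pass the projector gate per second
print(len(set(projected)))  # but only 12 of them are unique images
```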
January 7, 2013 9:03:55 PM

mactronix said:
What the issue here is that you are gaming on a machine that is not really designed to play games properly... Temperature wise you are fine, the Max temperature is 90. Mactronix :) 

Thank you for the data, mactronix, it is a relief to know that I am not overheating the system. My fear was that the doubled fps requirement had something to do with having headroom, and that running SWTOR would be harmful to my machine. Now I can play without worry, and I thank you for that.

Before I retired I used to teach, and one of my students brought in his alien-made laptop to show me. That was an impressive gaming machine, but it cost about three times as much as mine did. As it was, being on a fixed income, the E-450 was the best I could afford, but I would still really rather have one of those alien machines. Tradesies, anyone?
January 8, 2013 7:54:54 AM

Don't think you're going to get many takers.

Glad to have helped.

Mactronix :) 
January 8, 2013 10:46:45 AM

I used BrazosTweaker to overclock my E-350 a bit. It also works for the 450. As always, keep an eye on your temps if you want to use this. :) 

http://code.google.com/p/brazostweaker/

good luck!
January 8, 2013 6:45:20 PM

mactronix said:
Don't think you're going to get many takers...

Well, I think you are right. My nephew won't trade my folding, dual-screen phone with extendable, replaceable antenna for his mono-screen, antenna-less i-thing either. Such is life, I guess.
quilciri said:
I used brazostweaker to overclock my E-350 a bit. It also works for the 450. As always, keep an eye on your temps if you want to use this:...

It probably isn't possible to overclock an E-450. Even if you set the clock to 3.2 GHz, the CPU still doesn't process information any faster. Overclocking is not something I want to do anyway, lest my machine suffer the fate of tens of millions of Xbox 360s. Those machines failed within about three years of average use, probably because of excessive GPU temperatures.

Thirty years ago it was common knowledge among the technically minded that for every 7 degrees C a semiconductor's temperature rose, its remaining life expectancy was halved. This knowledge seems to have been mostly forgotten.

SWTOR runs great for me at 18 unique frames per second, and that is much better than the 12 ufps that most cartoons ran at. I am planning to add another SODIMM because that should increase sustained memory throughput by about 20%. This isn't to improve SWTOR, but simply because parts can be hard to find after too much time has passed, and it is easy to find them now.
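For what it's worth, the 7-degree rule of thumb quoted above is easy to turn into numbers. This is a sketch of that folk guideline only, not a datasheet figure, and `relative_life` is a hypothetical helper:

```python
def relative_life(delta_t_c, halving_step_c=7.0):
    """Remaining life relative to baseline, if life halves per 7 C of temperature rise."""
    return 2 ** (-delta_t_c / halving_step_c)

# Under that rule, running at 57 C instead of 42 C (the temperatures
# reported earlier in the thread) would leave roughly a quarter of the
# life expectancy you'd have at the lower temperature.
print(f"{relative_life(57 - 42):.2f}")
```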
January 9, 2013 4:21:35 PM

Best answer selected by Murray B.