Gaming FPS and Movie FPS

Wonderwill

Distinguished
Sep 28, 2006
Movies are normally displayed at 24 or 30 FPS, and look perfectly fine. What's with gaming frame rates that range from the 50s to the 150s? Isn't this a little overkill? I am fine with a card that displays 45 FPS at a resolution of 1280 by 1024. Maybe I'm wrong, if the term FPS has different implications here.
 

mpjesse

Splendid
Here's why it's not (totally) overkill. Though a card may be able to produce an "average" frame rate of 50-120fps, the same card may dip below 30fps in very elaborate/complex scenes. For gamers (me included) that's extremely annoying, especially in FPS games, which require split-second reactions and decisions. If the game drops below 30fps at just the wrong time, it could mean getting fragged.

When shopping for a card you should always research the minimum FPS benchmarks, which THG (unfortunately) doesn't do often. Anandtech and Xbit do, though. Average FPS doesn't mean a whole lot; it's the minimum FPS that's probably the most important if you want that "awesome" gaming experience.
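
For instance, here's a toy illustration (all the frame times are invented, not from any real benchmark) of how a great-looking average can hide an ugly minimum:

[code]
# Hypothetical per-frame render times (milliseconds) for one second of
# gameplay: mostly fast frames, plus a few slow "complex scene" frames.
frame_times_ms = [10] * 80 + [40] * 5   # 80 easy frames, 5 heavy ones

total_seconds = sum(frame_times_ms) / 1000.0       # exactly 1.0s here
average_fps = len(frame_times_ms) / total_seconds  # 85 fps on paper
minimum_fps = 1000.0 / max(frame_times_ms)         # 25 fps in the worst frame

print(f"Average FPS: {average_fps:.0f}")   # looks great in a review chart
print(f"Minimum FPS: {minimum_fps:.0f}")   # the stutter you actually feel
[/code]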

Does that make sense?
 

Blacken

Distinguished
Aug 27, 2004
Alright! I get to be the first post 8) [edit] damn you barney! [/edit]
Saw an article on this a while back, 2 or 3 years ago. It pointed out that the major difference is that a movie camera (unless you're watching the Blair Witch Project) doesn't move from side to side a lot, and when it does, it creates a 'blur' type of filler which gives your eye the illusion of fluidity. Most games today have no blur effect; it's raw frame-to-frame. Therefore, more FPS is needed to give your eyes the "filler". The article also stated that most movies are predictable, heh... I guess that's where YOUR cpu (your brain) does the work.
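
A toy model of that blur idea (made-up numbers, and not how any real camera or engine is implemented): a film frame effectively records everything that happens during its exposure, while a raw game frame is one frozen instant:

[code]
# Toy model: an object crossing the screen at 600 pixels/second.
def position(t_seconds):
    return 600.0 * t_seconds

# A game frame is one instantaneous snapshot: razor sharp.
game_frame = position(0.5)

# A film frame exposes for roughly 1/48s (a typical 180-degree shutter
# at 24fps), so the object smears across every position it passed through.
exposure_s = 1.0 / 48.0
samples = [position(0.5 + i * exposure_s / 20) for i in range(21)]
smear_start, smear_end = min(samples), max(samples)

print(f"Game frame: object at exactly {game_frame:.1f}px")
print(f"Film frame: object smeared from {smear_start:.1f}px to {smear_end:.1f}px")
[/code]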
 

Wonderwill

Distinguished
Sep 28, 2006
Thanks for the explanation! But where can I find out the average FPS a game requires to retain fluidity? And which types of games demand the most?
 

niz

Distinguished
Feb 5, 2003
Anything beyond what gets you a consistent 60fps in all cases is often redundant, as many larger LCD panels only support a 60Hz refresh rate at native resolution (especially if you use DVI).

Your GPU could be doing 9999 fps, but it doesn't make any difference if your monitor is only doing 60.
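
A rough sketch of that point (the refresh and GPU numbers are just examples):

[code]
# The frames that reach your eyes are capped by the monitor's refresh rate.
monitor_refresh_hz = 60

for gpu_fps in (30, 60, 120, 9999):
    displayed_fps = min(gpu_fps, monitor_refresh_hz)
    wasted = gpu_fps - displayed_fps
    print(f"GPU renders {gpu_fps:>4} fps -> you see {displayed_fps} fps "
          f"({wasted} frames/sec never make it to the screen)")
[/code]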
 

CompuTronix

Intel Master
Moderator
The following is part of something I wrote for another thread, and I believe it applies here:

"An Optometrist will tell you that the Flicker Frequency Threshold for the human eye is 48, so we rarely detect faster image rates. This is why the video industry can squeak by with 30FPS interlaced, and is also why most people don't see the 60Hz flicker in florescent lights, and won't see the flicker in a CTR at 60Hz refresh rate. For flat panel LCD monitors, this is 16.7mS response time.

Remember that when you increase refresh rate (72 vs 60) and/or image matrix (1280 vs 1024), you decrease FPS. The goal is to design a rig that can consistently render FPS that never fall below 45. Anything above 60Hz is meaningless to the human eye, although the numbers are impressive for benchmarking and bragging rights."
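
That 16.7ms figure is simply the frame time at 60Hz, by the way; for example:

[code]
# Frame time is just the reciprocal of the rate: at 60Hz, each frame
# gets 1/60th of a second on screen, i.e. the 16.7ms quoted above.
for rate_hz in (24, 30, 48, 60, 72, 100):
    print(f"{rate_hz:>3} Hz -> {1000.0 / rate_hz:.1f} ms per frame")
[/code]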

Hope this helps to answer your question.
 

runt23

Distinguished
May 29, 2006
The reason why video games need higher FPS is as follows.

For example...
Frames for a film:
1 - Guy standing in the middle
2 - Guy starts to move to the right
3 - Guy is moving to the right
4 - Guy is now on the right of the screen
As a movie, you can print this on 4 individual (well, more than 4 obviously) frames.

Game frames:
1 - Guy standing in the middle
2 - Guy starts to go left (guy then decides to go right)
3 - Frame displays guy going left
4 - Guy is now all the way on the right, and you missed the part where he was in the middle because the frames didn't update fast enough to display the quick change.

The fast pace of changing items in the image creates problems. If you have 100fps you can more accurately display the moving objects in games. With a movie it isn't as important, because the frames don't have to keep up; they are already pre-printed and don't have to wait to display things that are happening in the present.
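
Here's a crude sketch of that (the player movement numbers are completely made up) showing how a fast reversal can fall between frames at a low frame rate:

[code]
# A player feints left for 20ms, then sprints right: a fast reversal.
def player_x(t_ms):
    if t_ms < 20:
        return -t_ms                  # quick feint to the left
    return -20 + 2 * (t_ms - 20)      # reversal, sprinting right

def frames_seen(fps, duration_ms=100):
    step_ms = 1000.0 / fps
    t, seen = 0.0, []
    while t <= duration_ms:
        seen.append(round(player_x(t)))
        t += step_ms
    return seen

print("100 fps:", frames_seen(100))  # the leftward feint shows up on screen
print(" 30 fps:", frames_seen(30))   # the feint falls between frames
[/code]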


I hope that made sense; there are some other things to it, and I'm not too good at explaining things.
 

mpjesse

Splendid
Dude... no offense, but you don't know what you're talking about. Refer to the first 2 posts for the right answers.

(did you just make that crap up?)
 

BigCharb

Distinguished
Oct 9, 2006
Sup, I was wondering, since you guys are talking about FPS: which GFX card is capable of displaying most games at a reasonable speed? Please don't say the 8800s. Thanx :)
 

Heyyou27

Splendid
Jan 4, 2006
Even though you don't want to hear the truth, the 8800GTX is the most capable GPU on the market.
 

kkmultes

Distinguished
Jun 26, 2006
The secret of good FPS in a game is a card that maintains the same number of FPS all the time, even if it's a low number like 20.

Like somebody said up here: at some moments the game's FPS can dip in very elaborate/complex scenes.

Even if you are playing most of the game above 100 fps, when you hit very elaborate/complex scenes you may fall to 80 fps, and that sudden frame drop is what feels bad.

So a steady 30fps is better than bouncing between 100 and 80fps.
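
To put toy numbers on that consistency idea (all invented, with the dip exaggerated down to 20fps so it stands out):

[code]
# Two made-up frame-rate traces: one locked low, one fast but spiky.
steady = [30] * 8                                 # locked at 30 fps
spiky  = [100, 100, 100, 20, 100, 100, 20, 100]   # fast, with nasty dips

def worst_swing_ms(trace):
    # Largest jump in frame time between two consecutive frames.
    frame_times = [1000.0 / fps for fps in trace]
    return max(abs(a - b) for a, b in zip(frame_times, frame_times[1:]))

print(f"steady 30 fps trace: worst swing {worst_swing_ms(steady):.1f} ms")  # 0.0
print(f"spiky trace:         worst swing {worst_swing_ms(spiky):.1f} ms")   # 40.0
[/code]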
 

mpjesse

Splendid
Well, since you're not willing to consider the 8800 GTS, the 7950GX2 or X1950XTX are your best bets for the highest FPS. The 7900GTX is also a good alternative, but the GX2 makes more sense on the nVidia side of things.
 

CompuTronix

Intel Master
Moderator
There is no secret, but for the benefit of others, let's go over this again, so it's simply and clearly stated:

(1) 30 FPS is MINIMUM acceptable gaming. (This means NEVER dip below, regardless of conditions).

(2) 45 FPS is adequate.

(3) 60 FPS or more is imperceptible.

Depending on what gaming software you run, a certain combination of GPU/CPU horsepower is required to meet requirement (1) above. The goal is to enjoy a visually smooth computer gaming experience, without feeling like we're trying to participate in a slide show.
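
In code form, those same three thresholds (and nothing more) look like this:

[code]
def rate_gaming_fps(fps):
    """Classify a frame rate against the three thresholds above."""
    if fps < 30:
        return "unacceptable (slide show territory)"
    if fps < 45:
        return "minimum acceptable"
    if fps < 60:
        return "adequate"
    return "smooth (anything extra is imperceptible)"

for fps in (25, 35, 50, 120):
    print(fps, "->", rate_gaming_fps(fps))
[/code]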

I hope this helps you to understand how frame rate applies to a gaming rig.
 

kkmultes

Distinguished
Jun 26, 2006
Those numbers may vary from person to person, though.
 

CompuTronix

Intel Master
Moderator
I don't like to quote myself, however:

An Optometrist will tell you that the Flicker Frequency Threshold for the human eye is 48Hz, so we rarely detect faster image rates. This is why the video industry can squeak by with 30FPS interlaced, and is also why most people don't see the 60Hz flicker in fluorescent lights, and won't see the flicker in a CRT at a 60Hz refresh rate. For flat panel LCD monitors, this works out to a 16.7ms frame time.

Remember that when you increase refresh rate (72Hz vs 60Hz) and/or resolution (1280 vs 1024), you decrease FPS. The goal is to design a rig that can consistently render FPS that never falls below 45. Anything above 60Hz is meaningless to the human eye, although the numbers are impressive for benchmarking and bragging rights.

If you prefer, you can google "Flicker Frequency Threshold", or simply ask your eye doctor.
 

CompuTronix

Intel Master
Moderator
To put it in different words, gaming frame rate will always fluctuate with continually changing game situations. A GPU/CPU combination powerful enough to continually maintain frame rates above 30 will sometimes render extremely high FPS, well above 60 and into triple digits, which is imperceptible and irrelevant to the human eye. These highest FPS numbers can be disregarded, except for benchmarking overall performance characteristics.

Again, the most important consideration for a gaming rig is never dipping below 30, and for a smoother gaming experience, preferably not below 45.
 
As stated above, as long as the frame rate stays above 30fps, the game will still have a fluid look to it. You are always going to have variance in your frame rate depending on the amount of data being processed. E.g.: when you are walking through a narrow hallway, your frame rate will be higher than when you have 5 enemies shooting at you in a cluttered warehouse. It would be much more desirable to have your frame rate stay in the 80-100fps range, dipping down to the 30fps range only under heavy load.