
What does frames per second really mean?

October 19, 2001 6:58:21 PM

I'm confused. I just found Dr. Pabst's article "3D Benchmarking Explained" (or some such), thinking that the question that has plagued me since I started shopping for a card would finally be answered. Alas, no. Y'all seem to know what's up, so tell me: isn't the film standard of 30 fps good enough for computer graphics display? I mean, if the brain can't detect individual frames at 30 per second, what good is a card that can display 127 fps at 10x7x32 bit? I look at card benchmarks, and often the worst card reviewed is displaying well over 30 fps at high resolutions. So, isn't all the power of the bestest and fastest (i.e. GeForce3 Ti 500) a waste if a card at a third its cost (i.e. GeForce2 Pro) is displaying framerates higher than the brain can distinguish (all the extra DirectX 8.1 stuff aside)?


October 19, 2001 7:15:19 PM

60 is really about the maximum human threshold, and anything that can handle 60 FPS at a resolution you like to use is probably a good card to have in your system.

There is a faction of people dedicated to squeezing every last frame out of a card and computer, and those are the people who are usually more interested in FPS than anyone else.

Usually, a very high FPS says more about the longevity of the card, as something that does 90 FPS now should be able to do 45-60 next year.

60 FPS, 70 FPS, 80 FPS Crash!
Daylight comes and I have to go to work :frown:
October 19, 2001 7:29:49 PM

NTSC is 29.97 FPS, but a computer is different than a TV: it's a lot easier to notice different frame rates. But you're right, 200 FPS in Q3 is a bit much... it's merely future-proof.

I post so you don't have to!
9/11 - RIP
Anonymous
October 19, 2001 8:25:16 PM

I don't watch movies on my 'puter, so I can't speak to that, but when you are gaming, the time delay between an event and your reaction to it is critical to success.

From my experience, anything above about 30 is very playable, but I actually do notice a difference in the behavior of games (not so much the appearance) as the frame rates increase. Nascar Racing 4 is a good example, where the stability of my car is influenced by a more powerful (faster) video card (all else the same), even though framerates are well above what the human eye can detect.

Bottom line: framerate is also an indication of how long your computer is waiting around for the next piece of information before it can display it.

There is a lot more going on than just pleasing the eye; your computer is VERY busy during gameplay. A faster video card can be thought of (although this is way too simplistic to be taken too literally) as a device that removes distractions from your computer while it thinks about stuff. So faster cards remove more distractions, and the computer does a better job.

Again, this is supposed to be an analogy, NOT a scientific explanation of what happens with better framerates.
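To put rough numbers on that "waiting around" point (a back-of-the-envelope sketch, with the frame rates picked arbitrarily): frame time is just the inverse of frame rate, so higher fps means less time the whole system gets to spend on each frame.

```python
# Frame time vs. frame rate: purely illustrative numbers.
for fps in (30, 60, 100, 127):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> {frame_time_ms:4.1f} ms per frame")
# 30 fps leaves ~33 ms of work per frame; 127 fps leaves under 8 ms.
```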
October 19, 2001 8:40:04 PM

Insidious, that seems to be confusing the platform performance and video card performance issues. Doesn't the CPU/bus determine the "behavior" of objects that the video card is displaying? I understand the effect display lag has on your ability to react appropriately in a game, but your car isn't actually more stable with a faster video card. Is it?
October 19, 2001 8:46:01 PM

Computer games used to use CPU speed as the determining factor for how fast they ran. So people playing on a 486 had a huge disadvantage compared to those playing on a 386. What? Isn't that backwards? No, because games weren't multiplayer back then. So if you go slower, you can get a better score :)

They fixed it now, of course. But imagine if you could move twice as fast with a GF3 as the guy who still has a VooDoo 3.
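The usual fix, for what it's worth, is to scale movement by how long each frame actually took rather than moving a fixed amount per frame; here is a minimal sketch of that idea, with the speed and timings made up:

```python
import time

speed_units_per_sec = 5.0        # hypothetical movement speed
position = 0.0

last = time.perf_counter()
for _ in range(10):
    now = time.perf_counter()
    dt = now - last              # seconds the previous frame took
    last = now
    # Scaling by dt means a fast card and a slow card both move the
    # player ~5 units per real-world second, just with different smoothness.
    position += speed_units_per_sec * dt
    time.sleep(0.016)            # stand-in for ~16 ms of rendering work

print(f"position after 10 frames: {position:.2f} units")
```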

I post so you don't have to!
9/11 - RIP
Anonymous
October 19, 2001 8:56:11 PM

hmmmm,

(Please forgive my technical inaccuracies, but this explanation seemed to match what I was experiencing.)

The way it was explained to me was this:

The computer generates information for the game based on inputs from you and the game software. This information has to be turned into pictures for the screen. As the computer works, it sends information to the video card for display, and then goes back to work on the next picture. But it can only send information to the video card once the card has displayed the last information it got and has room for more. The rest of the time the computer either waits or just throws away information from the last pass if the video card wasn't ready to receive it when that step was completed.

Better video cards do two things. Their memory is larger, so they can hold more information from the computer. Also, with more powerful graphics processing units, they can take over some of the computations required to turn raw information into a display.

Basically you have TWO computers working together to make the final result. The ultimate flow of information to the screen is only as fast as the slower of the two.
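A very simplified model of that hand-off, with invented timings (real cards and drivers pipeline this work, so take it only as a sketch of the bottleneck idea):

```python
# Toy model: each frame takes roughly as long as the slower of the
# two "computers" (the CPU preparing data, or the card drawing it).
def effective_fps(cpu_ms_per_frame, card_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, card_ms_per_frame)

print(effective_fps(cpu_ms_per_frame=8, card_ms_per_frame=20))  # slow card limits you to ~50 fps
print(effective_fps(cpu_ms_per_frame=8, card_ms_per_frame=6))   # faster card: now the CPU's 8 ms is the limit, ~125 fps
```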

While only 30 fps or so is required for the eye to think it has all the information it needs, in actuality information is being lost in between frames. More frames per second => less information lost.

There is A LOT going on in our minds to control our reactions, assess choices, etc. that is MUCH faster than our eyes' ability to detect frames. Don't forget much of the human effort is from the cerebral cortex (muscle memory), and that is one fast processor! Further, don't forget we have internet latencies that are additive with any processing time our machines use. It all adds up. Anything that can shorten the processing time helps.

How many fps are needed is not only a function of the eye, but of the game, your system, and the medium that connects players together, whether it is external (internet) or internal (motherboard).
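Those delays really are roughly additive. A back-of-the-envelope sum, with every number below invented for illustration:

```python
# Hypothetical delays in the chain between your input and the screen.
input_ms   = 10                 # input polling / game processing (made up)
network_ms = 50                 # a typical early-broadband ping (made up)

for fps in (30, 100):
    render_ms = 1000.0 / fps
    total_ms = input_ms + network_ms + render_ms
    print(f"{fps:3d} fps: {total_ms:.0f} ms from action to screen")
# Going from 30 to 100 fps trims ~23 ms off the total; the rest is unchanged.
```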

PS: Yes, I really do turn better laps since upgrading my video card. My son says he is experiencing the same thing. The upgrade I mention was from a GeForce2 GTS, which was no slouch (I am on a GF3 now). As another data point, I really see no appreciable difference in Half-Life (mods), which was already at the game's cap of 100 fps before the upgrade. Unreal Tournament... it's hard to say because I really suck at it!
Edited by Insidious on 10/19/01 05:00 PM.
October 19, 2001 9:23:18 PM

That makes sense.
October 20, 2001 12:06:00 AM

I agree with Insidious: higher frame rate does improve your ability to play. I'm a twitch gamer, and I've always noticed the difference between 30fps and 60fps, not because it looks different but because it feels different.
Also, it's important to remember that a 100fps average is just that, an average. It's not the empty scenes you have to worry about, it's the ones that are filled with enemies and action. An opponent's smoke grenade can be a fatal weapon if you're using a Voodoo 2, but it's just an annoyance when you have a GeForce 3.
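One way to see why the average can mislead: a handful of invented frame times where a few heavy, smoke-filled frames drag the worst case far below the average.

```python
# Made-up per-frame times in ms: mostly light scenes, a few heavy ones.
frame_times_ms = [8, 8, 9, 8, 40, 45, 9, 8, 42, 8]

avg_fps   = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")   # ~54 fps, looks fine on paper
print(f"worst:   {worst_fps:.0f} fps")  # ~22 fps right when the action peaks
```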

"Ignorance is bliss, but I tend to get screwed over."
October 20, 2001 1:03:46 AM

Mr. Random has pegged it :D 

Average framerate is not as important as the low framerate during critical action. Twitch games like first-person shooters require high framerates so that you never drop below an acceptable framerate (for me, that area is about 60 - 90 fps... you CAN tell the difference between 60 and 90 fps in these games). Racing games benefit from high FPS too... visual lag is death when you need to make hairpin movements and adjustments. Games look better and play better when you have a high framerate, which is why I always buy a gaming video card based on 3D performance.

"Laziness is a talent to be cultivated like any other" - Walter Slovotsky