
FPS: When is enough, enough?

December 11, 2007 1:08:20 PM

They say in gaming FPS is life, so what FPS can you, or do you, live with when it comes to playing games at an enjoyable level (eye candy) with *smooth gameplay?

How do you guys set up your games for single player and for online/multiplayer?

I may play single player at 1280x1024 or higher, but online or in multiplayer I usually either drop to 1024x768, or stay at 1280x1024 and remove all AA and AF, to reduce lag.


*My definition of smooth gameplay is no frame drops below 50 FPS, so the minimum frame rate is 50. Also remember that a game will vary its frame rate due to a number of factors: the action in the current scene, background tasks on the PC, etc.
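
(For anyone who wants to check their own numbers: a rough Python sketch, purely illustrative and not taken from any benchmarking tool, that turns a list of per-frame render times into average and worst-case FPS, since an average above 50 can still hide drops below it.)

# Illustrative only: average vs. worst-case FPS from per-frame render times.
# frame_times_ms is a made-up sample; in practice you'd log these from an FPS counter.
frame_times_ms = [16.7, 18.2, 15.9, 22.5, 17.1, 30.4, 16.4]

fps_per_frame = [1000.0 / t for t in frame_times_ms]            # instantaneous FPS of each frame
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # overall average
min_fps = min(fps_per_frame)                                    # the dips are what you actually feel

print(f"average: {avg_fps:.1f} FPS, worst frame: {min_fps:.1f} FPS")

With those sample numbers it prints an average of about 51 FPS but a worst frame near 33 FPS, which is exactly the kind of dip the 50 FPS floor is meant to rule out.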


December 11, 2007 1:29:09 PM

I play all games (sans Crysis, of course) at maximum settings @ 1280x1024. I'd also say the absolute minimum acceptable FPS for me during any kind of intense fighting or action is 30. I prefer 40-50 and can't really notice any difference when my FPS goes above that, so that's my happy zone.
December 11, 2007 1:29:15 PM

You nailed it right there. I prefer frame rates (50+) to eye candy. ET: Quake Wars is capped at 30 and seems to be OK, so it's all about what game it is...

For strategy games lower FPS is more acceptable (as low as 30 depending on the game... hell, some strategy games/MMOs are capped there anyway)...
December 11, 2007 1:42:06 PM

depends on the monitor.

On my 24", playing an FPS, anything less than 1920x1200 at 30 fps isn't acceptable.
December 11, 2007 1:43:24 PM

I really don't mind as long as it is a smooth and consistent FPS at or above 30. Games that are capped at 30 are fine by me, just as long as it doesn't go under that limit.
December 11, 2007 1:43:28 PM

cleeve said:
depends on the monitor.

On my 24", playing an FPS, anything less than 1920x1200 at 30 fps isn't acceptable.


Yeah, lower res on a big screen is a pain...
December 11, 2007 2:17:30 PM

I typically play with graphics maxed at 1280x1024 without AA or AF. 30+ is good for non-FPS games, but if I'm playing an FPS I want at least 40 FPS or I can't stand playing the game. Unless of course it's Crysis :) 
December 11, 2007 2:44:00 PM

On my 22" running 1680x1050, I hate dropping below 60 FPS in CoD4 multiplayer. I'm used to playing CS, TF2, CoD, UT2K4 all at over 100 FPS with my max set to 100. So to me... anything below 100 is annoying, but 60 is acceptable in online play.
December 11, 2007 3:07:27 PM

See, that's the trick of it... In an FPS you might have just the environment zipping by you at, say, 40 FPS... then, all of a sudden, you could have 10+ people shooting rocket launchers at you and zipping/exploding around your head (of course you would die from this, but for the sake of argument, you don't die... yet). You're probably going to get some serious lag from this. I don't know, say 5 FPS is what you end up with while trying to save yourself. To make up for that, wouldn't you have to start at 70-80 FPS at least?
December 11, 2007 3:14:25 PM

40 seems to be my sweet spot, though I have been known to tolerate 25.
December 11, 2007 5:03:53 PM

Anything above 15 is good in my book!
December 11, 2007 5:15:58 PM

My monitor is a 20 in. widescreen with an optimum resolution of 1680x1050. I never notice any difference in FPS when it gets over 60.
December 11, 2007 5:20:49 PM

You wouldn't. LCDs have a refresh rate of 60 Hz. Usually you should have vsync enabled in your software to limit the FPS output to 60, since pushing out more frames than the display can show causes tearing.
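
(For illustration only, and not how any real driver or engine implements vsync: a toy Python loop that caps a render loop at an assumed 60 Hz refresh by sleeping off whatever is left of each frame's time budget.)

import time

REFRESH_HZ = 60                    # assumed display refresh rate
FRAME_BUDGET = 1.0 / REFRESH_HZ    # ~16.7 ms per frame at 60 Hz

def render_frame():
    time.sleep(0.005)              # stand-in for the real rendering work

for _ in range(10):                # a few frames, just to show the idea
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:     # finished early: wait it out, like a crude frame cap
        time.sleep(FRAME_BUDGET - elapsed)

Real vsync waits on the display's actual refresh signal rather than a timer, but the effect on frame rate is the same: it never goes above the refresh rate.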
December 11, 2007 7:17:27 PM

Quote:
On my 22" running 1680x1050, I hate dropping below 60 FPS in CoD4 multiplayer. I'm used to playing CS, TF2, CoD, UT2K4 all at over 100 FPS with my max set to 100. So to me... anything below 100 is annoying, but 60 is acceptable in online play.
Unless you have a 100 Hz monitor... you won't actually see 100 fps. I think maybe you find it more annoying that the frame counter itself drops rather than actually noticing a decrease in performance.
December 11, 2007 7:31:23 PM

I had a CRT and ran it at a 100 Hz refresh rate. I just got my new computer and LCD about a week ago and I notice a huge difference.
December 11, 2007 7:45:22 PM

I highly doubt that it's actually the 60 fps that's bothering you; rather, like rgeist said, it's more likely that knowing it's 60 fps is what's bothering you. Especially with the games you're listing, 60 fps should be just as smooth as 100 fps, or 1000 fps for that matter.
December 11, 2007 7:51:03 PM

I hate to break it to the FPS junkies, but the human eye can only process 30 frames per second. Period. That's why so many official reviews cite 30 FPS as the magic number.

If your game is running above that (i.e., there are 30 or more actual frames per second being flashed across the screen), you won't be able to detect the difference.

However, if you're running at exactly 30 FPS (or thereabouts), there is the risk of dipping below the threshold, and into the choppies. That's why a lot of people prefer a 60 FPS average. However, if you're just talking second-to-second frame count, you can't do better than 30, as far as the eye is concerned.
December 11, 2007 8:04:34 PM

Yes, but doesn't COD4 or BF2 feel slow at 30/35 FPS, for example?

Or is the head not listening to the eyes in this case? :pt1cable: 
December 11, 2007 8:13:14 PM

The 30 FPS thing is actually a misconception.

Article: http://amo.net/NT/02-21-01FPS.html

Read the bold for a few highlights. (The whole article is like 2 or 3 pages)

Quote:

NVIDIA, a computer video card maker who recently purchased 3dfx, another computer video card maker, just finished a GPU (Graphics Processing Unit) for the XBOX from Microsoft. Increasing amounts of rendering capability and memory, as well as more transistors and instructions per second, equate to more frames per second in a computer video game or on computer displays in general. There is no motion blur, so the transition from frame to frame is not as smooth as in movies, that is, at 30 FPS. For example, NVIDIA/3dfx put out a demo that runs half the screen at 30 fps and the other half at 60 fps. The results? There is a definite difference between the two scenes, with 60 fps looking much better and smoother than the 30 fps.

Even if you could put motion blur into games, it would be a waste. The human eye perceives information continuously; we do not perceive the world through frames. You could say we perceive the external visual world through streams, and only lose it when our eyes blink. In games, an implemented motion blur would cause the game to behave erratically; the programming wouldn't be as precise. An example would be playing a game like Unreal Tournament: if motion blur were used, there would be problems calculating the exact position of an object (another player), so it would be really tough to hit something with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned, that is, the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel, each object is exactly where it should be in the set space and time.

...

This is where this article gets even longer, but read on, please. I will explain to you how the human eye can perceive much past the misconception of 30 FPS and well past 60 FPS, even surpassing 200 FPS.


Anyway, read it if you want. If not, don't try to flame me.
December 11, 2007 8:15:13 PM

COD4 or BF2 at 30/35 could very easily be dropping momentarily to less than 25 FPS, which would be noticeable.
December 11, 2007 8:19:11 PM

jjblanche said:
I hate to break it to the FPS junkies, but the human eye can only process 30 frames per second. Period. That's why so many official reviews cite 30 FPS as the magic number.

If your game is running above that (ie: there are 30 actual frames per second being flashed across the screen, or more), you won't be able to detect the difference.

However, if you're running at 30 FPS exactly (or around abouts), there is the risk of dipping below the threshold, and into the choppies. That's why a lot of people prefer a 60 FPS average. However, if you're just talking second to second frame count, you can't do better than 30, as far as the eye is concerned.


And I hate to break it to you, but you are just propagating a myth. I have spent many hours on the internet trying to find anything that supports what you just said and I can't, so if you have some info can you post a link?
This is not a new topic for the forums, as you may or may not know, but the truth of it is there are many more subtleties involved in how the brain-to-optic-nerve relationship works.
It just isn't possible to put a single number on it. You can say that most people won't see the difference at X, Y, Z, but if you take someone who has been gaming at an average of 70 FPS and put them in front of a monitor running at 30, trust me, they will know.
Mactronix
December 11, 2007 8:41:42 PM

I'm not going to argue about how many FPS the human eye can see. I agree that at 20, 40, 50 fps there are differences that are easy to see. But when you get to 60 fps and above, especially when you're talking about displays that can't even show more than 60 fps, I tend to think that it becomes more of an ego thing for people to say "my rig does xxx fps" or "I can tell the difference between xxx fps and xxx fps". Also, I believe that it has a lot to do with constant frame rates. To me Crysis looks a lot smoother at a steady 20 fps than something fluctuating between 25-40 fps. While I don't necessarily agree that 30 fps is the limit, I would take the article posted above with a grain of salt. Just because somebody wrote it, that doesn't make it true, especially with something that old and no real evidence to support it. Just read the section about motion blur, where the author makes it sound like such a thing would not only not have the desired effect in video games, but probably wouldn't even be possible (see Crysis).
December 11, 2007 8:53:44 PM

The human eye might be able to see 300 fps, but when a monitor's refresh rate is 60 Hz... which means it can only show 60 frames per second... what does it matter if your LCD can't update any faster than that?
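
(Just to put rough numbers on that; a throwaway Python calculation, nothing more, with the 100 FPS figure picked purely as an example:)

# At 60 Hz the panel shows a new image roughly every 16.7 ms.
refresh_hz = 60
refresh_interval_ms = 1000.0 / refresh_hz      # ~16.67 ms between display updates

# If the GPU renders at 100 FPS, a frame arrives every 10 ms, so without vsync
# some frames are only ever partially shown (tearing) and the extra work is never seen whole.
render_fps = 100
render_interval_ms = 1000.0 / render_fps

print(f"display updates every {refresh_interval_ms:.1f} ms, "
      f"GPU delivers a frame every {render_interval_ms:.1f} ms")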
December 11, 2007 8:55:09 PM

That's because it is a simulated blur and not an actual motion blur (a blur caused by you not focusing on a moving object). Trying to recreate an actual blur in a video game would be impossible without doing some of the following: 1.) making the texture of the object blurry, or 2.) simulating a blur by distorting the space around the object. All Crysis does is basically create a distorted area around the object.

I think the author of the article is basically trying to say that creating a true motion blur inside of a video game would be impossible. You just can't take a full high-quality image and move it across a screen @ 60 FPS and expect it to blur without some artificial tampering. All you have to do is focus your eyes on the moving object and you'll be able to make out details of it.

Quote:
But when you get to 60fps and above, especially when you're talking about displays that can't even show more then 60fps, I tend to think that it becomes more of an ego thing for people to say "my rig does xxxfps" or "I can tell the difference between xxxfps and xxxfps".
That's the point I was making above. You may be able to tell some small difference between the two IF your monitor can even refresh fast enough, but in most cases it comes down to physically seeing your FPS as an actual number and watching it drop. You don't like seeing it drop, so in turn you think it looks worse, or you get upset.
December 11, 2007 9:37:04 PM

rgeist554 said:
That's because it is a simulated blur and not an actual motion blur...

Well, isn't everything in a video game simulated? What's the difference between making a 2D image appear to be 3D and simulating motion blur? Both are playing a trick on the human eye, which isn't nearly as hard as the AMO article would have you believe. Just go to YouTube and search for optical illusions. Yes, under the right circumstances the human eye can do some incredible things, but when pushing the limits at high frame rates (even displaying 30 images in less than 1 second is pretty fast) in an environment specifically designed to fool the eye (video games), I tend to believe, as you do, that most people are full of crap when they say they can see the difference between 60 and 70 fps.
December 11, 2007 10:09:56 PM

jjblanche said:
I hate to break it to the FPS junkies, but the human eye can only process 30 frames per second. Period. That's why so many official reviews cite 30 FPS as the magic number.

If your game is running above that (ie: there are 30 actual frames per second being flashed across the screen, or more), you won't be able to detect the difference.

However, if you're running at 30 FPS exactly (or around abouts), there is the risk of dipping below the threshold, and into the choppies. That's why a lot of people prefer a 60 FPS average. However, if you're just talking second to second frame count, you can't do better than 30, as far as the eye is concerned.


Umm...what planet do you live on? Try playing F.E.A.R. or COD2 or most FPS games at 30fps. Then look at them at sixty. BIG difference. So I hate to break it to you, but what you said is a bunkload of crap. The human eye cannot detect much past 60 frames per second. That is why if you are running a game at 60fps or at 1000fps, they will both appear to be as smooth as each other. Of course at 1000fps you will most likely have tearing occur. I am no scientist, but wherever you learned that crap, it's wrong, and more people than you could imagine would agree out of personal experience.
December 11, 2007 10:24:10 PM

Seeing as 60 is the max for my monitor, I like hitting that with max settings in all games (22" with a GTS 320). I can accept 40-50 for first-person shooters, but I really want to keep it above 30 and would turn down the settings if it ever hit 20 (Crysis excluded). I can deal with 30-40 in RTSs like SupCom and will even be OK with 20 (which is what I tend to get when I crank up the AA to 8 or 16).
December 11, 2007 10:43:05 PM

I try for 60 fps, as constant as possible. I can always notice a drop below that, but in many cases it depends how well the game is programmed and how fast the gameplay is. If a game is above 30 fps but not above 60 fps, then I NEED it to stay constant. As in, if it is going to run @ 30 fps... I need it to be at 30 FPS at all times or it really bothers me. Most games are not programmed well enough to stay smooth on fluctuating frames imo, but Crysis is (thank god) or I wouldn't play it. I am very, very picky about frames, but also about running as close to full res as possible.

Best,

3Ball
December 11, 2007 10:56:28 PM

A bunkload of crap, you say? ;-) One must remember that when you're riding right on 30 FPS, it often dips below the threshold, and thus you see chop, like I said. I advocated 60 FPS average for that very reason.

Further, I'm not out to flame anyone, or propagate myths. An old biology TA of mine, who gamed with us via LAN (he was quite good, actually), noted something about how "in the real world" the eye doesn't operate via frames per second, which is obvious. The world is always there, feeding us a constant stream of sense data. Frames per second are irrelevant, because there are no frames per second. It's just there.

However, with visual media that is projected or rendered, different things come into play. He didn't go into specifics, but he said the way the screen oscillates (i.e., flashes, etc.) has an impact on the relationship between the brain and eye. Basically, the way in which screens work has a kind of paralyzing effect that limits the way information can be processed. I'm trying to explain this, but it's difficult to articulate, and indeed I don't know nearly all the details... for example, have you ever been in a room that has a malfunctioning fluorescent light, and it makes you dizzy or nauseous? Or a strobe light, for that matter? The same thing is going on.

This is a long-winded explanation, and probably doesn't make much sense. In my personal experience, around 60 FPS average is something good to shoot for.

All this raises the question: where are you guys getting your FPS readings? Are they averages, or on the fly?
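
(That distinction matters more than it looks. As a hypothetical sketch in Python, not how Fraps or any particular overlay actually works: an "on the fly" reading is just one divided by the last frame time, while most on-screen counters smooth it over roughly the last second, which hides short hitches.)

from collections import deque

class FpsCounter:
    """Toy FPS counter: instantaneous reading vs. a short rolling average."""
    def __init__(self, window=60):
        self.frame_times = deque(maxlen=window)    # seconds per frame, most recent last

    def add_frame(self, frame_time_s):
        self.frame_times.append(frame_time_s)

    def instantaneous(self):
        return 1.0 / self.frame_times[-1]          # based only on the latest frame

    def rolling_average(self):
        return len(self.frame_times) / sum(self.frame_times)

counter = FpsCounter()
for t in [0.016, 0.017, 0.040, 0.016, 0.015]:      # made-up frame times with one hitch
    counter.add_frame(t)
print(f"on the fly: {counter.instantaneous():.0f} FPS, "
      f"average: {counter.rolling_average():.0f} FPS")

With those made-up numbers the instantaneous reading says about 67 FPS while the average sits near 48, so the 40 ms hitch barely shows up in the averaged figure.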
December 11, 2007 11:07:43 PM

jjblanche said:
I hate to break it to the FPS junkies, but the human eye can only process 30 frames per second. Period. That's why so many official reviews cite 30 FPS as the magic number.

If your game is running above that (ie: there are 30 actual frames per second being flashed across the screen, or more), you won't be able to detect the difference.

However, if you're running at 30 FPS exactly (or around abouts), there is the risk of dipping below the threshold, and into the choppies. That's why a lot of people prefer a 60 FPS average. However, if you're just talking second to second frame count, you can't do better than 30, as far as the eye is concerned.


Common misconception. Biologically speaking, this "rule of 30" that people talk about isn't the cap on how many frames the human eye can process; it's about motion itself. Video is basically a set of pictures flashed one after another. At about 15 fps, the brain starts to process the pictures as motion. Approximately 30 fps is the ideal spot where that motion appears unbroken. The brain's ability to process this is in no way halted after 30, though. Personally, ideally, I like 60 Hz. I find that below that, eyestrain is all too common.
December 12, 2007 12:46:40 AM

As long as it never drops below 25 I'm fine; I'll choose eye candy over FPS any day.
December 12, 2007 1:47:52 AM

The fluctuation from, say, 100 to 30 is strenuous too, imo. Could be an anal problem (need to see a doctor, I guess) but it gets me 'confusious' or displaced. 40-50 fps constant would be satisfactory.
As for what you can perceive: in my perception, 100 fps on a 100 Hz CRT is minutely different from 75 fps (yet visible). As for 30 fps to 60 fps, there is no doubt, whether it's video-card output with or without vsync at 40+ fps, the local news, or a 24 fps movie. The key is consistency and becoming accustomed.
December 12, 2007 2:26:15 AM

Quote:
If your game is running above that (ie: there are 30 actual frames per second being flashed across the screen, or more), you won't be able to detect the difference


What is your vision? 1/20? All I know is that my eyeballz can detect the game running smoother at 60 than 30 :kaola: 


No offense meant dude :) 
December 12, 2007 2:37:07 AM

Just curious, but to those of you playing at 100 fps, what games/settings are you playing on? It seems to me that just to get those kinds of frame rates or higher, the quality would be so poor that there wouldn't be much to notice anyway.
December 12, 2007 2:49:57 AM

For PC games I want an average of 60FPS (this being the reason I've refused to buy Crysis) and on a console I can tolerate 30fps as framerates usually don't drop below that.
December 12, 2007 2:55:42 AM

Again... regarding human vision, the 30 fps myth comes from the movie industry. Films use a process called 'motion blur' which kinda makes the physical 30 fps look more like 60.

Crysis and some other games have post-processing options which vaguely replicate this, making the game at 20-odd fps quite playable.

However, in a pure test without any effects etc, the human eye can EASILY detect the difference between 30 and 60, but probably not 60 and 90.

As for FPS in games... it all comes down to the crunch: what numbers will you get at the worst times? IMHO, ~30 fps is the least for me. Any less and the chop starts to annoy me.

Holding a minimum of 60 in all games would be fantastic, but who can be farked spending $5k on a system and upgrading every 6 months?
December 12, 2007 3:41:11 AM

Depends a lot on the game... Some games won't look any different from 30 to 60 because the animations only contain 30 points of animation. Racing games are commonly like this. Other games contain hundreds of points in an animation; that is where you will notice the difference. CoD4 is an example of this. Basically it comes down to how good your monitor is, personal preference, and how lazy the designer was.
I can get more detailed, but I think this is sufficient.
December 12, 2007 4:03:35 AM

For the umpteenth time.

THERE IS NO SET LIMIT ON FPS !!

And it's nothing so pedestrian as 30 fps (be it fields or frames). The eye senses (limited individually by the firing/recharge/firing rate of the rods and cones, rods being the quickest) and the brain perceives at different rates, depending on adjacent stimuli and their properties, with the brain divided up into sections dedicated to certain features, like angularity; and both are affected by things like contrast, strobing or cross-current motion, change, etc.

Anyone who thinks they have a hard-and-fast rule really doesn't understand the complexity involved.

The 16/24/30/50/60 F/fps 'rules' are all MINIMUMS, not maximums, and they are accepted RECOMMENDATIONS/beliefs, not rules, about what works for MOST people in MOST situations given the aspects of the medium used. Perceived fluid motion on 50 ISO film is not going to happen at the same rate as on a digital monitor, or even on 3200 ISO film with no blur.

I think people should avoid the generalizations, just give the OP what works for them personally, and realize that it is just that, THEIR personal preference.
December 12, 2007 3:09:18 PM

As long as the person you're trying to shoot doesn't chop and jump across your screen in a multiplayer online FPS, you should be alright lol
December 12, 2007 3:31:59 PM

Doesn't really matter what FPS, I just like it smooth, as long as it doesn't jump up or down. Even then, I still managed to kill a squad of noobs in a 5v5 BF2 match @ 12 fps with my old PC (there was a bug... wouldn't surprise me if they were getting the same fps, but they still sucked; making me play a match on a Saturday, thinking us "young guys" wouldn't show, well, we showed them...).

Be it 12, 60, or 100, I am still lethal! Cave Adsum!
December 12, 2007 4:34:29 PM

I have two scenarios, both using the same PC (except video cards) and COD4, since it is a very up-to-date game. My PC consists of a P4 3.0 GHz 630, 2 GB PC-5400, a 7200 RPM SATA drive, and an Asus mobo with an 800 MHz FSB, plus a 22" Samsung SyncMaster (1000:1 contrast, 5 ms response, 60 Hz). Old PC but it works. My CPU is a bottleneck.

1. My old video card was an Nvidia 7600GT 256 MB DDR3. COD4 was getting 50-70 FPS with 0xAA, 0xAF, and most eye candy off at 1024x768, with lots of explosions in a full server. It was very playable and fun. It had smooth action, but no cool lighting or effects.

2. I just installed an ATI HD 3870 512 MB DDR4. I am getting 40-70 FPS with 4xAA, 4xAF, and all the eye candy, lights, and effects on or set to normal, at 1152x768. Looks awesome.

I am getting fewer FPS (due to my CPU) but it is much nicer to look at. I don't notice any loss of FPS, but I do notice everything is easier to see. The games are so much easier and more fun to play with anything over 40, especially with good res and effects.

I'll take eye candy and good graphics any day over FPS as long as it is in the 50ish range.
December 12, 2007 5:01:16 PM

I agree that 60 fps is the max for LCDs with a 60 Hz refresh rate. But try playing on a CRT at 100 Hz then switching to an LCD at 60 Hz and you can notice a difference. Even if the game stays the same, you notice something isn't quite right.

The 30 fps myth thing... I agree you can see more, but I don't know how much more.

If anyone has ever seen a piece of film shot at 30 fps, then the same thing shot at 1000 fps, it seems like the film is in super slow motion, when actually you're just seeing more of the clip, so it seems like it's slowed down but it's really not.
December 12, 2007 6:01:07 PM

High-speed cameras record at 1000 fps but play back at 30, so you are seeing it in slow motion. The way film works is much different from games; there is no blur, as mentioned above. To overcome this with games you just run higher fps (fools the eye nicely)... The eye does not work in frames; it's more along the lines of noticing what is happening and focusing on one thing at a time... and scanning to see what's around... it's quite confusing to think about how your eye works...
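
(The slow-motion factor is just the ratio of capture rate to playback rate; a quick back-of-the-envelope Python line, nothing official:)

capture_fps = 1000      # what the high-speed camera records
playback_fps = 30       # what the footage is played back at
slowdown = capture_fps / playback_fps
print(f"about {slowdown:.0f}x slower: 1 real second takes ~{slowdown:.0f} seconds on screen")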

Also, as said... it's more about the drop... Phantasy Star Online (an older MMO) was locked at 30, but that's where it stayed, so even at 30 it was not too painful to play. But I can still notice it... Phantasy Star Universe (when set to high) unlocks it to 60 fps. There is quite a difference from 30 to 60 where smoothness is concerned...

Since I play all games with vsync anyway, 60 (or 75) is about it... I will admit that back with my old CRT @ 120 Hz it was damn smooth, but LCDs do not refresh like CRTs yet...
December 12, 2007 6:50:58 PM

jjblanche said:
I hate to break it to the FPS junkies, but the human eye can only process 30 frames per second. Period. That's why so many official reviews cite 30 FPS as the magic number.

If your game is running above that (ie: there are 30 actual frames per second being flashed across the screen, or more), you won't be able to detect the difference.

However, if you're running at 30 FPS exactly (or around abouts), there is the risk of dipping below the threshold, and into the choppies. That's why a lot of people prefer a 60 FPS average. However, if you're just talking second to second frame count, you can't do better than 30, as far as the eye is concerned.


[EDIT] I just realized that like 50 other people have already trashed this myth. Ho hum, I'm late.
-cm
December 12, 2007 7:23:04 PM

I never have any issues that cause me to wonder what my FPS score is. I say "score". Like it's a game we play on the forumz.
"All games are maxed out in every way @ 1600x1024, 85 Hz." Crysis @ high of course, cuz 'o my DX9 card.
I don't play COD4, but from what I see in the reviews, a calculator watch could run that game at highest settings and still get 50+ fps.
No, seriously, I see the biggest difference in FPS in Oblivion. With even a 5 fps jump either way, it seems to be very different. Is it just me?
December 12, 2007 7:47:25 PM

No, it's not just you; a couple of frames can make the difference between choppiness or not in Oblivion, in my experience.
Mactronix
December 12, 2007 7:59:56 PM

Howdy!
December 12, 2007 8:25:07 PM

As long as I can aim my gun and shoot to kill, I really don't care.

Wonder how many FPS life runs at...