FPS: When is enough, enough?

JonnyRock

Distinguished
Aug 7, 2004
117
0
18,680
They say that in gaming, FPS is life. So what FPS can you, or do you, live with when it comes to playing games at an enjoyable level (eye candy) with *smooth gameplay?

How do you guys set up your games for single player and for online/multiplayer?

I may play single player at 1280x1024 or higher, but online or in multiplayer I usually reduce this to 1024x768, or stay at 1280x1024 and remove all AA and AF, to reduce lag.


*My definition of smooth gameplay is no frame drops below 50 FPS, so the minimum frame rate is 50. Also remember that a game will vary its frame rate due to a number of factors: the action in the current scene, background tasks on the PC, etc.
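
For reference, a minimum frame rate is really a per-frame time budget the PC has to hit every single frame; a quick back-of-the-envelope conversion (Python, purely illustrative):

```python
# Frame rate vs. per-frame time budget
for fps in (30, 50, 60, 100):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {frame_time_ms:.1f} ms per frame")
# 30 FPS -> 33.3 ms, 50 FPS -> 20.0 ms, 60 FPS -> 16.7 ms, 100 FPS -> 10.0 ms
```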
 

rgeist554

Distinguished
Oct 15, 2007
1,879
0
19,790
I play all games (sans Crysis, of course) at maximum settings @ 1280x1024. I'd also say the absolute minimum acceptable FPS for me during any kind of intense fighting or action is 30. I prefer 40-50 and can't really notice any difference when my FPS goes above that, so that's my happy zone.
 
You nailed it right there. I prefer frame rates (50+) to eye candy. ET: Quake Wars is capped at 30 and seems to be OK, so it's all about what game it is...

For strategy games, lower FPS is more acceptable (as low as 30 depending on the game; hell, some strategy games/MMOs are capped there anyway)...
 

quantumsheep

Distinguished
Dec 10, 2005
2,341
0
19,790
I really don't mind, as long as it is a smooth and consistent FPS at or above 30. Games that are capped at 30 are fine by me, just as long as they don't go under that limit.
 

Eviltwin17

Distinguished
Feb 21, 2006
520
0
18,990
I typically play with graphics maxed at 1280x1024 without AA or AF. 30+ is good for non-FPS games, but if I'm playing an FPS I want at least 40 FPS or I can't stand playing the game. Unless of course it's Crysis :)
 

folius

Distinguished
Nov 20, 2007
109
0
18,680
On my 22" running 1680x1050, I hate dropping below 60 FPS in CoD4 multiplayer. I'm used to playing CS, TF2, CoD, and UT2K4 all at over 100 FPS with my max set to 100. So to me, anything below 100 is annoying, but 60 is acceptable in online play.
 

rocket_sauce

Distinguished
Oct 1, 2004
168
0
18,690
See, that's the trick of it. In an FPS you might just have the environment zipping by you at, say, 40 FPS. Then, all of a sudden, you could have 10+ people shooting rocket launchers at you, zipping and exploding around your head (of course you would die from this, but for the sake of argument, you don't die... yet). You're probably going to get some serious lag from this; say you end up at 5 FPS while trying to save yourself. To make up for that, wouldn't you have to start at 70-80 FPS at least?
 

ganpachi

Distinguished
Aug 16, 2007
74
0
18,630
You wouldn't. Most LCDs have a refresh rate of 60 Hz. Usually you should have vsync enabled in your software to limit the FPS output to 60, since rendering more frames than the display can show causes tearing.
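
Roughly speaking, a software frame cap just refuses to start the next frame early. Here's a minimal sketch of the idea (real vsync waits for the display's vertical blank instead of sleeping, so treat this as an approximation, not how any particular engine or driver does it):

```python
import time

TARGET_FPS = 60                      # assumed display refresh rate
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7 ms per frame

def run_capped(render_frame, num_frames=300):
    """Call render_frame() repeatedly, sleeping away leftover time so the
    output rate never exceeds TARGET_FPS."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                               # stand-in for the game's update + draw
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                     # cap: don't start the next frame early
```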
 

rgeist554

Distinguished
Oct 15, 2007
1,879
0
19,790
On my 22" running 1680x1050, I hate dropping below 60 FPS in CoD4 multiplayer. I'm used to playing CS, TF2, CoD, and UT2K4 all at over 100 FPS with my max set to 100. So to me, anything below 100 is annoying, but 60 is acceptable in online play.
Unless you have a 100 Hz monitor, you won't actually see 100 FPS. I think you find it more annoying that the frame counter itself drops than that you actually notice a decrease in performance.
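
A toy back-of-the-envelope model of that point (numbers are just illustrative): if the game renders 100 frames in a second but the panel only refreshes 60 times, the panel can only ever pick up 60 of them.

```python
# Toy model: how many distinct rendered frames a 60 Hz panel actually shows
RENDER_FPS = 100                     # what the game produces
REFRESH_HZ = 60                      # what the panel can display

shown = set()
for refresh in range(REFRESH_HZ):    # one second of refreshes
    t = refresh / REFRESH_HZ         # time of this refresh
    shown.add(int(t * RENDER_FPS))   # newest frame finished by that time

print(f"rendered {RENDER_FPS}, displayed {len(shown)}")   # -> displayed 60
```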
 

folius

Distinguished
Nov 20, 2007
109
0
18,680
I had a CRT and ran it at a 100 Hz refresh rate. I just got my new computer and LCD about a week ago, and I notice a huge difference.
 

purplerat

Distinguished
Jul 18, 2006
1,519
0
19,810
I highly doubt that it's actually the 60 FPS that's bothering you; rather, like rgeist said, it's more likely that knowing it's 60 FPS is what's bothering you. Especially with the games you're listing, 60 FPS should be just as smooth as 100 FPS, or 1000 FPS for that matter.
 

jjblanche

Distinguished
Nov 19, 2007
447
0
18,790
I hate to break it to the FPS junkies, but the human eye can only process 30 frames per second. Period. That's why so many official reviews cite 30 FPS as the magic number.

If your game is running above that (i.e., there are 30 or more actual frames per second being flashed across the screen), you won't be able to detect the difference.

However, if you're running at exactly 30 FPS (or thereabouts), there is the risk of dipping below the threshold and into the choppies. That's why a lot of people prefer a 60 FPS average. However, if you're just talking second-to-second frame count, you can't do better than 30 as far as the eye is concerned.
 

rgeist554

Distinguished
Oct 15, 2007
1,879
0
19,790
The 30 FPS thing is actually a misconception.

Article: http://amo.net/NT/02-21-01FPS.html

Read the bold for a few highlights. (The whole article is like 2 or 3 pages)

NVIDIA, a computer video card maker who recently purchased 3dfx, another computer video card maker, just finished a GPU (Graphics Processing Unit) for the XBOX from Microsoft. Increasing amounts of rendering capability and memory, as well as more transistors and instructions per second, equate to more frames per second in a computer video game or on computer displays in general. There is no motion blur, so the transition from frame to frame is not as smooth as in movies, that is, at 30 FPS. For example, NVIDIA/3dfx put out a demo that runs half the screen at 30 FPS and the other half at 60 FPS. The results? There is a definite difference between the two scenes, 60 FPS looking much better and smoother than the 30 FPS.

Even if you could put motion blur into games, it would be a waste. The human eye perceives information continuously; we do not perceive the world through frames. You could say we perceive the external visual world through streams, and only lose it when our eyes blink. In games, an implemented motion blur would cause the game to behave erratically; the programming wouldn't be as precise. An example would be playing a game like Unreal Tournament: if there was motion blur used, there would be problems calculating the exact position of an object (another player), so it would be really tough to hit something with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned, that is, the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel, each object is exactly where it should be in the set space and time.

...

This is where this article gets even longer, but read on, please. I will explain to you how the human eye can perceive much past the misconception of 30 FPS and well past 60 FPS, even surpassing 200 FPS.

Anyway, read it if you want. If not, don't try to flame me.
 


And I hate to break it to you, but you are just propagating a myth. I have spent many hours on the internet trying to find anything that supports what you just said and I can't, so if you have some info can you post a link?
This is not a new topic on these forums, as you may or may not know, but the truth of it is that there are many more subtleties involved in how the brain-to-optic-nerve relationship works.
It just isn't possible to measure the value. You can say that most people won't see the difference at X, Y, or Z, but if you take someone who has been gaming at an average of 70 FPS and put them in front of a monitor running at 30, trust me, they will know.
Mactronix
 

purplerat

Distinguished
Jul 18, 2006
1,519
0
19,810
I'm not going to argue about how many FPS the human eye can see. I agree that at 20, 40, or 50 FPS there are differences that are easy to see. But when you get to 60 FPS and above, especially when you're talking about displays that can't even show more than 60 FPS, I tend to think it becomes more of an ego thing for people to say "my rig does xxx FPS" or "I can tell the difference between xxx FPS and xxx FPS". Also, I believe it has a lot to do with constant frame rates. To me, Crysis looks a lot smoother at a steady 20 FPS than something fluctuating between 25-40 FPS. While I don't necessarily agree that 30 FPS is the limit, I would take the article posted above with a grain of salt. Just because somebody wrote it, that doesn't make it true, especially something that old with no real evidence to support it. Just read the section about motion blur, where the author makes it sound like such a thing would not only not have the desired effect in video games, but probably wouldn't even be possible (see Crysis).
 

cleeve

Illustrious
The human eye might be able to see 300 FPS, but when a monitor's refresh rate is 60 Hz, which works out to 60 FPS, what does it matter if your LCD can't update any faster than that?
 

rgeist554

Distinguished
Oct 15, 2007
1,879
0
19,790
That's because it is a simulated blur and not an actual motion blur (a blur caused by your eyes not focusing on a moving object). Trying to recreate an actual blur in a video game would be impossible without doing one of the following: 1) making the texture of the object blurry, or 2) simulating a blur by distorting the space around the object. All Crysis does is basically create a distorted area around the object.

I think the author of the article is basically trying to say that creating a true motion blur inside a video game would be impossible. You just can't take a full high-quality image, move it across a screen @ 60 FPS, and expect it to blur without some artificial tampering. All you have to do is focus your eyes on the moving object and you'll be able to make out details on it.
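
For what it's worth, the usual trick is exactly that kind of simulation: average a few copies of the scene along the object's screen-space motion vector. Here's a minimal sketch of the idea (illustrative only, not what Crysis actually does; it smears the whole frame instead of using a per-pixel velocity buffer):

```python
import numpy as np

def simulated_motion_blur(frame, velocity_px, num_samples=5):
    """Fake motion blur: average copies of the frame shifted along a
    screen-space velocity vector. frame is an HxWx3 uint8 array,
    velocity_px is (dx, dy) in pixels covered during the frame."""
    dx, dy = velocity_px
    acc = np.zeros(frame.shape, dtype=np.float32)
    for i in range(num_samples):
        t = i / (num_samples - 1)                 # 0..1 along the motion vector
        shift = (int(round(dy * t)), int(round(dx * t)))
        # np.roll wraps at the edges, which is fine for a toy example
        acc += np.roll(frame, shift, axis=(0, 1)).astype(np.float32)
    return (acc / num_samples).astype(frame.dtype)
```

Note that none of the averaged copies sit at one exact coordinate, which is basically the article's point about why hitting a "blurred" object would be a mess.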

But when you get to 60 FPS and above, especially when you're talking about displays that can't even show more than 60 FPS, I tend to think it becomes more of an ego thing for people to say "my rig does xxx FPS" or "I can tell the difference between xxx FPS and xxx FPS".
That's the point I was making above. You may be able to tell some small difference between the two IF your monitor can even refresh fast enough, but in most cases it comes down to physically seeing your FPS as an actual number and watching it drop. You don't like seeing it drop, so in turn you think it looks worse, or you get upset.
 

purplerat

Distinguished
Jul 18, 2006
1,519
0
19,810

Well, isn't everything in a video game simulated? What's the difference between making a 2D image appear to be 3D and simulating motion blur? Both are playing a trick on the human eye, which isn't nearly as hard as the AMO article would have you believe. Just go to YouTube and search for optical illusions. Yes, under the right circumstances the human eye can do some incredible things, but when pushing the limits at high frame rates (even displaying 30 images in less than one second is pretty fast) in an environment specifically designed to fool the eye (video games), I tend to believe, as you do, that most people are full of crap when they say they can see the difference between 60 and 70 FPS.