FPS, does it really matter?

madaniel

Distinguished
Dec 8, 2007
26
0
18,530
I apologize if this comes off as an obvious question, but I have done some research through the forums about the differences between video cards, specifically the whole ATI vs. Nvidia debate. While both seem to have their pros and cons and each seems to perform better in certain games, Nvidia seems to have the better "performance". ATI, on the other hand, seems to perform equally well in most games, yet the true benchmark comes down to FPS.

There is no argument that Nvidia does better under higher levels of AA and such, yet what I was wondering is whether the FPS difference really makes a difference. I realize that at lower frame rates there can be a clear, visible distinction between, say, 1 extra fps and 10 extra. Where does the line of visible performance get drawn?

For example, say an ATI card puts out 50 fps for a given game and an Nvidia card does 60. That is a 20% difference. While on paper that may draw a lot of oohs and ahhs, is it really a visually noticeable difference? I realize this would be different if it were, say, 10 FPS versus 15 FPS; there, I believe, it would be pretty significant. Yet as the FPS goes higher, does it reach a point where the game is simply smooth, or does the extra fps really add that much more? Perhaps there is a range of fps that can be noticed.
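One way to look at it is per-frame time rather than percentage; here's a quick sketch (just arithmetic on the numbers above, nothing card-specific):

```python
# Convert fps into the time each frame is on screen; the fps pairs are the
# ones from the example above.
def frame_time_ms(fps):
    """Milliseconds the viewer waits between frames."""
    return 1000.0 / fps

for low, high in [(50, 60), (10, 15)]:
    gap = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: each frame arrives {gap:.1f} ms sooner")

# 50 -> 60 fps: each frame arrives 3.3 ms sooner
# 10 -> 15 fps: each frame arrives 33.3 ms sooner
```

So the "same" kind of jump in fps is worth ten times more waiting time at the low end than at the high end.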

I only ask because I do not have the capability to test this and see. It just seems to me that 10-15 fps at the higher levels, though on paper a large percentage change, wouldn't really be noticed. Of course this does not take into account the performance hits of resolution or added settings such as AA, or price for that matter.

Just my thoughts.
 

mrbobc

Distinguished
Nov 3, 2007
59
0
18,630
Basically I look for 30 fps minimum in any bench. As long as it's over that, there really isn't much difference to me!

If you set up your card accordingly there won't be a problem!
 

Asian PingPong

Distinguished
Jan 21, 2008
131
0
18,680
For most people 30 fps looks smooth, and anything higher than that would not be noticed. Assume we have two cards: one is able to pump out 45 fps, while the other is able to pump out 60 in the same situation. Since there is no point in running at such a high fps, it makes sense to turn up the detail settings, resolution and filters. Now assume the fps is around 30 vs 40. The card pumping out 30 fps will most likely experience noticeable drops into the low twenties, while the other card will remain smooth and most likely not drop below 30.
 

teh_boxzor

Distinguished
Aug 27, 2007
699
0
18,980
30 fps is good for me for RTS games. For FPS games I try to get at least 40-50 to compensate for heavier scenes later in the game, or for multiplayer.
 

madaniel

Distinguished
Dec 8, 2007
26
0
18,530
So it seems that so far 30+ is the aim for a lot of people. For cards at equal in-game settings, does anything above, say, 50 simply start to become unnoticeable, making it merely bragging rights?
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780
Movies run at 23 fps, but still look smooth because there's the blur effect. For crisp computer display, the limit is around 35fps.
 
Generally speaking, 30fps is a bare minimum for good, fluid gameplay. A constant 60fps is a good baseline for really nice gameplay.

What you really need to look for is not maximum fps, but a combination of minimum fps and average fps. For example, say you average 30fps: excellent! However, if your fps is constantly fluctuating between 15 and 45fps, your overall experience will be much different from what you would expect.
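To put numbers on that, here's a minimal sketch with made-up frame-rate traces (not from any real benchmark): both average 30fps, but only one of them would feel smooth.

```python
# Two hypothetical frame-rate traces with the same average but very
# different minimums.
steady      = [30, 31, 29, 30, 30, 31, 29, 30]   # hovers around 30 fps
fluctuating = [45, 15, 44, 16, 45, 15, 44, 16]   # same average, wild swings

for name, trace in [("steady", steady), ("fluctuating", fluctuating)]:
    avg = sum(trace) / len(trace)
    print(f"{name:12s} avg = {avg:.1f} fps   min = {min(trace)} fps")

# steady       avg = 30.0 fps   min = 29 fps
# fluctuating  avg = 30.0 fps   min = 15 fps
```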

Again, it's not so much for "bragging rights" (well, it might be for some people :lol: ) as it is for just-in-case scenarios. I play a Quake III based game a lot, which is no problem for my 7800GT, as it easily renders over 150fps. You might say the 7800GT is overkill, but I keep my fps capped at 75 and it stays there, even if someone spams the spawn command.

@dagger: Movies (FILM) are 24fps progressive and CRTs can certainly display more than 35fps, as they refresh ~60-75 times per second.
 

ivanski

Distinguished
Jun 27, 2006
104
0
18,680
I assume people think as I do: what if a card can do 60 fps on a particular game with certain settings and another card does 40? There's no real visual difference there. Then a new game comes out that demands more graphics power. The second card now becomes unplayable at the same settings, running at 20fps while the other card is at 40. The cheaper card must lower settings to play. That's why we buy the biggest and baddest we can afford.
 

rayzor

Distinguished
Apr 24, 2004
353
0
18,790
This is more or less the whole point of buying the best card you can. Yeah, my friends ragged on me for buying an 8800GTX a while back, saying they could play many games about as well as I could... but they sure were sorry when Crysis came out!

Anyway, you are right in your original comment. Once you climb consistently out of the 30s (average fps), it gets very difficult to notice differences in frame rate.
 

stoner133

Distinguished
Mar 11, 2008
583
0
18,990
The experts say 25fps is fluid if maintained as a constant, meaning that at a constant rate anything above 25 isn't really noticed by the human eye. But the big thing is a constant frame rate. What gets noticed is a frame rate that jumps around, and the more detail drawn from frame to frame, the more those changes stand out.
 
Film/DVD runs at 24-30 fps. This is due to the motion blur effect that replicates the way the human eye sees.

RTS games are low motion, so 30 is OK for most of them.

For high-action and FPS games, I cannot see the difference over 60. Since games do not blur like a movie, your eye needs more frames to make motion look smooth. How many frames varies from person to person; for most it's 30-60.

Why a card with higher fps?
1. Keep the game more fluid (high minimum frame rate)
2. To be able to play newer games without needing a new card.
3. Allow you to turn on extra eye candy and AA (good for RTS; FPS games are so fast paced you don't have time to see it or pay attention to it anyway :))

With motion blur on high action, the human eye does the rest of making it look smooth.
[attached image: motion blur example]
 

teh_boxzor

Distinguished
Aug 27, 2007
699
0
18,980
I never knew about the motion blur in movies. I was at the movies today and wondered why the camera seemed to have such a soft focus; now I understand it's motion blur and not the camera.
 
Most games with motion blur make me sick (they blur even at slow speeds), so I turn it off. Movies are still all good.

Crysis is about the closest to movies so far.

teh_boxzor, another thing that may make it look soft is just the large size of a movie screen. The flicker is a pain, but after about 20 minutes I adjust to it. Movie motion blur should not be noticeable to the human eye, since it literally mimics the way the human eye sees. We perceive continuous motion rather than individual frames, and our brain does the rest (a simplified and easy way to say it).
 

leo2kp

Distinguished
Another thing to consider is your monitor's refresh rate and ActiveSync. ActiveSync will sync the frame rate to an FPS that is an even fraction of your monitor's refresh rate. FPS will be "locked in" to a certain rate unless the card can't produce those frames, and then it will drop to the next appropriate FPS. ActiveSync is good for virtually eliminating the page-tearing effect that can occur when the FPS is different from the refresh rate of the monitor. It ends up being a much smoother experience.

For example, most LCDs have a refresh of 60Hz (and it shouldn't be changed or it can damage the monitor). Without ActiveSync, you will experience page tearing with quick pans back and forth in games such as first-person shooters or RPGs like Oblivion. It can really affect the realism and give you a bad experience. With ActiveSync turned on and a 60Hz refresh, the game will be locked in to 60fps, or for less powerful cards, 30fps or lower (whatever the next step down is). So in some scenes you will have absolutely fluid graphics, and in the next scene it can drop dramatically. It's not that your card can't handle more than 30fps, but if it goes anywhere in between 30 and 60 (or higher than 60), you'll get page tearing. ActiveSync simply eliminates page tearing by locking your FPS and "syncing" it with your refresh rate.
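Here's a rough sketch of that locking behaviour, assuming a plain double-buffered setup where a late frame waits for the next refresh (the render times below are made up, and real games vary frame to frame):

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms per refresh

def vsynced_fps(render_ms):
    # A frame that misses a refresh waits for the next vertical blank,
    # so the effective rate snaps to 60, 30, 20, 15...
    intervals = math.ceil(render_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / intervals

for render_ms in [10, 16, 18, 25, 40]:
    print(f"render {render_ms:2d} ms -> uncapped {1000 / render_ms:5.1f} fps, "
          f"synced {vsynced_fps(render_ms):4.1f} fps")

# render 10 ms -> uncapped 100.0 fps, synced 60.0 fps
# render 16 ms -> uncapped  62.5 fps, synced 60.0 fps
# render 18 ms -> uncapped  55.6 fps, synced 30.0 fps
# render 25 ms -> uncapped  40.0 fps, synced 30.0 fps
# render 40 ms -> uncapped  25.0 fps, synced 20.0 fps
```

Notice how a card that only just misses 16.7 ms per frame falls all the way to 30fps, which is exactly the "drop to the next appropriate FPS" effect described above.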

Now back to my point, lol. Benchmarks are generally done with ActiveSync shut off to show the maximum FPS the card can do when it is not limited by the monitor's refresh rate. But those runs are tearing like MAD because of it, and you most likely wouldn't want to play the game that way even if it got 300fps without ActiveSync.

So to answer your question about "does FPS really matter?", I would say YES it does, for the very fact that a card with better results will be less likely to drop the FPS to the next step down relative to your monitor's refresh rate, thus giving you consistently smooth graphics no matter what's going on. If you buy a card that gets 55fps because you didn't want to pay the extra $30 for the card that gets 60fps, and you want ActiveSync enabled (page tearing takes a lot out of a game, so I always have it on), then when the 55fps card misses just slightly while you're battling 13.5 goblins and it has to drop down to 30fps or lower, you're going to wish you had the slightly faster card to keep it at 60.

Those are just my 2gp ^.^
 

krazyk12

Distinguished
Oct 26, 2007
87
0
18,630
Depends on the situation. Single player, a solid 30 is fine. Multiplayer, I like as much as I can get, and I don't like anything below 60.

But anything more than 100 is basically pure bragging rights.
 

To help with Vertical Sync (ActiveSync is a Microsoft program :p), make sure you run triple buffering (it uses more memory). From what I have seen, VSync always fully removes page-tearing effects, but in some games it causes input lag or game lag when the game is poorly programmed.
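A rough sketch of why the third buffer helps, assuming a constant render time per frame and a 60Hz display (made-up numbers; real frame times bounce around):

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ

def double_buffered_fps(render_ms):
    # With only two buffers, the GPU stalls until the next refresh,
    # so the rate snaps to 60/n.
    return REFRESH_HZ / math.ceil(render_ms / INTERVAL_MS)

def triple_buffered_fps(render_ms):
    # With a third buffer the GPU keeps rendering, so you get roughly
    # the card's own rate, capped at the refresh rate.
    return min(REFRESH_HZ, 1000.0 / render_ms)

for render_ms in [18, 22, 25]:
    print(f"render {render_ms} ms: double-buffered {double_buffered_fps(render_ms):.0f} fps, "
          f"triple-buffered {triple_buffered_fps(render_ms):.1f} fps")

# render 18 ms: double-buffered 30 fps, triple-buffered 55.6 fps
# render 22 ms: double-buffered 30 fps, triple-buffered 45.5 fps
# render 25 ms: double-buffered 30 fps, triple-buffered 40.0 fps
```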
 

nkarasch

Distinguished
Nov 13, 2007
287
0
18,780
I think the reason Crysis looks smooth at a little under 30 is motion blur. Most games at those framerates are atrocious, though.
 
There isn't "motion blur" added into FILM. The effect comes from the fact that movies are shot at... you guessed it... 24fps. It's like the time-lapse photos you see that have an exposure time of a couple of hours, but for film.
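To put a number on that, here's a tiny sketch; the 180-degree shutter is an assumption (the classic film default), not something from the thread:

```python
# Per-frame exposure time for film shot at 24 fps with a 180-degree shutter.
FPS = 24
SHUTTER_ANGLE_DEG = 180  # assumed classic film shutter

frame_interval_s = 1.0 / FPS
exposure_s = frame_interval_s * (SHUTTER_ANGLE_DEG / 360.0)
print(f"each frame is exposed for ~1/{round(1 / exposure_s)} s")  # ~1/48 s
```

Anything that moves during that 1/48 of a second smears across the frame, which is where the blur comes from.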

And to the noobs who say the human eye can't see more than 25fps or whatever... stay the heck out of the discussion :|
 

basketcase

Distinguished
Jun 1, 2006
561
0
18,980
The human eye can definitely see more than 25 fps. But most people can't tell the difference between a constant 45 FPS and a constant 70 FPS. The key word is CONSTANT, though. If I am sitting at a constant 30 FPS, I am happier than with an FPS that jumps all around from 20 to 60. What we do notice is the changes in FPS, and that is what is annoying to the eye.
 

focker

Distinguished
Dec 2, 2007
35
0
18,530
I play UT3 at an average of 170 fps and still get annoyed because the processor can't keep up when a lot of people are around.