
FPS, does it really matter?

March 31, 2008 12:14:02 AM

I apologize if this comes off as an obvious question, but I have done some research through the forums about the differences between video cards, specifically the whole ATI vs. Nvidia debate. Both seem to have their pros and cons, and both seem to perform better depending on the game, but Nvidia seems to have the better "performance". ATI, on the other hand, seems to perform equally well in most games, yet the true benchmark comes down to FPS.

There is no argument that Nvidia does better at higher levels of AA and such, but what I was wondering is: does the FPS difference really make a difference? I realize that at lower frame rates there can be a clear, visible distinction between, say, 1 extra fps and 10 extra. Where is the line of visible performance drawn?

For example, say an ATI card puts out 50 fps in a given game and an Nvidia card does 60. That is a 20% difference. On paper that may draw a lot of oohs and ahhs, but is it really a visually noticeable difference? I realize this would be different if it were, say, 10 FPS vs. 15 FPS; there I believe the difference would be pretty significant. Yet as the FPS goes higher, does it reach a point where the game is simply smooth, or does the extra fps really add that much more? Perhaps there is a range of fps that can actually be noticed.

I only ask because I do not have the capability to test this and see for myself. It just seems to me that an extra 10-15 fps at the higher end, though on paper it may be a large percentage change, wouldn't really be noticed. Of course this does not take into account the performance hits of resolution or added features such as AA, or price for that matter.
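
One way to see why the same percentage gap feels different at the low end is to convert fps to frame times. A minimal Python sketch using the numbers above (nothing here is measured, it just restates the arithmetic):

```python
# Convert the example frame rates above into per-frame times.
for fps in (50, 60, 10, 15):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")

# 50 vs. 60 fps saves only ~3.3 ms per frame, while 10 vs. 15 fps saves ~33 ms,
# which is why a similar-looking percentage gap is obvious at the low end
# and subtle at the high end.
```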

Just my thoughts.


March 31, 2008 12:36:22 AM

Basically I look for a 30 fps minimum in any bench. As long as it's over that, there really isn't much difference to me!

If you set up your card accordingly there won't be a problem!!
March 31, 2008 1:17:49 AM

For most people 30 fps looks smooth, and anything higher than that would not be noticed. Assume we have two cards: one is able to pump out 45 fps, while the other is able to pump out 60 in the same situation. Since there is no point in running such a high fps, it makes sense to turn up the detail settings, resolution and filters. Now assume the fps is around 30 vs. 40. The card pumping out 30 fps will most likely experience noticeable drops into the low twenties, while the other card will remain smooth and most likely not drop below 30.
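
To put rough numbers on that reasoning, here is a minimal Python sketch; the 25% worst-case slowdown is an assumption picked purely for illustration, not a measured figure:

```python
# Back-of-the-envelope sketch (assumed numbers, not from any benchmark):
# after turning up the settings, one card averages 40 fps and the other 30 fps,
# and the heaviest scenes are assumed to run about 25% slower than the average.
heavy_scene_penalty = 0.25  # hypothetical worst-case slowdown

for avg_fps in (40, 30):
    worst_case = avg_fps * (1 - heavy_scene_penalty)
    print(f"card averaging {avg_fps} fps -> roughly {worst_case:.0f} fps in the heaviest scenes")

# The 40 fps card stays around 30 fps at its worst, while the 30 fps card
# dips into the low twenties, which matches the behaviour described above.
```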
March 31, 2008 1:35:57 AM

I can't tell the difference after about 45 fps. 30 in a first-person shooter doesn't seem too smooth to me when I'm looking around really quickly.
March 31, 2008 1:49:43 AM

30 fps is good for me for RTS. For FPS games I try to get at least 40-50 to compensate for heavier scenes later in the game, or for multiplayer.
March 31, 2008 1:54:04 AM

So it seems that so far 30+ is the aim for a lot of people. Say for cards at equal in-game settings, anything above 50 simply starts to become unnoticeable, and it's merely bragging rights?
March 31, 2008 2:05:34 AM

madaniel said:
So it seems that so far 30+ is the aim for a lot of people. Say for cards at equal in-game settings, anything above 50 simply starts to become unnoticeable, and it's merely bragging rights?



Movies run at 24 fps but still look smooth because of the blur effect. For a crisp computer display, the limit is around 35 fps.
March 31, 2008 2:14:30 AM

Generally speaking, 30 fps is a bare minimum for good, fluid gameplay. A constant 60 fps is a good baseline for really nice gameplay.

What you really need to look for is not maximum fps, but a combination of both minimum fps and average fps. For example, say you average 30 fps: excellent! However, if your fps is constantly fluctuating between 15 and 45 fps, your overall experience will be much different than you would expect.
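
As a rough illustration of that point, here is a minimal Python sketch with made-up frame times; the "1% low" metric is just a common convention for summarising the worst frames, not something from any card mentioned here:

```python
# Made-up frame times: 90 quick frames (~60 fps) plus 10 heavy frames (~15 fps).
frame_times_ms = [16.7] * 90 + [66.7] * 10

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # frames / total seconds
min_fps = min(fps_per_frame)

# "1% low": average fps over the slowest 1% of frames.
slowest = sorted(frame_times_ms, reverse=True)[: max(1, len(frame_times_ms) // 100)]
one_percent_low = 1000.0 / (sum(slowest) / len(slowest))

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps, 1% low: {one_percent_low:.1f} fps")
# Prints an average near 46 fps even though the worst frames run at ~15 fps,
# which is exactly the kind of gap a single average hides.
```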

Again, it's not so much for "bragging rights" (well, it might be for some people :lol: ) as it is for just-in-case scenarios. I play a Quake III based game a lot, which is no problem for my 7800GT, as it easily renders over 150 fps. You might say the 7800GT is overkill... but I keep my fps capped at 75 and it stays there, even if someone spams the spawn command.

@dagger: Movies (film) are 24 fps progressive, and CRTs can certainly display more than 35 fps; they refresh ~60-75 times per second.
March 31, 2008 2:23:17 AM

I assume people think as I do... what if one card can do 60 fps in a particular game at certain settings and another card does 40? There's no visual difference there. Then a new game comes out that demands more graphics power. The second card now becomes unplayable at the same settings, running at 20 fps, with the other card at 40. The cheaper card must lower its settings to play. That's why we buy the biggest and baddest we can afford.
March 31, 2008 2:29:00 AM

I think the whole point is:

Higher FPS overall = highest minimum fps = smoothest gameplay overall.
March 31, 2008 2:30:24 AM

ivanski said:
I assume people think as I do... what if one card can do 60 fps in a particular game at certain settings and another card does 40? There's no visual difference there. Then a new game comes out that demands more graphics power. The second card now becomes unplayable at the same settings, running at 20 fps, with the other card at 40. The cheaper card must lower its settings to play. That's why we buy the biggest and baddest we can afford.



This is more or less the whole point of buying the best card you can. Yeah, my friends ragged on me for buying an 8800GTX a while back, saying they could play many games about as well as I could... but they sure were sorry when Crysis came out!

Anyway... you are right in your original comment. Once your average fps climbs consistently out of the 30s, it gets very difficult to notice differences in frame rate.
March 31, 2008 2:39:40 AM

The experts say 25 fps is fluid if maintained as a constant, and that at a constant rate anything above 25 isn't really noticed by the human eye. But the big thing is a constant frame rate. What gets noticed is the frame rate jumping around, and the more detail drawn from frame to frame, the more those changes stand out.
March 31, 2008 2:47:13 AM

24-30 fps for film/DVD. This is due to the motion blur effect, which replicates the way the human eye sees motion.

RTS games are low motion; 30 is OK for most of them.

For high-action and FPS games, over 60 I cannot see the difference. Since games do not blur like a movie, your eye needs more frames to make the motion look smooth. How many frames varies from person to person. For most it's 30-60.

Why a card with higher fps?
1. To keep the game more fluid (higher minimum frame rate).
2. To be able to play newer games without needing a new card.
3. To allow you to turn on extra eye candy and AA (good for RTS... FPS games are so fast paced you don't have time to see it [or pay attention to it] anyway :) )

Motion blur handles the high action; the human eye does the rest of making it look smooth.
March 31, 2008 3:00:58 AM

I never knew about the motion blur in movies. I was at the movies today and I wondered why the camera operated at such a soft focus; now I understand it's motion blur and not the camera.
March 31, 2008 3:24:38 AM

Some games have motion blur, though...
March 31, 2008 3:48:05 AM

Most games with motion blur make me sick (they blur even at slow speeds)... so I turn it off. Movies are still all good.

Crysis is about the closest to movies so far.

teh_boxzor, another thing that may make it look soft is just the sheer size of a movie screen. The flicker is a pain, but after about 20 minutes I adjust to it. Movie motion blur should not be noticeable to the human eye since it literally mimics the way the human eye sees: we perceive continuous motion rather than discrete full frames, and our brain does the rest (a simplified, easy way to say it).
March 31, 2008 4:06:28 AM

Another thing to consider is your monitor's refresh rate and ActiveSync. ActiveSync will sync the frame rate to an even fraction of your monitor's refresh rate. The FPS will be "locked in" at a certain value unless the card can't produce those frames, and then it will drop to the next step down. ActiveSync is good for virtually eliminating the page-tearing effect that can occur when the FPS differs from the refresh rate of the monitor, and it ends up being a much smoother experience. For example, most LCDs have a refresh rate of 60Hz. Without ActiveSync, you will see page tearing on quick pans back and forth in games such as first-person shooters or RPGs like Oblivion. It can really affect the realism and give you a bad experience. With ActiveSync turned on at a 60Hz refresh rate, the game will be locked in at 60fps, or for less powerful cards, 30fps or lower (whatever the next step down is). So in some scenes you will have absolutely fluid graphics, and in the next scene it can drop dramatically. It's not that your card can't handle more than 30fps, but if it lands anywhere between 30 and 60 (or higher than 60) you'd get page tearing. ActiveSync simply eliminates page tearing by locking your FPS and "syncing" it with your refresh rate.

Now, back to my point, lol. Benchmarks are generally done with ActiveSync shut off to show the maximum FPS the card can do without being limited by the monitor's refresh rate. But those benchmark runs are tearing like MAD because of it, and you most likely wouldn't want to play the game that way even if it got 300fps.

So to answer your question of "does FPS really matter?", I would say YES, for the very fact that a card with better results will be less likely to drop the FPS to the next step down of your monitor's refresh rate, thus giving you consistently smooth graphics no matter what's going on. If you buy a card that gets 55fps because you didn't want to pay the extra $30 for the card that gets 60fps, and you want ActiveSync enabled (page tearing takes a lot out of a game, so I always have it on), and the 55fps card misses just slightly when you're battling 13.5 goblins and has to drop down to 30fps or lower, you're going to wish you had the slightly faster card to keep it at 60.
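
For what it's worth, the step-down behaviour described above (with plain double-buffered sync and no triple buffering) can be sketched in a few lines of Python; the render rates fed in below are just hypothetical examples:

```python
import math

def synced_fps(render_fps, refresh_hz=60):
    """Displayed frame rate with double-buffered sync to the refresh rate.

    Each frame has to wait for the next refresh, so every frame occupies a whole
    number of refresh intervals: 60, 30, 20, 15... fps on a 60 Hz panel.
    """
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    intervals = math.ceil(refresh_hz / render_fps)  # refresh ticks per frame
    return refresh_hz / intervals

for fps in (120, 60, 55, 45, 31, 25):
    print(f"card renders {fps:>3} fps -> displayed at {synced_fps(fps):.0f} fps")
# A card that renders 55 fps ends up displayed at 30 fps, which is the
# "pay the extra $30 for the 60 fps card" point made above.
```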

Those are just my 2gp ^.^
March 31, 2008 4:21:50 AM

Average frame rates? I like somewhere around 50+ fps.

Minimum frame rates? I like 25 fps, but 30 would be better.
March 31, 2008 4:32:26 AM

Depends on the situation. Single player, a solid 30 is fine. Multiplayer, I like as much as I can get, and I don't like anything below 60.

But anything more than 100 is basically for pure bragging rights.
March 31, 2008 5:08:51 AM

leo2kp said:
Another thing to consider is your monitor's refresh rate and ActiveSync. [...] ActiveSync simply eliminates page tearing by locking your FPS and "syncing" it with your refresh rate.

To help with Vertical Sync (ActiveSync is a Microsoft program :p ), make sure you run triple buffering (it uses more memory). From what I have seen, Vsync always fully removes all page-tearing effects, but in some games it causes input lag or game lag when the game is poorly programmed.
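
As a companion to the sketch above, here is a simplified model of what triple buffering buys you, assuming the extra buffer lets the card keep rendering instead of waiting:

```python
# Simplified model: with triple buffering the GPU keeps drawing into a spare buffer,
# so the displayed rate is roughly the render rate capped at the refresh rate,
# instead of snapping down to refresh/2, refresh/3, ...
def triple_buffered_fps(render_fps, refresh_hz=60):
    return min(render_fps, refresh_hz)

for fps in (120, 55, 45, 25):
    print(f"card renders {fps:>3} fps -> displayed at about {triple_buffered_fps(fps)} fps")
```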
March 31, 2008 5:14:41 AM

I think the reason Crysis looks smooth at a little under 30 fps is because of motion blur. Most games at those frame rates are atrocious, though.
March 31, 2008 8:33:47 PM

There isnt "motion blur" added into FILM. The effect comes from the fact that movies are shot at... you guessed it... 24fps. Its like the time lapse photos that you see that have a exposure time of a couple hours... but for film.

And to the noobs who say the human eye can't see more than 25 fps or whatever... stay the heck out of the discussion :|
March 31, 2008 8:51:36 PM

The human eye can definitely see more than 25 fps. But most people can't tell the difference between a constant 45 FPS and a constant 70 FPS. The key word is CONSTANT, though. If I am sitting at a constant 30 FPS, I am happier than with an FPS that jumps all around from 20 to 60. Because what we do notice is the changes in FPS, and that is what is annoying to the eye.
March 31, 2008 9:26:20 PM

I play UT3 at an average of 170 fps and still get annoyed because the processor can't keep up when a lot of people are around.
March 31, 2008 9:34:15 PM

Yup, the jumps in fps I get in DoD:S really tick me off. I go from 90 to 50 fps sometimes and it makes the game really bad.
March 31, 2008 9:40:32 PM

I doubt any of you have seen above 60 fps on anything (some maybe, but doubtful), because your refresh rate is most likely 60, which means your monitor is only displaying things 60 times a second.
April 1, 2008 12:16:31 AM

skittle said:
There isnt "motion blur" added into FILM. The effect comes from the fact that movies are shot at... you guessed it... 24fps. Its like the time lapse photos that you see that have a exposure time of a couple hours... but for film.

and to the noobs that say the human eye cant see more than 25fps or whatever... stay the heck out of the discussion :|

I never said it was added (or at least I did not want to make it sound that way)... it's just the way film works... but that blur DOES help it look smooth... so :p

I guess I'd better make sure I describe it as a 1/30th (or 1/24th) of a second exposure per frame next time...
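
To put rough numbers on that per-frame exposure, here is a purely illustrative Python sketch; the object speed and the half-frame (180-degree) shutter are assumptions, not figures from any actual film:

```python
# How far a fast-moving object smears across one exposed film frame
# versus one crisply rendered game frame (illustrative numbers only).
object_speed_px_per_s = 2000      # hypothetical: something crossing the screen quickly

film_fps = 24
shutter_fraction = 0.5            # classic 180-degree shutter: each frame exposed ~1/48 s
film_exposure_s = shutter_fraction / film_fps
film_blur_px = object_speed_px_per_s * film_exposure_s  # ~42 px of smear per frame

game_blur_px = 0.0                # a game frame is rendered at a single instant: no smear

print(f"film frame smear: {film_blur_px:.0f} px; game frame smear: {game_blur_px:.0f} px")
# That built-in smear is why 24 fps film reads as smooth while a game at 24 fps stutters.
```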
April 1, 2008 12:35:12 AM

I actually had my eyes tested and I can notice the difference on a consistent basis up to 45-50 fps. The doctor who performed the test said the highest result she ever saw was 70 fps. I actually find that a game just "feels" better if I can keep the fps at 60+, even though I can't really notice any choppiness.

How high the FPS needs to be depends on the individual, not the national average, so find your own sweet spot and stick with it.
April 1, 2008 12:48:34 AM

Ananan said:
I think the whole point is:

Higher FPS overall = highest minimum fps = smoothest gameplay overall.


Exactly.
April 1, 2008 1:31:01 AM

navvara said:
I actually had my eyes tested and I can notice the difference on a consistent basis up to 45-50 fps. The doctor who performed the test said the highest result she ever saw was 70 fps. I actually find that a game just "feels" better if I can keep the fps at 60+, even though I can't really notice any choppiness.

How high the FPS needs to be depends on the individual, not the national average, so find your own sweet spot and stick with it.

How did they do the test? Just curious...
April 1, 2008 2:36:15 AM

A black square moving across a white background at constant speed. For each test run you're supposed to say whether or not the image looked blurred. When you see the image as no longer blurred, that's your mark! Of course you could cheat, but it's a freaking eye exam, why would you?

I got to 45-50 with both eyes, 45-50 with my left eye, and below 20 with my right eye, which is pretty bad.
April 1, 2008 4:31:50 AM

ahh cool.....
April 1, 2008 8:51:40 AM

klobnitrones said:
I doubt any of you have seen above 60 fps on anything (some maybe, but doubtful), because your refresh rate is most likely 60, which means your monitor is only displaying things 60 times a second.


You doubt wrongly; my monitor is capable of refreshing at up to 130 times a second, although it's 87 times a second at my more commonly used resolutions... of course I use dinosaur high-end CRTs rather than more slimline LCDs. Personally, I find CRT image quality far better than LCD panels.
April 1, 2008 12:48:57 PM

30 fps looks smooth to me, but I can feel "lag" in any first-person shooter under 60 fps, so for me I need 60 fps or higher to play a game properly.

What you have to remember is that benchmarks don't always show the AVERAGE fps, which IMO is more important than the high/low figures. I don't mind a slowdown now and then, and I certainly don't notice the difference between 75 fps and 100 fps (unless it's the original UT, which for some reason I can only play well at 300 fps or higher).
April 1, 2008 4:46:25 PM

The most important of all FPS numbers is the minimum. A great example would be a computer I was reviving so a friend could play WoW (I play EVE). It was an XP 2100+ with 1GB of PC2700 and an ATI 9000 Pro 128MB (yes, I know). But check this out: for fun I was testing all the video options to get the best visuals without a slideshow. I had it hooked up to my monitor and decided to mess around and see what this old hardware could do. I had everything maxed with the exception of viewing distance, and slammed the resolution to 1920x1200. Outside it was a beautiful slideshow at around 11 FPS (which I thought was impressive considering the age of the parts). Then as soon as I walked inside a cave the frame rate soared to around 45-50, which was totally playable. It goes to show that although you can get good FPS in one part of a game, it can be a completely different story in another; in other words, minimum FPS is the one that matters most. And most benchmarks don't mention it, which I believe they should.
April 1, 2008 5:55:45 PM

bildo123 said:
The most important of all FPS numbers is the minimum.

I agree; a high minimum frame rate is what I like/need, which is why I travel the SLI road.
April 1, 2008 10:25:25 PM

I usually play stuff at 20 fps...
April 1, 2008 11:13:43 PM

If you like a pot of coffee in the morning and a noon-time headache, the concept of fps is for you. There is way more than fps going on... and I'm not even going to begin to babble about it.

I suppose a comparison of what runs well and what doesn't needs more than fps; I will leave it at that. Learn it and forget it. :whistle:
April 2, 2008 4:17:56 AM

The moral of the story? If you want Crysis to be lifelike, go enlist.

If you don't want to enlist, stop griping.
April 2, 2008 4:22:19 AM

teh_boxzor said:
The moral of the story? If you want Crysis to be lifelike, go enlist.

If you don't want to enlist, stop griping.

Why am I thinking "boom, headshot"...
April 2, 2008 9:58:01 AM

Not sure about you guys, but I can certainly tell the difference between 45 and 90 FPS (taking those two as extremes), yet it's so minor I don't even bother. To be honest, I'd go for the best performance for your money, and I believe ATI does a good job of shipping out those sorts of cards. However, go with whatever suits you best :)
April 2, 2008 11:54:30 AM

Phew... and let's not even get started on interlaced TC and IVTC (inverse telecine) content... then you would all be really confused!

For all practical purposes, you cannot directly compare film footage to computer-rendered frames.
April 2, 2008 12:35:30 PM

It's all in the eye of the beholder. Everyone sees things differently. Get as much video card as your budget will allow.