
Newbie Question about FPS.

Tags:
  • Graphics Cards
  • Lag
  • Games
  • FPS
  • Graphics
May 23, 2006 9:45:50 PM

At how many FPS is a game considered playable? Like, at what point do you not really notice any frame lag?


May 23, 2006 9:57:49 PM

30 for first-person shooters and 24 for RTS games; movies, for comparison, are shot at 24 FPS.
May 23, 2006 10:24:11 PM

Quote:
30 for first-person shooters and 24 for RTS games; movies, for comparison, are shot at 24 FPS.


That's only if you are playing a single-player game. Online, I feel that 60 is more the mark to shoot for (get as close to it as you can, or above).

All else being equal, if you are running 30 and your opponent is running 60, then he has twice the frames to frag you in. Online shooters in particular will see people with slower rigs getting frustrated because, between the few frames they are rendering, they got smoked and don't know why. Some of that is network lag, but if you are on the same connection (like on a LAN) you can see that the player on the slower machine has a disadvantage.

Otherwise, play at what looks smooth to you. Don't worry about what the numbers say; if you can enjoy the game, then do it. If it looks choppy or jumpy, then the FPS is too low. ;)
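To put rough numbers on the frame-gap point above (a toy calculation of my own, not from the post):

```python
# Time between displayed frames at each rate.
for fps in (30, 60):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms between frames")
# 30 FPS -> 33.3 ms between frames
# 60 FPS -> 16.7 ms between frames; the faster rig refreshes its view
# of the fight twice as often.
```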
May 23, 2006 10:37:14 PM

Quote:
At how many FPS is a game considered playable? Like, at what point do you not really notice any frame lag?


Excellent question. Given that I've found frame rates above 30 or 40 extremely usable, I find myself wondering what the point is of striving for numbers up near 100 or more.

My objective is to get all the eye-candy turned on (because I enjoy it) and still have a usable frame rate. I don't buy hardware just to get a frame rate up in the stratosphere.
May 23, 2006 10:40:31 PM

Heck yeah, I turn my settings down to medium in order to get 50 on average.
May 23, 2006 11:10:28 PM

The problem is the framerate drops in games; a constant framerate of 30 FPS wouldn't be as bad as having 60 and dropping to 30 a lot.
May 24, 2006 2:56:12 AM

How are you supposed to shoot for 60 FPS in a game like FEAR without dropping a large sum of money on a graphics card?
May 24, 2006 3:08:06 AM

Quote:
How are you supposed to shoot for 60 FPS in a game like FEAR without dropping a large sum of money on a graphics card?


Well, I was prepared to drop some large money on a pair of 7800GTXs in SLI. But that is my hobby, and it's where I put my resources. Others go on vacations to Disneyland or on cruises ... buy motorcycles ... whatever. Still, computing is so much cheaper these days than when I started it isn't funny ... like when the IBM PC was the first and only widely available desktop at $7,000, and that was with a huge 10 MB HD. :p

We'd sit around for a couple of minutes waiting for some simple task to complete, thinking about all that "power" in that box. WOW ... big stuff. Seems funny now.
May 24, 2006 3:15:36 AM

WOW (and not the MMO). It's true that if it is something you are passionate about, it's worth throwing a large sum of cash at, no matter how unreasonable. I am in the process of building my own computer. I think at first I am going to build something within an $800 budget and then slowly upgrade from there. I have always wanted to build my own computer; I have even dreamed about finally finishing a computer and playing Half-Life 2 with all my friends. I am graduating from high school, and I am going to get a restaurant job (I hear they're good for getting fast cash) while I attend a community college. I really want to understand the technicalities of the different types of graphics cards and so on; for instance, I will see 5 different versions of the same graphics card and not know which to pick. My current computer is like 4 yrs. old. Here, let me give you the stats:

Celeron 1.4 GHz
128 MB RAM
this is a shocker: 11 MB dynamically allocated as video memory :roll: (yeah, that sucks; I can barely play RuneScape)
40 GB HD

So you could say that I have had a secret obsession with computers for the last 4 years but never had the means to finance it. I hope that my "dreams" will be realized in the very near future.
May 24, 2006 3:47:21 AM

Hmm, that 100FPS page wasn't that helpful.

Try my dissertation on the subject. It is a bit more geared towards gaming framerate: http://planetdescent.com/d3help/framerate.shtml

The page is quite old, but the information is still valid. It was written about the game Descent 3 (The "real" D3!) but it applies to all FPS games.

RTS games don't have nearly so high a requirement for frame rates, as the generally slow pace of movement doesn't require it.

Personally, I find any framerate under 60 in an FPS game to be utterly unacceptable. 30 is just pathetic, and makes me cry like a baby within 2 minutes.

100 is preferable, but anything above 85 is fine, provided you have an 85 Hz refresh rate.

The key thing is that once you hit 60 or more, you absolutely HAVE to have the refresh rate maxed out. This is why it is still preferable to spend large sums of money on advanced CRTs: the refresh rate is so much higher than that of LCDs.

When considering frame rates, it's the ACTUAL video data information rate DISPLAYED ON YOUR SCREEN which is important. In other words, 200 FPS at a 60 Hz refresh has a lower information rate than 100 FPS at an 85 Hz refresh.
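A rough sketch of that point (my own illustration, not from the linked article), assuming the screen shows at most one new frame per refresh:

```python
# Assumption for illustration: the display shows at most one new frame
# per refresh, so the rate of distinct frames you actually see is
# capped by the refresh rate.
def displayed_rate(render_fps, refresh_hz):
    """Approximate number of distinct frames shown per second."""
    return min(render_fps, refresh_hz)

print(displayed_rate(200, 60))  # 60 -- most of those 200 FPS are never shown
print(displayed_rate(100, 85))  # 85 -- a higher displayed information rate
```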

Peruse my article, and you might gain a better understanding!
May 24, 2006 4:12:05 AM

Yes, now I understand. Thanks for linking that so I could get a better perspective on it. So what kind of graphics card, rig, or even monitor would it take, and what would it cost, to achieve 60 FPS and higher in a game like FEAR or Oblivion?
May 24, 2006 5:28:51 AM

I am using a 512MB X1300 right now (it's slightly faster than my old Radeon 9800 Pro). Soon I'll get something better, but to be honest I can play any game if I lower the resolution to 800x600. At that setting in Oblivion with HDR turned on I get about 20-50 FPS (depends on where I am); it's playable :) I know I would enjoy it much more on an X1900 or a 7900 GTX, but I can't say I don't enjoy it on the X1300! I guess what I'm saying is that buying the most expensive video card isn't important. What's important is whether you enjoy playing the game! If I were buying a video card right now, to be honest I would stick to the $200-300 range; the $400+ cards are awesome, but you sure do pay a lot for that last 10-20% of performance (I made that number up, but I'm sure it's close, lol).

Edit: I just saw the post above this one. A smooth, consistent 60+ FPS in Oblivion is a tough one; I have heard you need 2 GB of RAM, a good CPU, and a better video card. Probably one of the $500 video cards, plus the 2 GB and an A64 3500+ or better, or on the Intel side shoot for at least 3.4 GHz. I have a cheesy 805, made not so cheesy by OCing it to 3.6 GHz, so it runs pretty well for me. Amazingly, with the same video card, the 805D performs better than my Athlon 64 3500, but I can't OC my Athlons at all, so I'm sure that has A LOT to do with it.
May 24, 2006 6:42:23 AM

Thanks for the reply; anyone else's two cents would be greatly appreciated on this.
May 24, 2006 2:02:45 PM

Quote:
How are you supposed to shoot for 60 FPS in a game like FEAR without dropping a large sum of money on a graphics card?

Well, I went and spent a ton of cash on a damn 7800GTX, and FEAR runs like crap compared to how it runs for everyone else who owns that card :evil: :evil: so my advice is to be careful. This is my second GTX after the last one crapped out, and in my opinion the chances of getting what you pay for are very, very damn slim!!! :evil:
May 24, 2006 2:42:34 PM

Quote:
30 for first-person shooters and 24 for RTS games; movies, for comparison, are shot at 24 FPS.


That's only if you are playing a single-player game. Online, I feel that 60 is more the mark to shoot for (get as close to it as you can, or above).

All else being equal, if you are running 30 and your opponent is running 60, then he has twice the frames to frag you in. Online shooters in particular will see people with slower rigs getting frustrated because, between the few frames they are rendering, they got smoked and don't know why. Some of that is network lag, but if you are on the same connection (like on a LAN) you can see that the player on the slower machine has a disadvantage.

Otherwise, play at what looks smooth to you. Don't worry about what the numbers say; if you can enjoy the game, then do it. If it looks choppy or jumpy, then the FPS is too low. ;)

I can't believe you would post numbers like that and still not be able to see them after typing them yourself. I don't think that online I will EVER wonder how I got fragged by missing 1/60th of a second. If I were the type of person who called others idiots, you would be on the list. What happened? Are you trying to justify the money you spent on your last video card to yourself? If so, just spend the money and be OK with it. Don't use numbers that prove you wrong to feel better about yourself.
May 24, 2006 3:04:28 PM

Umm... wow. Not sure how my post garnered a flame, but whatever...

...I do not need justification; I was making a point about FPS. The OP wanted opinions about it, so I weighed in. The numbers I used are for explanation only, not actual numbers to measure by; I assumed that was obvious. I offered something more to think about, as it is an issue, even if you don't believe it. Simply put, there is a reason that "pro" gamers turn down graphics on Quake or Doom during competition: to get more frames for an advantage, no matter how small. I ended by saying that raw numbers don't really matter (insert "for most gamers" here) and to get what looks good to you.

If you cannot read the point I was making, I feel truly sorry for you. Take some night classes and learn some comprehension to go with your apparent character recognition skills. (Unless you had a monkey type your post, in which case learn those recognition skills too.) Take your bitchy self somewhere else.
May 24, 2006 3:23:23 PM

Keep in mind that if you're getting 30 fps, the odds are your minimum fps is probably MUCH lower than that... and when it dips below 30, your eyes and brain are definitely going to notice it.

For some games it won't matter... who really cares about 10 FPS in The Sims 2? In Quake or Battlefield 2 it's a whole different story.
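To see how an average can hide those dips, here is a toy example (my own; the frame times are made up):

```python
# Made-up frame times in milliseconds for a short capture, with two spikes.
frame_times_ms = [16, 17, 16, 18, 90, 16, 17, 95, 16, 17]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)  # ~31 FPS
min_fps = 1000 / max(frame_times_ms)                        # ~11 FPS

print(f"average: {avg_fps:.0f} FPS, worst moment: {min_fps:.0f} FPS")
# A healthy-looking "30 FPS" average, but the worst moments play like 11 FPS.
```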
May 24, 2006 4:33:10 PM

My optometrist said that the human eye rarely can distinguish speeds faster than 60 FPS. At the same time, it likes that 60 FPS in terms of less eyestrain, fewer headaches, and so forth. People can play games at 25 or 30 frames, but it is hard on their eyes in the long run. So, the thing to do is either buy fast video card(s) or turn down the settings until you get the framerate up. If it wasn't for the demands of my ex-wife, I'd get a faster card or two, but as it is, I'll just turn the eye candy down a bit.
May 24, 2006 4:44:42 PM

I thought (perhaps wrongly) the whole <60 thing mainly applied to the refresh rates of monitors... Way back when, some monitors would operate at 55 Hz, and that would create headaches after extended periods of use. And yes, the number they recommended back then was 60 Hz or higher. I guess I didn't realize this would apply to frame rates as well... If what you're saying is true, watching a movie at 24 FPS in a theater would contribute to eye strain, as would our TVs at 30 FPS. I'm thinking FPS and refresh rates affect our eyes differently...

That said... if you can convince your wife that a video card upgrade is the only way to keep from going blind, GO FOR IT! Wish I'd have thought of that!
May 25, 2006 6:47:00 AM

Quote:

That said... if you can convince your wife that a video card upgrade is the only way to keep from going blind, GO FOR IT! Wish I'd have thought of that!


ROTFLMAO. I wonder if my wife would save my eyes :) Hmmm, I think she would tell me, "Just don't play that game, you've got all these old games here...." LOL. I know some women would understand, but they are like gold: hard to find, and everyone wants them! :)
May 25, 2006 7:56:29 AM

If you are using a CRT, wouldn't the refresh rate stay the same even if your game is running at 100 FPS?
As for LCDs, isn't there a delay in changing colors (latency), with some as low as 30 ms? So high frame rates should not matter.

I think the question should be not the average FPS but the minimum FPS, so as not to notice the drops.
May 25, 2006 4:22:44 PM

Quote:
I don't think that online I will EVER wonder how I got fragged by missing 1/60th of a second.


It's not that simple. Games do not run at a single framerate.

If you are running 60 FPS average, you will probably dip into the 30s when the action heats up.

If you are running 30 FPS average, you might dip into the high teens.

When you're lining up a shot and your mouse movement gets choppy, that's going to make a difference. Enough to get you fragged, even.
May 25, 2006 5:49:32 PM

Well, what you want for a framerate depends on the game and on personal taste. Framerates vary wildly, as I'll show below. Note that when I put a "~" before the framerate, that's because it's a general estimate for the game, a framerate that 99% of people will agree is either sufficient or more than sufficient. I also list some popular games and what they have the framerate "fixed" at, so that it won't go over or (usually) under:
  • Ultima Online (PC) - 15 FPS
  • The Legend of Zelda: Ocarina of Time (Nintendo 64) - 18 FPS
  • Real-time RPGs - ~20 FPS
  • Films (VHS, DVD, theatre) - 24 FPS
  • StarCraft (PC version) - 24 FPS (animation rate, at normal game speed)
  • PAL (European) interlaced TV (480i, 720i, 1080i) - 25 FPS
  • NTSC (American/Japanese) interlaced TV (480i, 720i, 1080i) - 30 FPS
  • Sports/action titles - ~30 FPS
  • Halo/Halo 2 (Xbox version) - 30 FPS
  • The Elder Scrolls IV: Oblivion (Xbox 360 version) - 30 FPS
  • The majority of console titles - 30 FPS
  • Doom (PC version) - 35 FPS
  • PAL (European) progressive-scan TV (480p, 720p, 1080p) - 50 FPS
  • NTSC (American/Japanese) progressive-scan TV (480p, 720p, 1080p) - 60 FPS
  • Multiplayer FPS titles (particularly Counter-Strike) - ~60 FPS
  • Maximum noticeable framerate of the human eye - more than 300

However, it's also worth paying attention not just to the average framerate, but to what the LOWEST-POINT framerate is, as well as how CONSISTENT the framerate is. If an FPS game averages well over 60 FPS, that's no good if it's only because you get 200+ FPS outside of combat while it slows to 10-20 FPS in combat. Similarly, if the game "hiccups," still averaging 60 FPS but with frequent stutters, that's not good either.

So really, this question runs far too deep to be satisfied by a single number (or even several) as the answer.
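To illustrate that consistency point (a sketch of my own, with invented frame times), two captures can share an average and still feel completely different:

```python
import statistics

def summarize(frame_times_ms):
    """Return (average FPS, frame-time jitter in ms) for a capture."""
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    jitter_ms = statistics.pstdev(frame_times_ms)
    return round(avg_fps), round(jitter_ms, 1)

smooth  = [16.7] * 60               # a steady ~60 FPS
hiccups = [12.0] * 55 + [68.4] * 5  # same total time, so same average FPS

print(summarize(smooth))   # (60, 0.0)  -- consistent
print(summarize(hiccups))  # (60, 15.6) -- same average, visible stutter
```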
May 25, 2006 7:09:33 PM

Quote:
Films (VHS, DVD, theatre) - 24 FPS


I'd also like to remind people that film is a very different animal than other media, because film captures 'motion blur'.

Motion blur allows slower framerates to appear smoother because it tricks the eye. Even Pixar's animated films have motion blur calculated so the CG doesn't look choppy.

No video game in existence has realistic motion blur at this time, to the best of my knowledge.
May 25, 2006 7:28:40 PM

In Enemy Territory, when I'm alone I can average about 50 FPS. Now, when the action heats up and there are bombs dropping and people shooting at me, I average about 14 FPS and have hit as low as 6 FPS. And I get easily fragged. At that point it's no longer fun, and I start cussing and kicking the computer tower.
May 25, 2006 7:36:17 PM

I can play all games made about 2 or so years ago, plus some really boring games from today, like The Sims 2. I was playing Wolfenstein: Enemy Territory; it's pretty fun, most of the time.
May 25, 2006 8:13:32 PM

Quote:
How are you supposed to shoot for 60 FPS in a game like FEAR without dropping a large sum of money on a graphics card?

Well, I went and spent a ton of cash on a damn 7800GTX, and FEAR runs like crap compared to how it runs for everyone else who owns that card :evil: :evil: so my advice is to be careful. This is my second GTX after the last one crapped out, and in my opinion the chances of getting what you pay for are very, very damn slim!!! :evil:


Yeah, f*ck a new video card; I just want my whole system to run at its (GPU-limited) potential. The best 3DMark05 score I ever got was 2,265, and that was with a 3.2 GHz P4 on a proprietary HP board. That damn Prescott fried itself (and the board), so I got this A64 and DFI board. My 3DMark05 score has gone down to 1,300; I think I've f*cked something up, but I'm not quite sure what it is.
May 26, 2006 12:32:14 AM

Quote:
Films (VHS, DVD, theatre) - 24 FPS

I'd also like to remind people that film is a very different animal than other media, because film captures 'motion blur'.

Motion blur allows slower framerates to appear smoother because it tricks the eye. Even Pixar's animated films have motion blur calculated so the CG doesn't look choppy.

No video game in existence has realistic motion blur at this time, to the best of my knowledge.


Ah, yes, I did forget to mention it.

Motion blur is, in many respects, like anti-aliasing for the dimension of time.

To be more specific, given the way it's handled in films, it's no wonder it's not seen in games: you render a BUNCH of frames in sequence, then blend them together. (At least, that's how I THINK they do it.)

Like super-sample AA, this dramatically increases the processing required to achieve a single displayed frame. Worse yet, if you don't use enough rendered frames to compile the final frame, it just looks funny.

I know what it looks like; my younger brother has a habit, when he records video clips, of cutting the game speed down to 25% (or sometimes even slower) and then merging frames to yield a motion-blurred final product. (Obviously, he runs another recording pass at normal speed to get the audio to come out right.)
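A minimal sketch of that render-and-blend idea (my own illustration; the sub-frame count and image size are invented, and a real renderer is far more involved):

```python
import numpy as np

def motion_blur(subframes):
    """Blend sub-frames rendered at small time offsets into one frame,
    like super-sampling along the time axis."""
    return np.mean(np.stack(subframes), axis=0)

# Stand-ins for 8 sub-frames rendered within one displayed frame's interval.
subframes = [np.random.rand(480, 640, 3) for _ in range(8)]
blurred = motion_blur(subframes)  # one motion-blurred 480x640 RGB frame
```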
May 26, 2006 8:20:29 PM

Reading these replies makes me feel like I have to buy the most expensive piece of hardware on the market to have an enjoyable gaming experience.
What hardware in your computer affects the FPS you get in a game?
May 26, 2006 8:25:13 PM

Enjoyable is subjective, man... What you like is not what I like, and so on...

Today's games hit the GPU the hardest, but the CPU, RAM, hard drive, and sound chip all affect the FPS. It just depends on what you want. Only get the most expensive if you want the best, but that does not mean you need it to enjoy the game.

For some, middle-ground is still enjoyable. Others even enjoy graphics circa 1996. I can still find joy in text-based adventures.

Get what you want for what you enjoy; don't ask what the Joneses have. ;)
May 27, 2006 1:28:28 AM

Hey, that's some good advice; I'll take it into consideration. All I really wanna be able to play is CS:S with no lag or stuttering, BF2, and Lineage 2, because a bunch of my friends play those games and I would like to join them. I guess I could build a middle-end computer and just upgrade later.

I also enjoy games like Baldur's Gate and Icewind Dale.
May 28, 2006 5:00:31 AM

I played many of those games wonderfully on my 9700 Pro before my current system. The Source engine screams on the 9700/9800s at high settings, and Lineage 2 uses the Unreal engine, I believe, so it would run great as well on anything above that level. In fact, with anything above a 9800 you would play all the games you mentioned very well. Anything from the NV GF6 series or the X800 ATI cards would serve well... "middle-end," as you put it, would do very well, as those cards are not high-end by today's standards.

Of course, the GF7 or X1000 series gives you some future-proofing for newer games that you might take a fancy to. Anything GF7600GS or above, or X1800GTO or above, would give you lots of life.
June 14, 2006 7:28:15 AM

Quote:
Hey, that's some good advice; I'll take it into consideration. All I really wanna be able to play is CS:S with no lag or stuttering, BF2, and Lineage 2, because a bunch of my friends play those games and I would like to join them. I guess I could build a middle-end computer and just upgrade later.

I also enjoy games like Baldur's Gate and Icewind Dale.


Well, fortunately, those games don't rely too much on having an expensive video card. I'm actually using a Radeon X800XT; mid-range by today's standards, but it serves me well even in Oblivion, by far the most demanding game out today.

So, you should actually be able to do with much less. Just as a note, there are a LOT of factors that go into what your framerate will be. So, for the games you selected...

Battlefield 2 is definitely the most demanding game you listed. However, that's not because it's particularly advanced (it's on a par with Morrowind or Unreal Tournament 2003, both from 2002) but because it's poorly coded. Even a mildly decent video card (<$150 US) can max out the settings and get 60 FPS. But that assumes your CPU is working well and, perhaps most importantly for this game, your memory is both fast and plentiful. Lastly, note that having LOTS of players in the game will ALWAYS slow it down; even on the best systems, 60 FPS can become impossible.

Counter-Strike: Source may be the one game that demands 60 FPS the most, but fortunately, it's not that intensive. Source games tend to run better on ATi cards; as suggested, a 9700 or 9800 series card will fly through the game, an X700 will be comparable, and an X800 or better will be simply overkill. Obviously, you want at least a DECENT CPU with it, but even an AthlonXP or a 2.0 GHz or better Pentium 4 should be fine.

Lastly, Lineage 2, if memory serves me correctly, uses the ORIGINAL Unreal engine, as mentioned by sojrner. That should run fine even on much older hardware.

Similarly, the Baldur's Gate and Icewind Dale games are simply 2D, and your graphics card makes no difference there. A half-decent processor will be all you need. Note that because they are SPRITE-BASED, your "apparent" framerate might never reach 60 FPS; this is normal. Each sprite can only show frames that the artists have drawn for it ahead of time.
June 14, 2006 2:13:55 PM

Actually, I believe Lineage 2 uses the Unreal 2 (UT2003) engine. Not to be confused with the original Unreal 1 (UT '99), Unreal 1.5, or Unreal 2.5 (UT2004).

I am a die-hard UT fan; I love them all (even with my dislikes of certain things in UT2003), and I would not equate BF2 with any of them engine- or performance-wise. I do agree that BF2 is poorly executed, buggy, and laced with a bad interface... while all of the UT incarnations are smooth as silk. But the BF2 engine will draw much farther out than even UT2004. The Unreal engine is solid and scales better on all ranges of systems, but the close-up detail is better in BF2 when the games are run at the same resolution with similar settings (onslaught levels in UT). With draw distances maxed on both, BF2 beats it for distant detail, IMO. (UT maps are far from the size of BF2's, but the distance fog in both never hits the "edge of the world.")

I did tests on my 9700 Pro / AthlonXP 2700, and you can see that while BF2 runs a bit slower than UT2004, it is doing more and so should run slower. The game was still more than playable at 1024 with high settings on that system. (I would advise the X800 or above mentioned by nottheking to get up to at least 1280 with settings high.) Running BF2 on my current X1900 compared with UT2004, the gap is even wider. Once AA is set to 4x and AF to 8x at 1600x1200, it is no contest: BF2's shader work looks amazing next to the relatively old UT2004.

nottheking's assessment of the rest of the games is dead-on. While that newer card is certainly overkill for most or all of the games you are playing, you can never really have too much card, because you will always run into a new game that rocks your world, and like it or not, newer games are getting hardware-hungry. Rest assured, what is overkill today will be barely enough tomorrow. ;)
June 14, 2006 4:43:18 PM

Quote:
No video game in existence has realistic motion blur at this time, to the best of my knowledge.


Similar to the TV effect of edge-bleeding/blending, it's amazing what a difference that makes.

The last time I saw anyone attempt to emulate film stock (500/f16) was with an FX5900-based Quadro. At that time, they calculated that getting film-level AA on moving objects while maintaining AF for non-blurred items (remember, a moving object is blurred but not the background, which remains detailed) required 4-12+ seconds per frame depending on the complexity of the scene (and that's seconds per frame, not frames per second) at only 640x480. Even then they said it was close, but they still couldn't achieve the same results as film, due to the FX series' limitation of 32xAA. Using the CPU could achieve higher levels, but even at the same quality as the Quadros it became many minutes per frame, of course. I still don't think we're anywhere near there yet.

I couldn't find the original article looking quickly through Google, but this is a kinda neat primer for anyone comparing the two (although it expects some knowledge of the concept):
http://www.acm.org/crossroads/xrds3-4/ellen.html

Quote:
Motion blur is, in many respects, like anti-aliasing for the dimension of time.


Yeah, kinda like that Temporal AA, eh?!? :wink:
I think temporal AA would help get closer to realistic motion blur with lower sample rates. I would love to see a comparison on more modern hardware.
June 14, 2006 5:42:24 PM

Quote:
No video game in existence has realistic motion blur at this time, to the best of my knowledge.

Trackmania Nations has some form of simulated motion blur; it looks alright at times, but I couldn't really say how accurate it is. Valve also has plans to implement motion blur of some sort in the Source engine, but there's no saying how well it'll turn out.
June 14, 2006 8:57:55 PM

Agreed, and I actually think their HDR looks better than what is in Oblivion. It is more subtle and more "real"-looking to me. So they have the chops; the question is whether it can be done practically on today's hardware. Hmm....
June 14, 2006 10:39:16 PM

Quote:
Films (VHS, DVD, theatre) - 24 FPS

No video game in existence has realistic motion blur at this time, to the best of my knowledge.

Project Offset will incorporate motion blur, but it doesn't exist "yet" lol