
What fps do you game at?

In Graphics & Displays
January 11, 2013 1:29:17 PM

Hey, I'm just wondering what fps everyone games at. I hear people saying 30 fps is unacceptable, but don't consoles run at 30 fps? Personally, I can't tell the difference between 30 fps and 60 fps. When I'm gaming, I choose better graphics at ~40 fps over lower graphics at 60+ fps. I'm wondering what your thoughts are on the sweet spot between fps and graphics quality, and what fps you aim for?


January 11, 2013 1:36:35 PM

It depends on everyone's taste. I like to get a minimum of 35-40 fps, but for very smooth gameplay people want to hit 60 fps. Anything less than 25 fps is laggy and bad for gaming.
January 11, 2013 2:18:21 PM

It's typically not an argument about the difference between 30 and 60; it's that PC games don't run at a constant rate, so gamers aim for a higher fps to absorb the dips they'll get during gameplay. An Xbox gets a constant fps because that's what consoles are designed to do, but not all console games run at 30 fps, FYI.
January 11, 2013 2:20:03 PM

More FPS means faster response.
You can watch a movie at a low frame rate and it doesn't matter, simply because you don't need to act.
Games are different: the more FPS, the better.

For example, let's say you are playing BF3.
Battlefield 3 is a competitive first-person shooter: you can fly aircraft, drive tanks, or play as infantry with regular guns.
You see a target, then react with your mouse, moving your aim onto what you see on the screen.
The higher the FPS, the faster your aim and the better the gameplay.

It depends on the game, but I would recommend trying to reach 100+ FPS to be able to perform well.
100 to 80 is good.
80 to 70 is OK.
70 to 60 is not very good.
Below 60, you start suffering.

Anything less than 60 FPS, especially in competitive games, will not be good.

I would recommend aiming for 100+ FPS in a game you want to play online.

Also, be careful with website benchmarks; they can be greatly misleading.
It's better to ask someone for real-world numbers.

Just so you know, from observing my own system: in Battlefield 3, FPS on the same settings (ultra) varies from 180 to 50 depending on the map, the player count, the action, and the lighting.

What I want you to understand is that website benchmarks are not trustworthy.
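The claim above comes down to frame time: each frame stays on screen longer at lower rates, so your latest input shows up later. A quick sketch of the arithmetic (plain Python, purely for illustration; the function name is made up):

```python
# Frame time is the real quantity behind "more FPS = faster response":
# the gap between frames shrinks as the rate rises.
def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 100, 120):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

At 30 fps a frame lingers for about 33 ms versus roughly 17 ms at 60 fps and 10 ms at 100 fps, which is why a mouse feels noticeably snappier at higher rates.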
January 11, 2013 2:22:53 PM

I honestly can't tell much of a difference between 30 fps and 60 fps. What I can tell is when my fps suddenly drops from 60 to 30. I played through both Dead Space games with vsync (capped at 30 fps) and didn't have an issue. I'd take higher detail over 60 fps and low detail every time.
January 11, 2013 2:35:18 PM

I can watch a 30 FPS game, and with a game controller everything is fine. It's not spectacular, but it's not bad. With a joystick you don't notice the latency, and it controls the speed at which you turn, which disguises the problems of low FPS.

Once you move to a PC and use a mouse to aim and change the view you see in 1st person or over the shoulder, things change. With a mouse in hand, you gain a far more precise input device. When you move your hand fast or slow, the view changes with it. With this type of input device, 30 FPS causes latency which you can notice.

I personally notice this latency enough to get nauseated within a few minutes, and if I attempt to play through it, it leads to headaches within 30+ minutes. Gaming is not worth a headache or nausea.

At 60 FPS I can tolerate a lot more. It takes 30-60 minutes for me to become nauseated, so I can play, but take frequent breaks. It takes 80+ FPS before I no longer get nausea issues directly related to the input device. I still can get nausea if the game has a bouncy running animation, but normal smooth game play no longer makes me sick.
January 11, 2013 2:38:11 PM

Dropz said:
More FPS means faster response.
You can watch a movie at a low frame rate and it doesn't matter, simply because you don't need to act.
Games are different: the more FPS, the better.

For example, let's say you are playing BF3.
Battlefield 3 is a competitive first-person shooter: you can fly aircraft, drive tanks, or play as infantry with regular guns.
You see a target, then react with your mouse, moving your aim onto what you see on the screen.
The higher the FPS, the faster your aim and the better the gameplay.

It depends on the game, but I would recommend trying to reach 100+ FPS to be able to perform well.
100 to 80 is good.
80 to 70 is OK.
70 to 60 is not very good.
Below 60, you start suffering.

Anything less than 60 FPS, especially in competitive games, will not be good.

I would recommend aiming for 100+ FPS in a game you want to play online.

Also, be careful with website benchmarks; they can be greatly misleading.
It's better to ask someone for real-world numbers.

Just so you know, from observing my own system: in Battlefield 3, FPS on the same settings (ultra) varies from 180 to 50 depending on the map, the player count, the action, and the lighting.

What I want you to understand is that website benchmarks are not trustworthy.


Anything above 60 fps is useless for the majority of people, since most monitors refresh at 60 Hz and can therefore only display 60 fps.
January 11, 2013 2:41:43 PM

What I think people don't understand is that when we're just watching a game played at 30 fps and again at 60 fps, we can't tell a difference, because our eyes can only register something like 15 fps (maybe; I don't know the actual number). But when we're actually playing, the game is far smoother at 60+ fps than at around 30 fps, because that's when the frame rate matches the refresh rate of most monitors.

So when you're just using your eyes to analyze a game, it doesn't really matter what fps you're talking about as long as it's above 15 or so, but when you're actually in control of the movement of what's going on, you really want above 60 fps.

Also, I'll always take 60 fps at lower settings over the highest settings below 60 fps, but then I really only play competitive fps games anyway.
January 11, 2013 2:43:25 PM

Tech answer:

1) Consoles usually game at "30 FPS"; however, Crysis 1 averaged about 24 FPS on the Xbox 360/PS3 and dipped as low as 12 FPS.

2) A 60 FPS experience can be WORSE than 30 FPS if there is stuttering and/or texture pop-in.
*I set Half-Life 1 to 30 FPS as an experiment. It felt far smoother than other demanding games registering a solid 60 FPS!

3) Shooters are best at higher frame rates (if smooth).

4) Gaming feels smoother with a mouse (though the mouse may not be the ideal control device for every game).

So to be clear, if I ran Oblivion forced to sync at a solid 30 FPS on a high-end GTX 680 gaming rig and compared it to an older PC that barely manages 30 FPS, the high-end PC experience would be much smoother.

*Tessellation is a newer feature that really lends itself to dynamic scaling of detail. If implemented properly, a game could LOCK the frame rate to 30 or 60 FPS and adjust quality to maintain the frame rate rather than letting it dip. That's what we need!
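That lock-the-rate idea boils down to a feedback loop: measure how long the last frame took and nudge a quality level so the frame time stays inside the budget. Here is a minimal sketch; the function name, thresholds, and 0-10 quality scale are invented for illustration, not taken from any real engine:

```python
# Toy quality controller for a locked frame rate. Drop detail when a
# frame goes over budget, add detail back when there is clear headroom.
def adjust_quality(quality: int, frame_ms: float, target_ms: float,
                   q_min: int = 0, q_max: int = 10) -> int:
    """Return the quality level to use for the next frame."""
    if frame_ms > target_ms:            # over budget: drop detail
        return max(q_min, quality - 1)
    if frame_ms < 0.8 * target_ms:      # plenty of headroom: add detail
        return min(q_max, quality + 1)
    return quality                      # close to budget: hold steady

# Locked to 30 FPS -> a 33.3 ms budget per frame.
target = 1000.0 / 30
q = 10
for measured_ms in (40.0, 38.0, 30.0, 20.0, 20.0):
    q = adjust_quality(q, measured_ms, target)
    print(f"frame {measured_ms:4.1f} ms -> quality {q}")
```

A real engine would smooth the measurements and scale specific features (tessellation factor, shadow resolution) instead of one abstract level, but the principle of trading quality for a steady rate is the same.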
January 11, 2013 2:44:57 PM

I personally like to stay somewhat above the monitor's refresh rate to absorb dips. That reduces tearing and the need for real or adaptive vsync.
January 11, 2013 2:47:23 PM

From personal experience, a lot of these answers seem close but not quite.

When using an input device which feels like an extension of your body, like a mouse, as it moves how your hand moves, the mind does not like latency, and you really feel the latency at 30 FPS, even 60 FPS is not ideal.

When using a joystick, where it is like you are telling your character to move right or left, your mind tolerates much higher latency.

Our eyes and mind can process some very high FPS, but the reality is we do not see in FPS, we notice change. The more change is happening, the more we notice differences in FPS.
January 11, 2013 2:49:26 PM

J_E_D_70 said:
Personally like to stay somewhat above monitor refresh rate to absorb dips. That reduces tearing and the need for real or adaptive vsync.


If you don't have any VSYNC at all, you get screen tearing.

What you should do is tweak your game to average, say, 70 FPS (so leave VSYNC off while tweaking), then enable Adaptive VSYNC afterward. What happens is you'll sync to 60 FPS, but if you ever can't maintain 60 FPS, VSYNC is disabled (you'll still get screen tearing, but it won't cause a stutter and a re-sync at 30 FPS).
January 11, 2013 2:57:36 PM

I've tried BF3 ultra with adaptive on and off and can't tell any difference visually, so I'm running with it off for now. I do see the frame rate drop way lower with it on (just the number, not any "feel" impact).
January 11, 2013 3:01:57 PM

J_E_D_70 said:
I've tried BF3 ultra with adaptive on and off and can't tell any difference visually, so I'm running with it off for now. I do see the frame rate drop way lower with it on (just the number, not any "feel" impact).


Why would you run with it off? Do you just like stressing your GPU at 100% at all times for no gain? I assume you aren't testing in the exact same situations, because there's no reason you'd see it drop lower with adaptive vsync on, given that it gets turned off when you drop below 60 fps anyway.
January 11, 2013 3:02:12 PM

bystander said:
From personal experience, a lot of these answers seem close but not quite.

When using an input device which feels like an extension of your body, like a mouse, as it moves how your hand moves, the mind does not like latency, and you really feel the latency at 30 FPS, even 60 FPS is not ideal.

When using a joystick, where it is like you are telling your character to move right or left, your mind tolerates much higher latency.

Our eyes and mind can process some very high FPS, but the reality is we do not see in FPS, we notice change. The more change is happening, the more we notice differences in FPS.


There are no right or wrong answers here; these are all opinions and they'll all vary. I like better graphics at around 40 fps, where some really competitive people here won't settle for less than 60, and it makes for a good debate.
January 11, 2013 3:05:18 PM

sheepsnowadays said:
There are no right or wrong answers here; these are all opinions and they'll all vary. I like better graphics at around 40 fps, where some really competitive people here won't settle for less than 60, and it makes for a good debate.


I was referring to things like "the human eye only sees 15 FPS". That is clearly wrong.

Others will give answers that are 90% right but throw in something technically wrong, like saying you don't get tearing when below your refresh rate.
January 11, 2013 3:10:34 PM

I haven't tried to replicate situations to check. I noticed that with it on, it would obviously cap at 60 and drop into the low 40s on occasion. When I turned it off for giggles, I got highs around 80 and lows in the mid-50s. I didn't see any tearing, so I just left it off. GPU noise is the same either way. I agree that adaptive is a great thing, but it *seemed*, in my case, to have some slight negative impact.

I do realize it could have been completely situational and am not against turning it back on; I just haven't done it yet. Too busy with a toddler and AC3.
January 11, 2013 3:13:13 PM

J_E_D_70 said:
I haven't tried to replicate situations to check. I noticed that with it on, it would obviously cap at 60 and drop into the low 40s on occasion. When I turned it off for giggles, I got highs around 80 and lows in the mid-50s. I didn't see any tearing, so I just left it off. GPU noise is the same either way. I agree that adaptive is a great thing, but it *seemed*, in my case, to have some slight negative impact.

I do realize it could have been completely situational and am not against turning it back on; I just haven't done it yet. Too busy with a toddler and AC3.


You are going into the Nvidia control panel to enable adaptive vsync, and not using the in-game settings or normal vsync, right?
January 11, 2013 3:17:46 PM

Yep.

Derza, I believe you :)  I was just playing around based on what I was observing at the time.

I'm the first one to question folks who have 60 Hz monitors and claim they need 250 fps to be competitive. I'm still trying to figure out how that can work without insane amounts of tearing.
January 11, 2013 3:19:57 PM

J_E_D_70 said:
Yep.

Derza, I believe you :)  I was just playing around based on what I was observing at the time.

I'm the first one to question folks who have 60Hz monitors and claim they need 250fps to be competitive. Still trying to figure out how that can work without insane amounts of tearing.


It's mainly people who really don't know any better and assume higher is better.
January 11, 2013 3:20:53 PM

I personally just always have v-sync on. With a 120hz monitor, v-sync causes far less latency.
January 11, 2013 3:26:14 PM

One thing about that, bystander: when your GPUs finally encounter something they can't chew through at 120, does it step down to 60? 90? I read somewhere that if you vsync on a 60 Hz monitor and it drops below that, the next sync down is 30 fps. No clue if that's even true. Just wondering.
January 11, 2013 3:26:56 PM

I agree. Although I have an AMD card, I just keep v-sync (vertical refresh) on all the time. I don't see why you wouldn't; it gets rid of the tearing with no drawbacks that I can see.
January 11, 2013 3:28:33 PM

J_E_D_70 said:
Something about that bystander - when your GPUs finally encounter something they can't chew on at 120, does it step down to 60? 90? I read somewhere that if you vsync on a 60Hz and it drops below that the next sync down is 30fps. No clue if that is even true. Just wondering.

I'm assuming he uses adaptive vsync set to 120 FPS (I think you can do that... never tried), so it would turn off if he dropped below 120.
Not at home, so I can't check :( 
January 11, 2013 3:29:16 PM

Depends on the game. If I'm playing an online competitive game or a shooter, I prefer the FPS to be as high as possible as long as there's no tearing. Games like Skyrim, TW: Shogun 2, or XCOM, for example, I can play at lower frame rates, but I'd still rather have a constant 35+ FPS.
January 11, 2013 3:31:31 PM

I used to play at 60 fps with vsync until I changed my refresh rate to 75 and locked it there; BF3 seems much smoother. 550 Ti SLI, haha, ultra at 1440x900. Yes, not the best resolution, but they play well at it, hold a constant 75, and only drop when I get killed.
January 11, 2013 3:38:58 PM

I like to stay above 40 FPS. It's weird: on my PC anything less than 40 seems laggy, but when I play on my Xbox it only runs at 30 FPS and doesn't seem nearly as bad as 30 FPS on PC.
January 11, 2013 3:42:30 PM

Google is my friend! This is old but still relevant, apart from adaptive. It explains what I was asking about earlier regarding frame rates lower than your vsync rate.

http://hardforum.com/showthread.php?t=928593
January 11, 2013 3:45:27 PM

In my opinion, I can tell a difference between 30 and 60 frames. 60 just feels smoother and more responsive, and 30 feels jittery between frames; I can't explain it, but there's a difference. Having played games on Xbox and then the exact same game on PC at 60 frames, I would never go back to consoles or 30 frames unless I had to for some odd reason.
January 11, 2013 3:45:46 PM

J_E_D_70 said:
Google is my friend! This is old by still relevant apart from adaptive. It explains what I was asking about earlier wrt framerates lower than your vsync rate.

http://hardforum.com/showthread.php?t=928593


Yup, that's why adaptive vsync is so awesome, and to me a pretty big advantage over the AMD cards.
January 11, 2013 3:46:50 PM

60 fps is the gold standard of ultra-smooth gaming, but not everyone has the hardware to run the latest games at 60 fps on the highest settings.
So 30 fps is my minimum target. 30 fps is very playable and I have no problem with games running at 30 fps; it's when games dip from 30 into the 20s or 10s that the problem arises.
January 11, 2013 3:53:59 PM

Disregard that link (maybe); a guy on the last page of the thread tears apart most of the original post.

Even toward the end there, someone says TF2 is smoother at 200 fps on a 60 Hz monitor. I'd like to see an authoritative explanation of how that could happen without really bad tearing.
January 11, 2013 3:58:40 PM

As long as the minimum doesn't dip below 30 fps, I'm good. I aim for an average of 45-50 fps, though.
January 11, 2013 4:00:12 PM

My eyes are full of fail. I followed determinologyz's link and can't discern a difference between 30 and 60. I can tell from 15 to 30, though.
January 11, 2013 4:03:49 PM

J_E_D_70 said:
My eyes are full of fail. I followed determinologyz' link and can't discern a difference between 30 and 60. Can tell from 15 to 30 tho.


It's there... when the block is spinning in the air, it looks smoother rotating at 60 fps compared to 30, and you don't see as much jerkiness overall.
January 11, 2013 4:05:01 PM

My question to those with the video card and monitor to achieve it: can you see a difference above 60 FPS on a 120 Hz monitor?
January 11, 2013 4:47:19 PM

http://www.100fps.com/how_many_frames_can_humans_see.ht...

It's different for everyone, as nobody's eyes or mind are the same, IMO.

For me, as long as it's a smooth 30 fps or better, I'm fine. I used to play competitively; nowadays it's just pubs when I find time. It never affected how good I was, whether at 30, 60, 120, or whatever I was at.
January 11, 2013 4:57:43 PM

determinologyz said:
In my opinion, I can tell a difference between 30 and 60 frames. 60 just feels smoother and more responsive, and 30 feels jittery between frames; I can't explain it, but there's a difference. Having played games on Xbox and then the exact same game on PC at 60 frames, I would never go back to consoles or 30 frames unless I had to for some odd reason.


Same for me. I can't see a visual difference between 30 and 60 fps, but I can feel it in gameplay. It really does impact fast-paced shooters; MMOs not so much, at least not until you're under 30 fps. I usually try to tweak my settings so I stay over 60 fps at all times in shooters like BF3.
January 11, 2013 5:34:22 PM

100 fps at 100 Hz on a CRT. I might go lower if it's a non-competitive game.
January 11, 2013 5:46:24 PM

I can tolerate anything with minimum framerates of 25 or better, though I prefer 40-60 constant.

I don't really play multiplayer games anymore, but I need ~45fps minimum in multiplayer FPS games.

And I think it's complete BS that anyone would be unable to tell the difference between 30 and 60 fps... if you put two screens side by side, gaming, with one at 30 and one at 60 fps, every single person who isn't legally blind would be able to tell which is which.
January 11, 2013 6:02:51 PM

BigMack70 said:
I can tolerate anything with minimum framerates of 25 or better, though I prefer 40-60 constant.

I don't really play multiplayer games anymore, but I need ~45fps minimum in multiplayer FPS games.

And I think it's complete BS that anyone would be unable to tell the difference between 30 and 60fps... if you put two side by side screens, gaming, with one at 30 and one at 60 fps, every single person not legally blind would be able to tell which is which.


Not everyone perceives frame rates the same. MOST people will be able to tell, but I wouldn't be surprised if a good 10-20% of people actually couldn't tell a difference.
January 11, 2013 6:06:23 PM

Like I said, I don't buy it.

My wife isn't sensitive to this sort of stuff at all... and to test all this, a few months back I switched from gaming on my PC to gaming on the Xbox and asked her which one was smoother, and she knew right away it was the PC.
January 11, 2013 6:14:20 PM

BigMack70 said:
Like I said, I don't buy it.

My wife isn't sensitive to this sort of stuff at all... and to test all this a few months back I switched from gaming on my PC to gaming on the Xbox and asked her which one was smoother and she knew right away it was the PC.


Well, you're talking about a console running at 30 FPS and a PC running at 60+. I'd hope everyone can tell that difference.
January 11, 2013 6:15:40 PM

BigMack70 said:
Like I said, I don't buy it.

My wife isn't sensitive to this sort of stuff at all... and to test all this a few months back I switched from gaming on my PC to gaming on the Xbox and asked her which one was smoother and she knew right away it was the PC.


So you're basing your assessment on a sample size of... 2? I think it would be a cool experiment to test 100-1000 people and get some usable results.
January 11, 2013 6:16:34 PM

Derza10 said:
Yup that is why adaptive vsync is so awesome, and to me a pretty big advantage over the AMD cards.


What's the difference between Nvidia's vsync and AMD's vertical refresh?
January 11, 2013 6:23:11 PM

Nvidia's adaptive vsync turns off if you go under the target FPS. Say you have vsync set to 60 and, due to the demands of the game, your FPS drops to 40: with adaptive vsync, vsync is simply turned off and you get whatever FPS your graphics card can put out (40 fps). With normal vsync, you might see some stutter and then your fps getting locked to 30 instead of the 40 you would get otherwise. That's more or less the difference.
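The 40-vs-30 example can be modeled in a few lines. This is a rough sketch assuming a 60 Hz display and classic double-buffered vsync (real drivers and triple buffering behave differently); both function names are made up for illustration:

```python
import math

def vsync_fps(raw_fps: float, refresh: int = 60) -> float:
    """Classic double-buffered vsync: every frame waits for a refresh,
    so the rate snaps down to refresh/2, refresh/3, ... when you miss it."""
    return refresh / math.ceil(refresh / raw_fps)

def adaptive_fps(raw_fps: float, refresh: int = 60) -> float:
    """Adaptive vsync: cap at the refresh rate when the card can hit it,
    otherwise turn vsync off and show whatever it renders (with tearing)."""
    return min(raw_fps, refresh)

for raw in (70, 55, 40, 25):
    print(f"raw {raw:3} fps -> vsync {vsync_fps(raw):4.0f}, "
          f"adaptive {adaptive_fps(raw):3.0f}")
```

Under this model, a card rendering 40 fps gets snapped down to 30 with classic vsync but shows the full 40 with adaptive, which is exactly the trade-off described above.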
January 11, 2013 6:31:49 PM

Hmmm, I've never noticed any stuttering, and I play all my demanding games at around 40 fps (I only have a 6850) with vertical refresh always on. Actually, come to think of it, AMD's vertical refresh does exactly what you described Nvidia's vsync doing.
January 11, 2013 6:32:29 PM

Derza10 said:
So you are basing your assessment on a sample size of... 2? I think it would be a cool experiment to maybe test 100-1000 people and get some usable results.

Same, I'd love to see this tested on a large population, 500-1000 at least. I'd bet my monthly paycheck that you'd get close to 50% wrong answers on what's 60 fps and what's 30 fps if you used the same setup (two PCs) instead of a gaming console vs. a PC. The hardware should always be the same when doing these tests :D