FPS is overrated?

Last response in: Graphics & Displays
January 17, 2010 5:17:16 AM

To me FPS is overrated. I have 5870s in Crossfire, and to me more FPS just means a lot more heat, and you can't tell the difference over 60 FPS. For instance, I was playing FEAR 2 with vsync off and getting over 200 FPS with temps well into the 80s. I turned vsync on, capped at 60 FPS, and temps never went over 70°C. I couldn't tell the difference in gameplay either. I have an ASUS VH242 23.6" monitor, 1920x1080 at a 60Hz refresh rate. I turn vsync on in every game I play now. Two 5870s in Crossfire is probably overkill for now, but I should be in pretty good shape as far as future-proofing goes. Does anybody agree?

January 17, 2010 5:25:33 AM

Yup, I agree with you...
Then why not just run one HD 5870?
I think that's more than enough: less heat, less power consumption... :) 
January 17, 2010 5:35:37 AM

They should have looked exactly the same. 200 FPS means diddly once it's hooked up to your monitor: if your monitor can only show 60 FPS, it doesn't matter whether your computer can render more than that. If your computer renders 180 FPS, you'll only see every third frame in the game. In this case, vsync on or off shouldn't change what you see.

I agree with WA1. Running a single 5870 at 1080p is fine. I doubt you'd need more than that right now.
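A quick back-of-the-envelope sketch (my illustration, not from the post) of why this is: it works out which rendered frames actually reach a 60Hz screen when the GPU renders faster than the panel refreshes. It assumes perfectly even frame pacing, which real games don't have.

```python
def displayed_frames(render_fps, refresh_hz, duration_s=1):
    """Indices of the rendered frames the monitor actually shows."""
    shown = []
    for v in range(int(refresh_hz * duration_s)):
        t = v / refresh_hz                 # time of this refresh (vblank)
        shown.append(int(t * render_fps))  # latest frame finished by then
    return shown

# At 180 FPS on a 60 Hz panel, only every third frame is displayed:
print(displayed_frames(180, 60)[:5])  # [0, 3, 6, 9, 12]
```

With vsync off the panel can also show parts of two different frames within one refresh (tearing), but it still never shows more than 60 distinct refreshes per second.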
January 17, 2010 5:45:53 AM

Waited too long to return it to Newegg. I'm new at this; it's my first build, learning as I go. I reckon I'll just keep it for the future (I just paid a lot more now than I would have later on!). Are there even any computer monitors with over a 60Hz refresh rate at their recommended resolution?

Also, does every monitor get bad flicker when you run way over 60 FPS with a 60Hz refresh rate?
January 17, 2010 6:20:28 AM

I'm not aware of any LCDs that do. There should be some 75Hz and possibly 120Hz monitors, but they will be expensive.

I also don't know of any flicker like you mention; I've never seen it. Any chance you're talking about tearing? If you don't know what I'm talking about, either read a guide or just leave vsync on.
January 17, 2010 6:38:37 AM

OK, it's flicker, not tearing. I see it before the game even starts, but it's no issue with vsync on. It happens because my monitor can't display over 60 FPS.
But on another issue: what is up with Crysis and Crysis Warhead (the two games I can't max out with my 5870s in Crossfire)? I get worse FPS in Crossfire than with one card???
January 17, 2010 6:44:27 AM

You can't feel the difference if the FPS is above 24...
January 17, 2010 6:49:34 AM

sayantan said:
You can't feel the difference if the FPS is above 24...

That's a fallacy. Perhaps you can't, though.
January 17, 2010 6:54:34 AM

randomizer said:
That's a fallacy. Perhaps you can't, though.

I mean your eye won't sense the difference whether you are playing at 24 FPS, 60 FPS, or higher... but your eye can sense it below 24.
January 17, 2010 6:59:01 AM

30 FPS is usually considered the goal for smooth gaming; anything above your refresh rate is entirely meaningless. Even if you get a monitor with a refresh rate above 60Hz, I doubt you will notice a real difference at frame rates over 60 anyway. If you don't need two HD 5870s (and you shouldn't), just put the second one up on eBay. You can probably get $400-ish for it.
January 17, 2010 7:00:12 AM

That's still not right. 60FPS is noticeably smoother than 30FPS, and most certainly 24FPS as well, unless you have vision impairment.
January 17, 2010 7:05:34 AM

Yeah, you can definitely notice higher than 24 fps. It's probably somewhere around 50 fps where it stops making any noticeable difference to most people.
January 17, 2010 7:11:50 AM

Old issue, but basically film and TV can get away with 24 FPS because of motion blur: the camera actually records everything that happened during that 1/24th of a second.

In a game, each frame only shows what is happening at a single point in time. To do motion blur in games you'd need to render each frame multiple times, and I don't see why NVIDIA or ATI don't do this, since in this case your video card is already rendering three times faster than it needs to.
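The "render each frame multiple times" idea is what's usually called accumulation motion blur. Here's a minimal sketch, with a toy `render_subframe` standing in for whatever the engine actually draws (the names and the 4-pixel scene are mine, not any real API):

```python
def render_subframe(t):
    """Toy 4-pixel scene: one bright pixel that moves over time."""
    return [1.0 if i == int(t * 100) % 4 else 0.0 for i in range(4)]

def motion_blurred_frame(frame_start, frame_time, subframes=3):
    """Average several sub-frames spread across one output frame."""
    acc = [0.0] * 4
    for k in range(subframes):
        t = frame_start + frame_time * k / subframes
        sub = render_subframe(t)
        acc = [a + s / subframes for a, s in zip(acc, sub)]
    return acc

# One 24 FPS output frame built from three sub-frames (a 72 FPS render
# rate): the moving pixel smears across three positions instead of one.
blurred = motion_blurred_frame(0.0, 1 / 24)
```

The cost is exactly the point made above: three sub-frames per output frame needs a card roughly three times faster than plain 24 FPS rendering.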
January 17, 2010 7:12:30 AM

randomizer said:
That's still not right. 60FPS is noticeably smoother than 30FPS, and most certainly 24FPS as well, unless you have vision impairment.


I certainly don't notice a difference between 30 and 60 FPS.

Although I find this site very interesting on this matter ;) 

http://www.100fps.com/how_many_frames_can_humans_see.ht...
Anonymous
January 17, 2010 9:29:04 AM

30 FPS has a weird 'shimmer' effect; it's clearly inferior to 60 FPS. Even 40 FPS is a lot better than 30, IMO.

@OP: buy another two monitors and Eyefinity them, it's beyond amazing.
January 17, 2010 9:48:04 AM

All movies, whether on DVD or Blu-ray, produce video at 24, 25, or 30 FPS depending on the format, but never more than 30 FPS... Besides, the Xbox 360 also locks the FPS at 30. So why lock FPS to 30 if you can really see what's happening within 1/60th of a second? I hope you guys won't tell me the Xbox doesn't offer a smooth gaming experience!
January 17, 2010 9:55:47 AM

Whether you can tell a difference between 30 and 60 FPS has a lot to do with the monitor you are using. A 2ms GTG rating means absolutely nothing when real-world transitions take more like 20-30ms... try it with a CRT and see the difference :p  (with the refresh rate at 85Hz or 100Hz so it won't flicker).

Edit: the Xbox feels 'smooth' because it's a constant 30, but try playing a game which drops from over 60 down to 30 and you should feel the difference...
January 17, 2010 9:56:23 AM

Everyone is a little different...

Most people assume everyone else is the same as they see themselves.

Most people are very happy believing all are equal. It makes the world a very easy, safe place to live, peacefully I hope.

We all watch our TVs that refresh the screen at 60Hz (or at least remember the older-tech ones we used to have) and find it works just great, right...???

It was not so for me when I saw them for the first time.

I grew up on a boat with no TV in Malta :) 
Then later moved to the Isle of Man when I was 6, where we got a TV...

I clearly remember watching this thing, or trying to - it had such a terrible flicker to me, I mean REALLY bad, almost like when you are tuning a TV and the "frames" start cycling up or down!!! It was painful to watch...

Now there was nothing wrong with the TV, and it was the same with others for me - as everyone else accepted it without complaint, I assumed it was just how things were and took them for what they were.

To make it watchable I had to adjust my consciousness to temporarily slow down my perception of the world to make the flicker go away LOL

Now I know better, and worry that the old 60Hz TV has been damaging the consciousness of the human race - some may have permanently "slowed" their consciousnesses at a very young age and never recovered or realised what they did to themselves :ouch: 

I won't be buying a 60Hz flat screen, put it that way. Even my 85Hz 22" CRT requires me to "adjust" to not see the flicker. 100Hz is better, but still flickers if I am not careful :sleep: 
January 17, 2010 10:09:01 AM

Or sell one now while people are still price-hiking them over recommended retail - at least here in the UK they are still £50 ($81 at the current exchange rate) over their initial prices :ouch: 

Then buy another later, when they are cheaper and you get a game that needs it, and save yourself a bunch of cash :) 

It's a sound plan unless you have games you like whose minimum FPS drops too low - in an ideal world I would not want my FPS dropping below a 40 FPS minimum.
January 17, 2010 10:21:43 AM

Interesting argument, but wrong from what I know. MS said you had to program for 720p minimum, so that's 60 FPS, not 30. In addition, since it's a closed system we have no idea what they may have done to compensate for low frame rates.
January 17, 2010 10:53:52 AM

Quote:
30 FPS has a weird 'shimmer' effect; it's clearly inferior to 60 FPS. Even 40 FPS is a lot better than 30, IMO.

It sure does. I've been trying to upload a mere 70MB YouTube video comparing 50 FPS and 24 FPS (because 24 FPS has been mentioned as "smooth" and my HDD gets slaughtered recording 60 FPS), but my Internet is so unstable at the moment that after 4 attempts I've still not been able to upload the whole thing.

EDIT: Finally got it, but forgot that YouTube transcodes the video to 24 FPS anyway, which means I got nowhere...

sayantan said:
All movies, whether on DVD or Blu-ray, produce video at 24, 25, or 30 FPS depending on the format, but never more than 30 FPS...

We covered that already. Movies have motion blur; they aren't comparable to games (which generally have no motion blur, or very poor motion blur that ends up making things worse).

sayantan said:
Besides, the Xbox 360 also locks the FPS at 30. So why lock FPS to 30 if you can really see what's happening within 1/60th of a second?

Well, for starters, you can't make fast movements on a console anyway. Thumbsticks restrict you to turning like a sloth.

sayantan said:
I hope you guys won't tell me the Xbox doesn't offer a smooth gaming experience!

That's debatable. I've seen games bring an Xbox down below 20 FPS many times.
January 18, 2010 2:41:00 AM

Bitchin' Fast! 3D 2000: "Well for starters, you can't make fast movements on a console anyway. Thumbsticks restrict you to turning like a sloth."

Well, maybe, but you should also know that our brain's reflex time lies somewhere between 0.15s and 0.2s, so even 10 FPS is a bit much for our brain if you want to react exactly at the 5th, 6th, 7th, or any given nth frame.

Let's take an example: you are a sniper, and your enemy appears exactly at the 5th frame and vanishes at the next frame, where each frame is on screen for 0.1s... I bet you won't be able to shoot him unless you use an automatic assault rifle...

So 10 FPS is more than enough for your reflexes... but certainly not enough for visual perception...
Therefore fast movement or reflexes have nothing to do with visual perception... don't mix them up... that's all... :non: 
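Using the post's own assumption of a 0.15-0.2s reflex time, the arithmetic can be sketched as:

```python
def frames_during_reaction(fps, reaction_s=0.2):
    """Frames displayed while a reaction is still 'in flight'."""
    return fps * reaction_s

for fps in (10, 24, 60):
    n = frames_during_reaction(fps)
    print(f"{fps:3d} FPS: ~{n:.0f} frames pass during a 0.2 s reaction")
```

So even at 10 FPS, two whole frames pass before the reaction lands, which is the post's point that reflexes are not the bottleneck for frame rate.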
January 18, 2010 2:50:35 AM

I'm not talking about reflexes. I'm talking about smoothness of motion (or lack thereof) as a result of the speed of motion. If you pan a camcorder that records at 10 FPS very slowly, the result will appear smooth. If you increase the speed at which you pan the camera, it will no longer be smooth unless you increase the frame rate or add motion blur. The reason is that the brain can't interpolate what happened in between frames, and persistence of vision isn't adequate either, as the frames are too infrequent.
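The panning example can be put into rough numbers (the field of view and resolution here are my illustrative assumptions, not from the post): how far the image shifts between consecutive frames at a given pan speed.

```python
def pixels_per_frame(pan_deg_per_s, fps, fov_deg=90, width_px=1920):
    """Image shift between consecutive frames, in pixels."""
    px_per_degree = width_px / fov_deg
    return pan_deg_per_s / fps * px_per_degree

# A slow 5 deg/s pan at 10 FPS shifts ~11 px per frame (reads as smooth);
# a fast 90 deg/s pan at 10 FPS jumps ~192 px per frame (visibly steppy).
print(pixels_per_frame(5, 10))   # ~10.7
print(pixels_per_frame(90, 10))  # ~192
```

Raising the frame rate shrinks the per-frame jump proportionally, which is why fast pans are where low frame rates fall apart first.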
January 18, 2010 3:04:59 AM

I personally find anything below 40 FPS very difficult on me; I usually get motion sickness below 40 FPS. 30 FPS may be smooth, but it is much harder on the constitution than 40+ FPS.

The human eye can detect a difference up to 1000 FPS in the right situations. Anyway, here is an article that explains some of the things you are referring to: http://www.100fps.com/how_many_frames_can_humans_see.ht...
January 18, 2010 3:12:43 AM

There is no limit to what the human eye can detect since it doesn't work with frames.
January 18, 2010 7:33:32 AM

Quote:
For instance, I was playing FEAR 2 with vsync off and getting over 200 FPS with temps well into the 80s. I turned vsync on, capped at 60 FPS, and temps never went over 70°C. I couldn't tell the difference in gameplay either.

You mean you don't notice screen tearing? You'd better have your eyes checked.
January 18, 2010 7:35:15 AM

24 FPS is good for movies; in a video game it isn't.

Your mouse alone will tell you the difference between 24, 30, 60-variable, and 60-fixed FPS.
January 18, 2010 7:40:27 AM

For games, 60 FPS should be considered playable; even dropping 10-15 below that is a problem!
January 18, 2010 7:46:10 AM

hell_storm2004 said:
For games, 60 FPS should be considered playable; even dropping 10-15 below that is a problem!

Games remain playable even at 24-30 FPS...
January 18, 2010 7:50:36 AM

When I said playable, I meant satisfactorily playable. Below 40 FPS, games start to stutter.
January 18, 2010 7:58:07 AM

I agree. Crysis is hard to play at times, with FPS dipping to around 20 even with 5870s in Crossfire. 60 FPS is the sweet spot; any more than that is useless. I turn vsync on in every game to reduce heat and play at a constant 60 FPS (besides Crysis).
January 18, 2010 8:13:36 AM

mstang783 said:
I agree. Crysis is hard to play at times, with FPS dipping to around 20 even with 5870s in Crossfire. 60 FPS is the sweet spot; any more than that is useless. I turn vsync on in every game to reduce heat and play at a constant 60 FPS (besides Crysis).

Yeah, anything below 24 will feel stuttery... I have also experienced this, but once it goes above 24 or 25 FPS it looks fine.
January 18, 2010 8:37:31 AM

sayantan, you keep quoting your experience and using it as some global fact. Just because your eyes are "slow" doesn't mean all of our eyes are slow. I can't imagine playing any game except Crysis below 30FPS, preferably above 40FPS and ideally above 60FPS. Crysis is too demanding to expect more than 30FPS much of the time on my system without dropping the detail, but I'd much prefer 60FPS.

Nobody can say "X is smooth but below Y will stutter" because this is stating objective fact and nobody here has stated any facts, only opinions. You can, however, say "for me, X is smooth but below Y will stutter" because that is a personal experience and it is stated as such. It's also wrong to say "you can't see the difference between X and Y" because how do you know what anyone else can see?
January 18, 2010 9:16:28 AM

For me it's a bit weird. I usually don't notice the difference between a constant 30 FPS and a constant 60 FPS or more. What I do notice is a drop in FPS, like from 60 to 50 or 40 to 30. Don't know why :lol: 