Maximum fps?

March 1, 2012 8:57:40 AM

Hi. If my horizontal refresh rate is 54.2-83.8 kHz and my vertical refresh rate is 49-75 Hz, what is the maximum FPS I can reach in a game? I was told that maximum FPS is limited by the monitor's refresh rate.

My monitor is this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

Thanks!

March 1, 2012 9:15:29 AM

Yes, the maximum FPS you can actually see is limited by your monitor (75 Hz, so 75 FPS for you), assuming your graphics card and CPU can maintain that frame rate. Technically your FPS can be higher, but you won't see the extra frames, because the display is limited by its refresh rate. For optimal image quality, enable vsync, which synchronizes the frames your card produces with your monitor's refresh rate. Different people have different ideas of an acceptable frame rate: I find 30+ FPS good, others want a steady 60, and competitive gamers want FPS as high as it can go.
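Roughly speaking, vsync acts like a frame limiter tied to the refresh rate. A minimal Python sketch of that idea, with render_frame() as a hypothetical stand-in for the game's draw call (real vsync is enforced by the driver at buffer-swap time, not with sleeps):

    import time

    REFRESH_HZ = 75                  # the monitor's vertical refresh rate
    FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~13.3 ms per frame at 75 Hz

    def render_frame():
        pass  # hypothetical stand-in for the game's actual rendering

    while True:
        start = time.monotonic()
        render_frame()
        # Sleep away whatever is left of this frame's 13.3 ms slice,
        # so the loop never outpaces the 75 Hz display.
        remaining = FRAME_BUDGET - (time.monotonic() - start)
        if remaining > 0:
            time.sleep(remaining)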
March 1, 2012 9:20:05 AM

So the maximum FPS is limited by the vertical refresh rate of 75 Hz, not the horizontal refresh rate of 83 kHz. Gotcha. And are all monitors limited to their vertical refresh rate?

Thanks

Edit: also, what do you mean by "synchronize"? If my monitor's refresh rate is 75 Hz, will vsync make my machine work harder to reach 75 FPS?
March 1, 2012 9:30:28 AM

computernewb said:
So the maximum FPS is limited by the vertical refresh rate of 75 Hz, not the horizontal refresh rate of 83 kHz. Gotcha. And are all monitors limited to their vertical refresh rate?

Thanks


Note that the horizontal refresh rate is about 83,800 Hz; that's the line-scan rate, not the frame rate. FPS is limited by the monitor's lower vertical refresh rate, 75 Hz in your case (60 Hz or 120 Hz are the usual figures on other monitors).
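As a rough sanity check of how the two rates relate (the line count below assumes a hypothetical 1050-line panel plus about 4% blanking overhead, since the exact model specs are truncated in the link above):

    # Vertical refresh ~= horizontal (line-scan) rate / total lines per frame
    h_rate_hz = 83_800             # top of the stated 54.2-83.8 kHz range
    lines_per_frame = 1050 * 1.04  # assumed visible lines + blanking overhead
    print(round(h_rate_hz / lines_per_frame, 1))  # ~76.7, close to the 75 Hz cap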
March 1, 2012 9:36:30 AM

computernewb said:
So the maximum FPS is limited by the vertical refresh rate of 75 Hz, not the horizontal refresh rate of 83 kHz. Gotcha. And are all monitors limited to their vertical refresh rate?

Thanks

Edit: also, what do you mean by "synchronize"? If my monitor's refresh rate is 75 Hz, will vsync make my machine work harder to reach 75 FPS?


Vsync only does anything if your card produces more FPS than the monitor's refresh rate. It can only lower the card's FPS to match your monitor; it never makes your machine work harder.
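In other words (a one-line sketch, ignoring the quantization that classic double-buffered vsync can introduce when the card falls below the refresh rate):

    def displayed_fps(render_fps: float, refresh_hz: float) -> float:
        """With vsync on, the displayed rate is whichever is lower."""
        return min(render_fps, refresh_hz)

    print(displayed_fps(120, 75))  # 75 -> vsync caps the output
    print(displayed_fps(40, 75))   # 40 -> vsync adds nothing here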
March 1, 2012 12:15:55 PM

Vsync should only be used when your framerates are consistently above your refresh rate in order to prevent screen tearing. It does not really improve image quality beyond that and has nothing to do with performance.
March 1, 2012 5:14:08 PM

OK, thanks. And what is the highest refresh rate for a monitor? 120 Hz?
March 1, 2012 5:45:42 PM

I think so. More than 60 FPS normally doesn't make sense, except when the monitor is made for 3D gaming and the like (then you need double the frame rate).
March 1, 2012 5:56:29 PM

Yeah, normal monitors are 60 or 75 Hz. 3D monitors are 120 Hz, 60 Hz for each eye.
March 1, 2012 6:29:42 PM

There are 240 Hz LCDs, maybe more by now. Plasmas typically advertise something like 600 Hz (a subfield drive rate rather than a true refresh rate), and OLED is supposed to be even better.
March 1, 2012 6:57:40 PM

willard said:
Note that the human eye cannot perceive differences in framerate much above 30. Anyone who says they can is mistaken.

http://en.wikipedia.org/wiki/Flicker_fusion_threshold#D...


You can't see the difference between 30 and 60 FPS?
Have you even tried? It's a night-and-day difference.
March 1, 2012 7:03:09 PM

neon neophyte said:
You can't see the difference between 30 and 60 FPS?
Have you even tried? It's a night-and-day difference.

Read what I linked; your brain is not capable of telling the difference. You might be bottoming out less, but given a steady 30 FPS and a steady 60 FPS side by side, humans cannot perceive any difference. In fact, a rate as low as 16 FPS produces a consistent image for 50% of all people tested.

And yes, I've seen 60 FPS. It looks exactly the same as 30 FPS, which looks exactly the same as a billion FPS.

Clarification:

This all really depends on a few things. Very large, very bright images cause greater stimulation of the rods and cones in your eye, as per Bloch's Law, and these cells can therefore "fire" more quickly (this is also why you see in black and white in the dark: your rods integrate light longer than your cones). The result is that you can perceive flicker at higher framerates. Unless you're looking at something the size of a movie screen, though, the effect isn't going to be huge.
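For reference, the luminance dependence described here is usually stated as the Ferry-Porter law, alongside Bloch's law for brief flashes (general textbook forms; the constants vary by subject and stimulus):

    % Bloch's law: below a critical duration t_c (~100 ms), detection
    % depends on the product of luminance and duration:
    \[ L \cdot t = C, \qquad t < t_c \]
    % Ferry-Porter law: the critical flicker-fusion frequency rises
    % linearly with the logarithm of luminance:
    \[ \mathrm{CFF} = a \log_{10} L + b \]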
March 1, 2012 7:10:41 PM

willard said:
Read what I linked; your brain is not capable of telling the difference. You might be bottoming out less, but given a steady 30 FPS and a steady 60 FPS side by side, humans cannot perceive any difference. In fact, a rate as low as 16 FPS produces a consistent image for 50% of all people tested.

And yes, I've seen 60 FPS. It looks exactly the same as 30 FPS, which looks exactly the same as a billion FPS.


Even though my eyes can't tell 30 FPS from 60 FPS, will I notice a difference when I'm actually playing a game, like Battlefield 3? I'm currently playing at around 30 FPS, but I want to get a new graphics card to play at around 50+. If I can't tell the difference, though, it might just be a waste of money.

Thanks
March 1, 2012 7:17:12 PM

computernewb said:
Even though my eyes can't tell 30 FPS from 60 FPS, will I notice a difference when I'm actually playing a game, like Battlefield 3? I'm currently playing at around 30 FPS, but I want to get a new graphics card to play at around 50+. If I can't tell the difference, though, it might just be a waste of money.

Thanks

You should really be looking to maintain a good, steady framerate. There's nothing wrong with going higher than 30 FPS, and I typically shoot for at least my monitor's refresh rate. With vsync on, that produces a very smooth image with no tearing and gives you a healthy buffer before the framerate dips enough for you to notice.

If you're running at 30 FPS and the load suddenly spikes (say, a ton of players are nearby), you might drop into the low 20s or upper teens. Your eye can definitely detect that, and you'll perceive it as a less than fluid experience.

What I was talking about applies ONLY to steady, unchanging framerates. Movies exploit this by running at 24 Hz: fast enough for the vast majority of people not to notice (remember, 16 FPS is enough for half of all people), but low enough not to need a ton of storage space. You can sometimes still perceive flickering brightness at that rate (it depends on how bright the scene is), so projectors often double each frame to 48 Hz to smooth things out, though it's just the same image displayed twice. This is also not an issue on LCDs, because the brightness remains constant (it comes from a backlight running at 200+ Hz).
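The arithmetic behind why dips stand out: the time each frame stays on screen grows quickly as FPS falls (plain arithmetic, no assumptions beyond the numbers shown):

    # Frame time at a given FPS, and the jump a dip causes.
    for fps in (60, 30, 20):
        print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
    #  60 FPS ->  16.7 ms
    #  30 FPS ->  33.3 ms
    #  20 FPS ->  50.0 ms  <- a dip from 30 to 20 adds ~17 ms to every frame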
March 1, 2012 7:28:26 PM

"The human visual system does not see in terms of frames; it works with a continuous flow of light information.[12] A related question is, “how many frames per second are needed for an observer to not see artifacts?” However, this question also does not have a single straightforward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker fusion point, where the eyes see gray instead of flickering tends to be around 60 FPS (inconsistent). However, fast moving objects may require higher frame rates to avoid judder (non-smooth, linear motion) artifacts"

http://en.wikipedia.org/wiki/Frame_rate

have a nice day
March 1, 2012 7:35:42 PM

neon neophyte said:
I've read the articles that say what you're saying before.

They're wrong. Seriously, just look. It's easiest to tell when there's lots of movement on the screen, like when you spin in an FPS.

Honestly, just check for yourself. If you can't see the difference, it's just you.

So the scientists, who have doctorates in this kind of thing by the way, are all wrong? I guess decades of research into how the human eye functions need to be thrown out, then. Either that, or you're under the influence of a placebo effect.

Personally, I'm inclined to believe the people who have dedicated their lives to this, studied it at length, submitted their findings to peer-reviewed journals, and had other scientists verify those findings and accept them into the body of scientific knowledge, over some guy on the internet who says they're wrong.

It's an illusion. You expect a difference, so you perceive one.
March 1, 2012 7:39:09 PM

You are not backed up by the scientific community. Your eyes do not see in FPS, and they can see the difference between 30 and 60 FPS even in an image that ISN'T in motion. Images in motion require higher FPS to look fluid. Honestly, you'd KNOW this if you had bothered to compare varying framerates yourself instead of misquoting a crappy article.
March 1, 2012 7:41:26 PM

And illusions don't work by what you expect to see.

I can't say you are seeing an illusion because of your preconceptions. You, sir, are just wrong.
March 1, 2012 8:02:30 PM

neon neophyte said:
You are not backed up by the scientific community. Your eyes do not see in FPS, and they can see the difference between 30 and 60 FPS even in an image that ISN'T in motion. Images in motion require higher FPS to look fluid.

I'm not saying your eyes see in FPS. If you had comprehended what I've been saying, you'd have understood that.

I've been talking about the flicker fusion threshold, the point at which rapidly changing images (or pulsing light, or things of that nature) are merged into a fluid image by your brain. At average light levels, this occurs at roughly 16 Hz for 50% of all humans. It's worth noting that increasing the amount of light stimulating your eye will raise this threshold, but you'd need more than a hundredfold increase to get near 60 Hz (and the eye is incapable of hitting 60 Hz at any brightness).

Quote:
Honestly, you'd KNOW this if you had bothered to compare varying framerates yourself instead of misquoting a crappy article.

I have seen varying framerates, as I've already said. Why do you feel such a strong need to claim you can do things that are physiologically impossible?

Would you like some further explanation? How about this? In this excerpt, CFF stands for Critical Flicker Frequency, or the Flicker Fusion Threshold.

Quote:
The Talbot-Plateau Law describes the brightness of an intermittent light source which has a frequency above the CFF. This law states that above CFF, subjectively fused intermittent light and objectively steady light (of equal colour and brightness) will have exactly the same luminance. In other words, brightness sensation from the intermittent light source is the same as if the light perceived during the various periods of stimulation had been uniformly distributed over the whole time. The Talbot-Plateau Law applies only above the CFF.

In other words, your eye perceives an image displayed at or above the CFF in exactly the same way as it perceives a static, unchanging image.
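A minimal numeric illustration of the quoted law (the duty cycle and luminance are arbitrary example values):

    # Talbot-Plateau: above the CFF, a flickering light looks exactly as
    # bright as a steady light with the same time-averaged luminance.
    on_luminance = 200.0  # cd/m^2 while the light is on (example value)
    duty_cycle = 0.5      # fraction of each period the light is on
    print(on_luminance * duty_cycle)  # 100.0 -> perceived as a steady 100 cd/m^2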
March 1, 2012 8:03:56 PM

neon neophyte said:
And illusions don't work by what you expect to see.

I'm not saying you see what you expect to see. I think you think you see what you expect to see. I have no doubt that your brain is producing the same visual stimulus regardless.

Quote:
I can't say you are seeing an illusion because of your preconceptions. You, sir, are just wrong.

Science says you're the one who's wrong. I've provided ample evidence; you just keep saying "nuh uh, I can tell the difference!"
March 1, 2012 8:27:20 PM

willard said:
I'm not saying you see what you expect to see. I think you think you see what you expect to see. I have no doubt that your brain is producing the same visual stimulus regardless.

Quote:
I can't say you are seeing an illusion because of your preconceptions. You, sir, are just wrong.

Science says you're the one who's wrong. I've provided ample evidence; you just keep saying "nuh uh, I can tell the difference!"



Normal video filmed and played back at 30 FPS appears fluid to humans, so adding another 30 FPS to make 60 does not make it any "more" fluid.

However, the reason video games benefit from 60 FPS over 30 is that, unlike real-life video, games don't have the natural blur that cameras record in moving scenes. Since game frames are "still" images stitched together artificially, there is no blur to smooth the transition from one frame to the next. That's why games displayed at 60 FPS DO appear smoother to the human eye, even though you can't perceive the additional 30 frames as such.
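A minimal sketch of faking that camera blur by averaging sub-frame samples; render_at() is a hypothetical function returning the scene image at a given time (production engines usually approximate this more cheaply with velocity-buffer post-processing):

    import numpy as np

    def render_at(t: float) -> np.ndarray:
        """Hypothetical: return the scene as an RGB image array at time t."""
        return np.zeros((1080, 1920, 3))

    def motion_blurred_frame(t: float, shutter: float, samples: int = 8) -> np.ndarray:
        """Average several instants across the shutter interval, approximating
        the blur a physical camera records while its shutter is open."""
        times = np.linspace(t, t + shutter, samples)
        return np.mean([render_at(ti) for ti in times], axis=0)

    # At 30 FPS with a 180-degree shutter, blur across half the frame time:
    frame = motion_blurred_frame(t=0.0, shutter=0.5 / 30)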
March 1, 2012 8:28:36 PM

neon neophyte said:
"The human visual system does not see in terms of frames; it works with a continuous flow of light information.[12] A related question is, “how many frames per second are needed for an observer to not see artifacts?” However, this question also does not have a single straightforward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker fusion point, where the eyes see gray instead of flickering tends to be around 60 FPS (inconsistent). However, fast moving objects may require higher frame rates to avoid judder (non-smooth, linear motion) artifacts"

http://en.wikipedia.org/wiki/Frame_rate

have a nice day

Just noticed this ninja edit. Nice job backtracking.

At any rate, there are a couple things I'd like to point out here.

1. This section of the Wikipedia article has literally zero citations, aside from one simply clarifying that the eye does not see in frames.

2. The example is a worst-case scenario: white light is vastly brighter than any actual game, so it stimulates your retina more and sits higher on the curve in that chart I posted.

3. This is talking about interlaced frames, which have to run at double speed to produce the same smoothness. On a progressive-scan display (i.e., a 1080p display), you do not need to double 30 FPS to 60 to achieve fluidity.

4. You left out an important bit two sentences later:

Quote:
For example, motion blurring in digital games allows the frame rate to be lowered, while the human perception of motion remains unaffected. This would be the equivalent of introducing shades of gray into the black–white flicker.


Care to try again?
March 1, 2012 8:30:53 PM

mmaatt747 said:
Normal video filmed and played back at 30 FPS appears fluid to humans, so adding another 30 FPS to make 60 does not make it any "more" fluid.

However, the reason video games benefit from 60 FPS over 30 is that, unlike real-life video, games don't have the natural blur that cameras record in moving scenes. Since game frames are "still" images stitched together artificially, there is no blur to smooth the transition from one frame to the next. That's why games displayed at 60 FPS DO appear smoother to the human eye, even though you can't perceive the additional 30 frames as such.

Modern game engines have introduced motion blur to solve this problem, and have for a while. It's also only an issue for fast-moving objects, and it comes down to perceiving discontinuous jumps in an object's location.
March 1, 2012 8:36:39 PM

willard said:
Modern game engines have introduced motion blur to solve this problem, and have for a while. It's also only an issue for fast-moving objects, and it comes down to perceiving discontinuous jumps in an object's location.



That's right! The very REASON they started adding blur artificially was to improve the experience on hardware that couldn't run at 60 FPS, because the inherent choppiness is noticeable at 30 FPS.
March 1, 2012 8:47:56 PM

mmaatt747 said:
the inherent choppiness is noticeable at 30 FPS.

That really depends on the situation. If your screen is filled with very fast-moving objects (say, rockets flying across it), they're going to look choppy. And yes, that choppiness goes down as the frame rate goes up, because each object moves less between frames.
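Putting a number on "moving less between each frame" (the speed below is an arbitrary example):

    # Pixels an object jumps between consecutive frames.
    speed_px_per_s = 2400  # example: crossing a 1920-px screen in 0.8 s
    for fps in (30, 60, 120):
        print(f"{fps:>3} FPS -> {speed_px_per_s / fps:5.1f} px jump per frame")
    #  30 FPS ->  80.0 px  <- visible stepping on fast objects
    #  60 FPS ->  40.0 px
    # 120 FPS ->  20.0 px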

It's worth noting, however, that everything on the screen aside from those few fast-moving things will look just fine, and this is largely a function of screen size. Bigger screens make it more noticeable, as does sitting closer to the screen, since both increase the apparent distance the image travels between frames.

Motion blur does a lot to cancel this out, but very fast-moving objects will still appear choppy on large screens (part of the reason theater projectors flash each frame at a higher rate). These fast-moving objects tend to be an extreme minority of what's going on, though, and the effect is present even at 60 Hz.

The point I've been trying to make is that 30 Hz is the "good enough" mark, where the game is going to appear fluid. 60 Hz vs. 30 Hz is most certainly not "night and day," as neon has been claiming.
March 1, 2012 8:48:07 PM

You can easily tell the difference playing FPS games on a 120 Hz monitor vs. 60 Hz as well; the motion is visibly smoother, assuming you're pushing 120+ FPS in the game.
March 1, 2012 8:51:02 PM

ph1sh55 said:
You can easily tell the difference playing FPS games on a 120 Hz monitor vs. 60 Hz as well; the motion is visibly smoother, assuming you're pushing 120+ FPS in the game.

Again, this will only be noticeable for objects that are moving very fast, and on larger screens. The rest of the image looks exactly the same.
March 1, 2012 8:56:05 PM

willard said:
Note that the human eye cannot perceive differences in framerate much above 30. Anyone who says they can is mistaken.



I agree that the amount of improvement you get largely depends on the game you're playing and the amount of motion in it. However, I was just taking issue with the statement above. It's true for live-motion video, but not for video games with their artificial frames.
March 1, 2012 8:59:13 PM

mmaatt747 said:
I agree that the amount of improvement you get largely depends on the game you're playing and the amount of motion in it. However, I was just taking issue with the statement above. It's true for live-motion video, but not for video games with their artificial frames.

It's a gross generalization, but I wasn't expecting to need to produce copious amounts of clarification. The intent was to state that under normal circumstances (normal screen size and brightness, nothing moving so fast it jumps across the frame), 30 Hz and 60 Hz are indistinguishable.

I'll agree that games benefit from higher framerates, but they benefit very little, and in many circumstances not at all. 30 FPS is definitely the point where you hit extreme diminishing returns.
March 1, 2012 9:33:22 PM

willard said:
So the scientists, who have doctorates in this kind of thing

...who, precisely? Without names, institutions, or affiliations, this is nothing but rhetoric.

willard said:
I guess decades of research into how the human eye functions need to be thrown out, then

In this thread I see two more or less contradictory Wikipedia articles, both with fairly meager citations. I do not see the findings of "decades of research." Care to cite a few published journal articles from Nature or Science or the New England Journal of Medicine to back this up? If not, you can stop arguing over who has science on their side. It's disingenuous.

Whenever I see these silly arguments over what self-appointed forum ophthalmologists think the human eye can or cannot see, I can't help but wonder: does it even have to be optical in nature? Couldn't it be that we like 60 FPS better in part because the controls and the UI respond more smoothly than at 30 FPS? Maybe our eyes can't register data that fast, but I bet our brains can. It's obvious (to me) that there is some difference, anyway. I'll leave it to the PhDs to decide how best to put a metric on that.

For the OP's purposes: yes, it is worth trading up to a better card to get higher framerates if it's within your budget. If you're only seeing 30 FPS on average now, your minimum framerates probably dip significantly below that at times. Regardless of what you can perceive above 30, tighter FPS variation is reason enough to shoot for better performance. Plus, a better card keeps more of its performance as you turn up the detail.
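A sketch of why average FPS hides those dips: compare the average with the worst 1% of frames (the per-frame times below are made-up example data):

    import numpy as np

    # Hypothetical per-frame times in ms: mostly ~33 ms with occasional spikes.
    frame_times_ms = np.array([33.0] * 95 + [70.0] * 5)

    avg_fps = 1000 / frame_times_ms.mean()
    low_1pct_fps = 1000 / np.percentile(frame_times_ms, 99)
    print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS")
    # average: 29 FPS, 1% low: 14 FPS -> the spikes are what you notice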
March 2, 2012 9:39:00 PM

I can see the difference. It's not even subtle.

You work out the details; if they add up to me not seeing a difference, you are wrong, sir.