How fast can the eye see?

Talon582

Distinguished
Mar 18, 2002
I've seen that there are sometimes arguments about what a good FPS rate is. Some people say no less than 60 is acceptable; others say 30 is OK. After all, movies run at 24 and TV runs at 30. Well, if you do the math, that's a difference of 1 frame every .016 seconds (if I did my math correctly). My question is: does anyone know, medically, how fast your eyes can detect movement? Because 1 frame every .016 seconds doesn't seem like a whole lot to me. As for my stance on the issue, I say play at whatever rate you're comfortable with.
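Here's the arithmetic as a small Python sketch (reading the .016 figure as the gap between the 30 fps and 60 fps frame periods; the fps values are just the ones mentioned above):

    # Seconds per frame at a few common rates
    for fps in (24, 30, 60):
        print(f"{fps} fps -> {1.0 / fps:.4f} s per frame")

    # Gap between the 30 fps and 60 fps frame periods
    print(f"30 vs 60 fps difference: {1.0 / 30 - 1.0 / 60:.4f} s")

That prints 0.0417, 0.0333, and 0.0167 seconds per frame, and a 30-vs-60 difference of about 0.017 s, which is roughly where the .016 figure comes from.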
 

eden

Champion
Well, for me it differs by game. Some seem fine at 30 FPS. Others need 60, and it really feels smooth then!
However, I've noticed that when ads on TV have that 30 FPS smoothness, the ad is more than likely a low-budget one. When ads use 20 FPS or so, something a little choppier, it feels more like a well-funded production, music included. So sometimes being too smooth can actually ruin the experience on TV. Especially with movies: it's better to have 24 FPS than 30, because otherwise it starts to look fake.

--
I can't believe Dungeon Siege has a pitchfork weapon called "Hoe"! :lol:
 

phsstpok

Splendid
Dec 31, 2007
One problem is that framerates are not constant, so an average framerate alone isn't enough information. The minimum framerate is the key.

For example, a video card in a particular game might average 30 frames per second. At certain parts of the game, usually during high action, the framerate might drop to, say, 8 frames per second. During low activity, framerates might be as high as 100 (boosting the average). With a faster card, the average might be 70 fps, the minimum 22 fps, and the maximum 150 fps.

In theory, 30 fps and 70 fps would both be smooth, but clearly one card would have serious problems during the high-action points and the other might not.
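A tiny Python sketch of that idea, with made-up per-second framerate samples (the numbers are purely illustrative):

    # Hypothetical framerate samples, one reading per second of gameplay
    samples = [100, 95, 80, 45, 30, 12, 8, 25, 60, 90]

    print(f"average: {sum(samples) / len(samples):.1f} fps")  # 54.5, looks fine on paper
    print(f"minimum: {min(samples)} fps")                     # 8, what you actually notice
    print(f"maximum: {max(samples)} fps")                     # 100, inflates the average

The average hides the fact that the game was a slideshow for a second or two.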

I have so many cookies I now have a FAT problem!
 

Talon582

Distinguished
Mar 18, 2002
Maybe my math's wrong, but I see 70 FPS as 1 frame every 0.014 seconds. Logically, your answer of .857 doesn't work, because that would mean two frames would take 1.714 seconds to draw. I get my answer from 1.00/70.0 = 0.014. 0.014 seconds seems faster than the brain could interpret the image, so that leads back to my question: does anyone know how fast the eye can detect movement?
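The same check in Python, using the 70 fps figure:

    frame_time = 1.0 / 70.0
    print(f"one frame:  {frame_time:.4f} s")      # ~0.0143 s
    print(f"two frames: {2 * frame_time:.4f} s")  # ~0.0286 s, nowhere near 1.714 s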
 

HokieESM

Distinguished
Apr 22, 2002
An even funnier, but definitely related, question is: how fast can your monitor refresh? :smile: I've heard SO many people say, "My new Ti4600 can run Quake III at 150 fps!"... which is all well and good until their cheap-ass monitor only refreshes at 75 Hz (which is 75 times per second). Hmmm....

As far as the other posters' comments go, most of the time what's published is the AVERAGE frames per second. I've seen some actual data points for a typical game, and someone with an "average" of 30 fps can dip to 7 or 8 fps at peak times (like the previous poster mentioned). At 60 fps, though, the person probably only dips into the 30s... which, like you said, is very similar to TV (and is perfectly fine).
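A rough sketch of the refresh-rate point (the numbers are just the ones above, and this ignores the messier details of v-sync and tearing):

    # You can't be shown more distinct frames than the monitor refreshes
    render_fps = 150   # what the card renders
    refresh_hz = 75    # what the monitor can actually display

    print(f"frames actually shown: {min(render_fps, refresh_hz)} per second")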

The video market, in my opinion, is run by a bunch of marketers... sure, the engineers/designers make the cards, but then the testers/marketers work their damnedest to find a single test where their card blows everything out of the water. It's getting to the point where certain cards run certain games so much better than their competitors that they'd need to make AGP/PCI versions (or have 2 AGP slots on the motherboard) so that you could run both cards. <grin>

I ramble FAR too much :smile:
 

Conehead

Distinguished
Feb 4, 2002
The eye can see at 24 FPS. TVs in America run at about 30 FPS, while in Europe they run at 25 FPS. Hence anything over 24 FPS appears to be the same as 24 FPS. That is why I can't justify buying a $400 video card that will give me 100 FPS when I can get a $150 one that will do around 30 FPS.

"If there's grass on the field...play ball!!"
 

jankphil

Distinguished
May 17, 2002
You may not be able to see over 24 fps, but there is a noticeable difference to the eye between 30 and 60 fps. Try it out: play a game forced to 30 fps, then play it forced to 60 fps. The 60 fps will feel smoother. I agree that a 400 dollar card is unreal. But then again, most people buying those cards will use them for the next 3-5 years. So you can buy a 150 dollar card every 2 years, or buy a really expensive card that lasts 5 =). Personally I prefer the cheaper one =P
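The cost-per-year math, as a quick sketch (prices and lifespans are just the figures above):

    print(f"expensive card: ${400 / 5:.0f} per year")  # $400 kept for 5 years -> $80/year
    print(f"cheap card:     ${150 / 2:.0f} per year")  # $150 replaced every 2 years -> $75/year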
 

williamc

Distinguished
Mar 8, 2002
According to my biology book from 11th grade (published in 1991), the human eye can distinguish a maximum of 32 fps. That's a tricky statement, though: it's saying the EYE, not your brain, can distinguish 32 individual images per second. Your brain doesn't deal with individual images per se in the way most people would think; since vision is persistent, full motion can be apparent at around 20 constant fps. The smoothness of an image, though, is individual. For some people there's no difference in the smoothness they "feel" between 32 and 60 fps. Some people can feel the difference, though... I can. A game feels a lot smoother at 70 fps than at 30. But at the same time, I cannot see the difference, only feel it in the control. If games ran at a true, steady 30 fps that'd be great; the problem is, they don't. When you score a 30 fps average, that second of frames might look like this:
------------------- - - - - -----------------------. That breakup in the middle would definitely be extremely visible to the person playing the game... but you still got 30 frames, just not evenly spaced. Dunno if this answered any of the original poster's questions or not... just thought it was an interesting topic worth rambling about.
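A small Python sketch of that point, with made-up frame times: the average looks like a solid 30 fps, but the two long gaps are what you actually see.

    # Hypothetical per-frame times (in ms) over one second of gameplay
    frame_ms = [25] * 28 + [150, 150]   # 28 smooth frames, then two long hitches

    total_s = sum(frame_ms) / 1000
    print(f"frames drawn:  {len(frame_ms)}")
    print(f"average fps:   {len(frame_ms) / total_s:.0f}")   # 30 fps on paper
    print(f"longest frame: {max(frame_ms)} ms")              # the visible breakup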

Later.

The itsy bitsy spider climbed up the Empire State Building; along came a goblin and wiped the spider out
 

Toejam31

Distinguished
Dec 31, 2007
Interesting articles on the subject:

<A HREF="http://www.planetdescent.com/d3help/framerate.shtml" target="_new">FRAMES PER SECOND (FPS) and THE REFRESH RATE</A>

<A HREF="http://amo.net/NT/05-24-01FPS.html" target="_new">Human Eye Frames Per Second 2</A>

<A HREF="http://www.penstarsys.com/editor/30v60/30v60p3.htm" target="_new">The Human Eye (and Visual Cortex)</A>

<A HREF="http://www.lostcircuits.com/video/dti3d/3.shtml" target="_new">The Pulfrich Effect</A>

<A HREF="http://www.useit.com/alertbox/9511.html" target="_new">How Much Bandwidth is Enough? A Tbps!</A>

<A HREF="http://www.penstarsys.com/editor/30v60/30v60p1.htm" target="_new">30 Frames per Second vs. 60 Frames per Second: A Technical Overview</A>

Toejam31

First Rig: Toejam31's Devastating Dalek Destroyer - http://www.anandtech.com/mysystemrig.html?rigid=17935
Second Rig: Toey's Dynamite DDR Duron - http://www.anandtech.com/mysystemrig.html?rigid=15942
__________________________________________________________

"Some push the envelope. Some just lick it. And some can't find the flap."
 

HolyGrenade

Distinguished
Feb 8, 2001
Movies double-flash each frame, so basically you get 48 flashes each second. Also, with recorded real-life footage, you get motion blurring. This helps keep everything smooth. If you take a fast-paced action scene like a car chase and freeze-frame it, you'll get a very blurred car. Or, if the camera is fixed on the car, you'll get a very blurred scene.

One of the problems with games is that they produce a very sharp picture even when using FSAA. So, if the framerate drops a little, the motion doesn't seem as fluid. If you were to get a very high-speed camera to record at, say, 240 fps, and then keep every tenth frame, dropping the ones in between, the movie produced from that wouldn't appear as fluid as one recorded with a regular camera. The motion-blurring effect may not be adequate to produce a perception of full motion.
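A rough sketch of the blur difference (the speed is a made-up example; a 24 fps film camera typically exposes each frame for roughly half the frame interval):

    # How far a fast-moving object smears across a single exposed frame
    speed_px_per_s = 2000            # object speed across the screen, pixels/second

    film_exposure = 1 / 48           # ~half of a 1/24 s frame interval
    fast_exposure = 1 / 240          # one frame of a 240 fps capture

    print(f"film frame blur:      {speed_px_per_s * film_exposure:.0f} px")   # ~42 px
    print(f"decimated frame blur: {speed_px_per_s * fast_exposure:.0f} px")   # ~8 px

Keeping every tenth of the 240 fps frames still gives you 24 frames a second, but each frame is far sharper, so the motion reads as choppier than real film.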

A man is only as old as the woman he feels
 

cakecake

Distinguished
Apr 29, 2002
Hmmm... this has changed from what I knew in the past. Many gaming web sites used to be really worked up over this issue, and they concluded that we can only really see differences in fps up to about 60 fps. As far as I know, the NTSC standard for television is about 30 FPS (29.97, interlaced). That's the standard for your typical shadow-mask rounded TV screen. Other things like HDTV I know nothing about.

I didn't know movies double-flash each frame, although it makes sense. I don't know if this 60 fps thing is true; all I know is that I read about it on many respected online gaming sites and in one gaming magazine as well. As for high framerates, I always assumed that people bragged about them because they indicated the life expectancy of their system. Higher framerates, while not immediately useful, were justified because they meant the system would most likely run future games fairly well for the next 2 years. Another thing I think people are forgetting is individual perception. Our eyes are constantly changing, from when we're born to when we get old and die. When we are first born, babies can only see within about 30 degrees straight ahead, and not to the side. This allows them to focus on things quickly and easily and not be overwhelmed by stress (which could potentially damage the developing brain prematurely). Also, a baby's vision is extremely blurry. As time passes, things become clearer, but then the older we get, things start to fade again. Vision is best in males (for women it's different) in their early-to-mid twenties, around 24 for instance. That's the age right before it starts degrading.

The argument that higher framerates are necessary for games is a compelling one. TV can get away with about 30 fps because it displays real-life pictures that, from a distance, look closer to something we see in real life (of course it's never exactly as we see it, but you get the point). Games, on the other hand, have cartoonish qualities to them. While they strive to be realistic, and on rare occasions come very close to real life, most of the time they fall short. One more failing is that in games, objects move very fast, which means a stuttering effect is a real danger even with v-sync turned on, because we begin to see flashes of individual frames. That's why, from my own personal experience, running games at around 30 fps is less than ideal. I can notice frame shifts and stutters. I tend to find that games become smooth at around 37 fps, although that's just a rough guess based on playing Half-Life with the netgraph turned on. 24-28 fps is just very stuttery and ruins my enjoyment of the game.

It definitely is also important to note not just the average framerate but also the minimum and maximum framerates. If the minimum dips to 8 fps then I don't think that would be very playable, even if the average was 35.

Another thing I noted in another forum post is that anti-aliasing is much more important than running games at high resolutions, and if the Matrox Parhelia's 16x FAA turns out to be what it claims to be, then the performance will be better as well. Keep in mind that running 1024x768 with 4x FSAA renders the same number of pixels as running 2048x1536:

1024 x 768 x 4 = 3,145,728
2048 x 1536 = 3,145,728
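A quick Python check of that arithmetic:

    print(1024 * 768 * 4)                  # 3145728
    print(2048 * 1536)                     # 3145728
    print(1024 * 768 * 4 == 2048 * 1536)   # True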

And since anti-aliasing framerate performance will improve over time thanks to better math routines, while running at higher resolutions will not, it makes more sense to measure our framerates at lower resolutions with some form of anti-aliasing turned on rather than at gigantic resolutions. Add to this the fact that many monitors people use today are not very high-end and don't have the horizontal frequency to display such high resolutions, plus the cost of the monitors that can. Some monitors also tend to develop moire and geometry problems at higher resolutions, making it even more impractical. If we take into account future advances in anti-aliasing technology and further optimization of the GPU and memory chips on the graphics cards themselves, we will start to see that benchmarks on all the web sites stop testing 1600x1200 and other high resolutions as they do right now, simply because 1600x1200 will be slower than anti-aliasing and may even look worse. At most, 1600x1200 might be tested by hardware sites as a "guide", but in general hardly anyone will even use this resolution.
 

flamethrower205

Illustrious
Jun 26, 2001
Haha, purely my fault - I divided 60 by 80 (one of those days). Here's an interesting thing: if the human eye can only see 32 fps, why is it that I can easily make out something that changes 70 times a second (i.e., I can see its positions perfectly)?

My frog asked me for a straw...dunno what happened he's all over the place :eek:
 
