
What is the point. . .

in Graphics & Displays
June 17, 2007 7:15:51 PM

of having framerates higher than what our human eyes can see? I can understand increasing resolution, color depth, various filtering modes and other quality improvements, but is there a point to having framerates above 60? Or even 30? As long as the framerate doesn't drop to the point where the video image is no longer smooth, what does it matter?

For example: in Oblivion, if one can get a good framerate at high resolution with all high-quality options enabled, and get 30+ FPS, isn't that good enough, "future-proofing" aside?

Other than sheer bragging rights, of course. But I'd like to know if there's any substance to the bragging?

Regards,

Altazi


June 17, 2007 7:25:24 PM

Oh no, now you've gone and done it. Stand by for a landslide of replies telling you you're wrong and don't know what you're talking about.
Think of it like being a cat with a very long tail in a room full of rocking chairs. :lol: 
All joking aside, this topic comes up from time to time. For the record I'm with you, but some of the others would argue (and they will) :lol: 
that it's more complicated than that and that they (scientists) don't know what the limit of the human eye is.

Tin hats at the ready
Mactronix
June 17, 2007 7:40:56 PM

Well, last I read, the human eye could differentiate up to about 100 fps.

But yeah, I'm happy with my 30+ fps in games... except online FPS multiplayers... I prefer those to be 60+ :oops: 
June 17, 2007 7:54:27 PM

Quote:

For example: in Oblivion, if one can get a good framerate at high resolution with all high-quality options enabled, and get 30+ FPS, isn't that good enough, "future-proofing" aside?


In my opinion, that's just what it's all about: buying a PC that can last for a couple of years.
And of course a little bit of bragging :wink:
June 17, 2007 8:14:08 PM

As far as I know, 30-50 fps is as good as the human eye can make out.

I think the only good an increase beyond that would do us is when there is tearing or a slight glitch on the screen.
June 17, 2007 8:17:42 PM

I don't know about the scientists saying we can see rates up to 100 FPS, but here in the US we watch NTSC video at 60 fields per second, and that looks smooth enough to me. As long as graphics include items like motion blur, so we don't see a succession of razor-sharp shifting images, I'm OK with that.

I like first-person shooters, and do all right as long as the framerate doesn't descend to the point it's a slide show.

I want to know if there are any objective reasons for the (IMO) excessive framerates I see mentioned in various posts.
June 17, 2007 8:28:02 PM

Frame rates don't really matter much once they get over a certain point anyway.

If you want an example, move your mouse super fast and try to watch the arrow. Higher framerates will keep the arrow from dragging and trailing, but it doesn't matter much because your eyes can't keep up with it anyway.
June 17, 2007 9:11:24 PM

Quote:
Well, last I read, the human eye could differentiate up to about 100 fps.

I bet your memory is hazy. You never (send me all your money together with credit card details) read it; it was the salesperson trying to sell you the 100Hz TV.
Strange that before they came along 60 was plenty (I love subliminal messages), and now that they have 100Hz TVs we all need one of them.
June 17, 2007 9:13:43 PM

Quote:
...we watch NTSC video at 60 fields per second, and that looks smooth enough to me...

I suppose you know a field consists of only every other scanline, so it's really only 30 full frames per second, but...
Quote:
...As long as graphics include items like motion blur, so we don't see a succession of razor-sharp shifting images...

...this is exactly one of the sticking points. As far as I know graphics cards don't do motion blur, or not that I've noticed anyway. There are two types: objects or characters moving, and the camera panning, e.g. you moving your virtual head in an FPS. I find the latter can be rather jerky in some games, even though I would still categorise them as playable. As for on-screen character animations including motion blur, I think this is one of the new distinguishing features in DX10 with the upcoming "Crysis" title. I don't think it's been done thus far (I could be wrong).

The other thing that comes to mind is that benchmarks often only give average framerates, when the lower minimum framerates are perhaps much more important. My hardware has never been great up to now, and it certainly can be frustrating when the framerate slows down exactly in the fight scenes because of the increased load of rendering enemies.
June 17, 2007 9:22:31 PM

I play BF2 a lot and have all the settings maxed out @ 1280x1024, along with forced 16X AF. I had been playing a while and hadn't noticed anything jittery since I upgraded to 2 gigs of RAM. Just yesterday I used Fraps for fun (for the first time) and my frames per second barely dropped below 35 (30 minimum) and were usually above 40. In my other games the FPS is around 60. I didn't notice anything different, so maybe my eyes are just bad! :oops:  I never understood the whole video vs. video game FPS difference, and I am an avid video editor and 3D game player!

In other words, I don't care as long as it's over 30.
June 17, 2007 9:45:02 PM

One good application for frame rates above 60 is stereo 3D, since it inherently cuts the rate in half. I believe, though I'm not sure, that the human eye can only see 12 to 15 distinct images per second. However, the eye isn't like a camera; it has no shutter, so it's used to seeing things constantly changing, with motion blur etc. The higher the frame rate, the more natural that blur looks. It also matters in terms of latency for fast-twitch games: at 30 fps your image may be as old as 1/15th of a second, while at 60 fps it might be 1/30th of a second.
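(To put rough numbers on that latency point, here's a minimal Python sketch. The double-buffering assumption of two frame intervals of image age is mine, chosen to reproduce the 1/15 s and 1/30 s figures above.)

```python
# Rough illustration: worst-case age of the on-screen image at a given
# frame rate, assuming simple double buffering where the frame you are
# looking at was started one full frame interval earlier.
def worst_case_image_age_ms(fps, buffered_frames=2):
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * buffered_frames

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> image up to ~{worst_case_image_age_ms(fps):.1f} ms old")
# 30 fps -> ~66.7 ms (about 1/15 s); 60 fps -> ~33.3 ms (about 1/30 s)
```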

However, depending on your hardware, you're going to have input latency anyway unless you have just the right setup, so there are more important things to consider.
June 17, 2007 10:02:11 PM

It's all about keeping your minimum frame rates above a certain point, to keep your in-game performance as fluid as possible.
June 17, 2007 10:28:27 PM

Normal humans can see up to 72 fps (registering movement only). Gamers and fighter pilots can see faster. Ask anyone who plays Counter-Strike every day. You need 60 fps minimum to pull off any good shots, so getting up to 72 fps is what you NEED. Plus, have you ever spun around really quickly in a game? You can EASILY see the frames if you are running below 60, and that is not good enough for life-or-death situations. Just because TV is 24 fps does not mean that your eyes work that slowly. So running ABOVE 72 frames is not pointless (future-proofing is a VERY important thing). But then again why do they make cars that go faster than the speed limit? Cause they can and it is cool to do so. You are asking why be better than the bare minimum. If people did not buy fast cards you would not get your slow one for cheap. "Why get there now when you can get there later" does not cut it. The people who buy the fastest help push everything forward. So it is a bit pointless to get to the TOP, but it will happen anyway. Plus, the top would move more slowly if it was not being pushed.
June 17, 2007 10:46:13 PM

Well, considering your average screen refresh rate is 60Hz, you would think it's useless, but actually a high FPS really means your response time is much better and you have the most up-to-date image. It's not that you actually see 110+ frames.

Also, to add: your eyes refresh at a MAX of 24Hz, which is why TV is broadcast at 24 fps even if your TV is 60Hz.
June 17, 2007 10:47:29 PM

OK, I just had to post again.
Some good points mentioned here, especially the one about motion blur. Now here's a question: if motion blur was added to an image before or as it was being rendered, would that mean fewer FPS were needed? Surely the blur in the separate images helps to deceive the eye into seeing fluid motion.
I also see the point in higher frame rates up to a point, but I've seen people say they can see the difference between 90 and 100 etc., which I don't believe. Same as when people say they can tell the difference between music formats played over the same speaker system: I can believe it up to a point, then I think they are fooling themselves.
And I'm sure people just pluck numbers out of the ether. Not picking on Rabidpeanut here, it's just that his 72 is an exact figure rather than a range, so it springs to mind while typing.
So anyway, here's what I think it boils down to: it's not really a question of how many FPS the eye can perceive, more a question of how few, using whichever techniques, are required to fool the eye.
June 17, 2007 11:12:34 PM

Well, I am not saying that I only want to achieve the "bare minimum" for frame rates. If we say 60FPS is acceptable, what is the POINT of getting a graphics card / system that supports 200+ FPS? Just buying bigger, faster, more power-hungry equipment for boasting purposes doesn't seem to make a lot of sense to me.

Even in all of your kind responses I see a large variation in what is considered an acceptable minimum framerate. Maybe some ultra-twitch gamer might need 100+ FPS, but for casual gamers such as myself, I just want to understand if there is any benefit whatsoever to buying equipment entirely for the purpose of increasing my framerate capability.

In general, I have been happy with 30+ FPS, and don't see much reason to go any higher. On a high-end system capable of delivering 200+ FPS in a given game, it seems that the processing "horsepower" needed for this performance is simply producing waste heat. Waste heat = wasted energy (green enthusiasts take note!) and also waste heat = unneeded stress on expensive computer equipment.

Finally, what point is there in having FPS higher than what the MONITOR can display? According to the specifications, my 23" 1920 x 1200 Sony LCD monitor isn't capable of more than 60Hz vertical. I know that LCDs can have image lag, but even CRTs aren't generally capable of > 100Hz vertical refresh rates. So. . . why > 60FPS?
June 17, 2007 11:44:46 PM

I run COD2 multiplayer at 125 fps (in DX7 mode), partly because I know that if everyone starts throwing smoke grenades I'll still have a reasonable fps of about 30, and partly because for some bizarre reason it lets you jump a little bit higher.

It's nice to have constant fluid motion when everyone else is practically waiting for the next frame to load.
June 18, 2007 12:16:45 AM

Quote:
But then again why do they make cars that go faster than the speed limit? Cause they can and it is cool to do so.


'Cause separating fools from their money is good. Also, if it helps them get out of the gene pool that's also good. :p  :p  :p 
June 18, 2007 12:38:01 AM

Quote:
But then again why do they make cars that go faster than the speed limit? Cause they can and it is cool to do so.


'Cause separating fools from their money is good. Also, if it helps them get out of the gene pool that's also good. :p  :p  :p 

Hey... I use that extra speed, my arrest record is living proof.

Cops don't like it when you go 135 in the US. :twisted:
June 18, 2007 12:44:40 AM

Quote:
Normal humans can see up to 72 fps (registering movement only). Gamers and fighter pilots can see faster. Ask anyone who plays Counter-Strike every day.


I generally agree with this. Keep in mind that, as with any human physiology, there are significant differences between people. Some people naturally perceive faster movement than others. In addition, as with so many other things, motivation, concentration, training, and practice can increase one's natural rate.

Another factor is fatigue. The mind works to manufacture the perception of constant motion while viewing images that are multiple snapshots over time. At 60 to 75 fps, most people see no flickering. At slightly lower rates, the mind might not perceive the flickering, but it may have to work harder to provide the non-flickering motion. The result, over longer periods of time, can be an increase in fatigue even though one may not consciously perceive a difference in image quality.

For those interested in learning more, here is an article from Wikipedia about frame rates and the flicker fusion frequency.

http://en.wikipedia.org/wiki/Frame_rate
June 18, 2007 12:48:49 AM

Are you stalking me? ;) 
June 18, 2007 5:28:31 AM

I checked out the link to the Wikipedia article. I didn't really learn much. Here's one quote from the article:
Quote:
Modern video cards, often featuring NVIDIA or ATI chipsets, can perform at over 160 fps on intensive games such as F.E.A.R. One single GeForce 8800 GTX has been reported to play F.E.A.R. at up to 386 fps (at a low resolution apparently).

Now, to me, 386 FPS seems pretty ridiculous, especially when you consider this point also quoted from the article:
Quote:
When vertical sync is enabled, video cards only output a maximum frame rate equal to the refresh rate of the monitor. All extra frames are dropped. When vertical sync is disabled, the video card is free to render frames as fast as it can, but the display of those rendered frames is still limited to the refresh rate of the monitor. For example, a card may render a game at 100 FPS on a monitor running 75 Hz refresh, but no more than 75 FPS can actually be displayed on screen.

So, even the 386 FPS example won't be displayed at anything higher than 75 FPS. Again, what's the point? What purpose is served by having a graphics card render literally hundreds of frames that don't even get displayed on the monitor?
June 18, 2007 11:32:53 AM

Quote:
Well, last I read, the human eye could differentiate up to about 100 fps.

I bet your memory is hazy

True, true, I thought I'd read it. It's actually a rendering rate of approximately ten billion triangles per second that is sufficient to saturate the human visual system. :lol:  :lol: 

If you can access it: http://www.swift.ac.uk/vision.pdf
June 18, 2007 11:35:23 AM

What if you bought a GPU that could barely make the cut? What if you chose a video card that could get only 60 FPS @ your chosen resolution with max detail? That would be worth it, right?

What if another generation of games then came out and you were no longer able to achieve 60 FPS and now get 30 FPS?

What if we stop comparing new GPU's + Old Games?

What if we drop this thread because there is no point to it?

You buy the best card for you... no one else.
June 18, 2007 1:13:26 PM

Quote:
So, even the 386 FPS example won't be displayed at anything higher than 75 FPS. Again, what's the point? What purpose is served by having a graphics card render literally hundreds of frames that don't even get displayed on the monitor?


One thing that hasn't been mentioned here is vertical synchronisation, V-sync. When this is enabled, your PC will wait for the monitor to completely draw a frame before starting the next one. This means that if your refresh rate is 60Hz then you will generally be seeing an image which is up to 17ms behind what is happening. If you disable V-sync then your PC will overwrite the frame buffer while the frame is still being drawn. This leads to horizontal 'tearing' lines where you can see the difference between one frame and the next, but it means that if your PC is rendering frames at 300fps, the latest piece of data being drawn on the screen is only about 3ms old. A difference of 14ms.
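(As a back-of-the-envelope check on those figures, here's a minimal Python sketch. It assumes the newest pixels on screen are at most one refresh interval old with V-sync on, and at most one render interval old with V-sync off; real pipelines add more latency on top.)

```python
# How stale the newest pixels on screen can be, under the simplifying
# assumption above. With V-sync on at 60 Hz a whole refresh interval can
# pass before the next update; with V-sync off at 300 fps the most recent
# slice of the (torn) image is at most one render interval old.
def staleness_ms(rate_hz):
    return 1000.0 / rate_hz

vsync_on = staleness_ms(60)     # ~16.7 ms
vsync_off = staleness_ms(300)   # ~3.3 ms
print(f"V-sync on  (60 Hz):   ~{vsync_on:.1f} ms")
print(f"V-sync off (300 fps): ~{vsync_off:.1f} ms")
print(f"difference:           ~{vsync_on - vsync_off:.1f} ms")  # ~13 ms
```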

Personally I play my games with V-sync on because I think tearing is ugly. However there are plenty of people on the CS scene who KNOW 14ms is the difference between winning and losing. Unfortunately I have no sources so I can't back this up but I believe it to be true.
June 18, 2007 4:05:01 PM

Superfly, you don't need to stick around if you don't want to. If there are no other reasons for extreme framerates than the two I mention below, then you are correct - there IS no further reason for this thread.

It seems as if the only reasons justifying extreme framerates are:

1. Future proofing - next-gen games will require more graphics "horsepower". To get DX10 graphics, we will be dragged into Windows Vista - whoopee. And, of course, the software will continue to lag the hardware. . .

2. Bragging rights for fanboys who don't understand that they can't see their 500 fps anyway.

Did I miss anything? There are no other reasons?

It seems strange to me that everyone is so focused on framerate, when it seems to be of limited value. Once a minimum fps threshold is reached (30-60 fps, subjective), then I'd rather see features and capabilities that improve the images. Of course, once we achieve true photorealism, where else is there to go?
June 18, 2007 4:14:17 PM

Maybe the validity of the thread is being questioned because no one is currently able to buy hardware that runs at 200fps in all titles, at all resolutions, at all settings.
I think you will have to look long and hard to find someone who specifically buys a high-end GPU only to play a three-year-old game at 1024*768.
June 18, 2007 4:38:18 PM

Quote:

2. Bragging rights for fanboys who don't understand that they can't see their 500 fps anyway.

Did I miss anything?


Did you read my post? You do get a benefit from having a frame rate which is higher than your monitor's refresh rate. You may consider 14ms to be pointless but I argue that it isn't.

Get a stopwatch and see how close you can get to stopping it exactly on one second (while watching). After a tiny bit of practice you will regularly get within 20ms. So if people are winning and losing games depending on whether they get the headshot first, the high frame rate makes a difference.
June 18, 2007 5:28:47 PM

The image in the brain is refreshed EXACTLY 24 frames per second. This is the reason why cinema film has 24, 25 or 30 fps. The standard PAL (non-interlaced) refresh rate is 25 frames per second; NTSC is 29.97.
In interlaced mode the refresh rate doubles, but in this case only half of the information is drawn on the screen at once.
As long as you have no less than 24 FPS, you should not notice a difference. However, there may be other factors in play. For example, graphics cards usually calculate 2-3 frames ahead. So if you have 24 FPS, the delay when the movement direction changes may be (and usually is) noticeable.
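(A minimal Python sketch of that "frames ahead" point; the queue depths and frame rates are illustrative assumptions only, since the actual render-ahead depth depends on the driver and its settings.)

```python
# If the driver queues frames ahead of display, your input shows up on
# screen only after the already-queued frames have been shown.
def input_to_display_delay_ms(fps, prerendered_frames):
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * prerendered_frames

for fps in (24, 60):
    for queue in (1, 3):
        delay = input_to_display_delay_ms(fps, queue)
        print(f"{fps} fps, {queue} frame(s) ahead -> ~{delay:.0f} ms delay")
# At 24 fps with 3 queued frames the delay is ~125 ms, which is easy to
# notice when you change direction; at 60 fps it shrinks to ~50 ms.
```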
June 18, 2007 5:43:07 PM

In my experience* I can definitely see over 60fps; I'm not happy unless it's over 75fps, coincidentally my refresh rate. :) 

*Going from CS 1.6 on my old PC @ 10-30fps to CSS on my newer PC at 60+ fps is a million times better and REALLY noticeable.
June 19, 2007 1:17:45 AM

Gustafarian,

I did read your response.

Being able to stop the stopwatch at 1.0s +/- 20ms doesn't measure your reaction time. That's just building up a sense of time & rhythm.

I take your point on having the latest, updated frame in the monitor's buffer, though. It does seem to buy you a few milliseconds. I don't know if there are uber-twitch gamers out there who can truly resolve down to a few milliseconds of visual input and respond in kind, but I'm sure that there are more people who THINK they can. I'd like to see an unbiased study of this - links, anyone? At any rate, it seems like there are other places that can create more grief, like network lag.

Still, for the 99% of people who aren't the twitchiest of the twitchy, I doubt this group will care if the image they are viewing is delayed by 10-20ms.

I have worked on motorcycle (motocross) suspension analysis tools, and had someone tell me that the professional rider could tell a difference of 1mm (0.04"). I told him that was rubbish, and the data we collected supported my position much better than his. I never found a rider with a "golden butt". It's hard not to look at this the same way.
June 19, 2007 3:07:11 AM

I'm really uncertain what you are asking.
Here's my guess:
why do people constantly upgrade their computers, video cards, etc.?
With regards to games, game makers constantly add little [sometimes huge] extras to visual quality and realism with each new game. This is increasingly hard for hardware to keep up with. As time goes by, your shiny new rig that got 130fps in game "A" only gets 11 fps in fresh release game "B". Same with programs. My .02 :roll:
June 19, 2007 3:12:45 AM

Quote:
I checked out the link to the Wikipedia article. I didn't really learn much. Here's one quote from the article:
Modern video cards, often featuring NVIDIA or ATI chipsets, can perform at over 160 fps on intensive games such as F.E.A.R. One single GeForce 8800 GTX has been reported to play F.E.A.R. at up to 386 fps (at a low resolution apparently).

Now, to me, 386 FPS seems pretty ridiculous, especially when you consider this point also quoted from the article:
Quote:
When vertical sync is enabled, video cards only output a maximum frame rate equal to the refresh rate of the monitor. All extra frames are dropped. When vertical sync is disabled, the video card is free to render frames as fast as it can, but the display of those rendered frames is still limited to the refresh rate of the monitor. For example, a card may render a game at 100 FPS on a monitor running 75 Hz refresh, but no more than 75 FPS can actually be displayed on screen.

So, even the 386 FPS example won't be displayed at anything higher than 75 FPS. Again, what's the point? What purpose is served by having a graphics card render literally hundreds of frames that don't even get displayed on the monitor?
That would be a hint that they can increase the resolution and detail :p 
I see what you're saying and I agree, but like someone said, it's all about minimum FPS in a game. When buying a video card I buy the best I can afford, or the best for the money I'm prepared to pay. While I don't always buy the top-of-the-range card, when new it will be able to play some of the games I play at 100+ FPS, but I'm expecting the card to last a fair while and it will be doing way lower in future games. So sometimes the amount of FPS over what's good to play now also indicates, to some degree, that the card will be good for future games.
June 19, 2007 11:17:23 AM

Quote:
Being able to stop the stopwatch at 1.0s +/- 20ms doesn't measure your reaction time. That's just building up a sense of time & rhythm.

Correct, but I think that is what people do in computer games. You use your timing to make your character jump from the ledge or trigger the explosive when they are standing on it. You only use your reflexes when you don't hear someone creeping up on you and have to spray and pray.

Quote:
I take your point on having the latest, updated frame in the monitor's buffer, though. It does seem to buy you a few milliseconds. I don't know if there are uber-twitch gamers out there who can truly resolve down to a few milliseconds of visual input and respond in kind, but I'm sure that there are more people who THINK they can. I'd like to see an unbiased study of this - links, anyone? At any rate, it seems like there are other places that can create more grief, like network lag.

True but it might mean you win quick draws 51% rather than 50% of the time. I'd also love to see a study on this. I bet Valve and id know!

You are right about other factors. That's why companies like Razer have 'ultra-polling' mice and keyboards, people buy LCD monitors with low response times, and no one who plays CS properly uses a wireless router.

Quote:
I have worked on motorcycle (motocross) suspension analysis tools, and had someone tell me that the professional rider could tell a difference of 1mm (0.04"). I told him that was rubbish, and the data we collected supported my position much better than his. I never found a rider with a "golden butt". It's hard not to look at this the same way.

Ha ha, true!
June 19, 2007 12:30:29 PM

Quote:
Superfly, you don't need to stick around if you don't want to. If there are no other reasons for extreme framerates than the two I mention below, then you are correct - there IS no further reason for this thread.

It seems as if the only reasons justifying extreme framerates are:

1. Future proofing - next-gen games will require more graphics "horsepower". To get DX10 graphics, we will be dragged into Windows Vista - whoopee. And, of course, the software will continue to lag the hardware. . .

2. Bragging rights for fanboys who don't understand that they can't see their 500 fps anyway.

Did I miss anything? There are no other reasons?

It seems strange to me that everyone is so focused on framerate, when it seems to be of limited value. Once a minimum fps threshold is reached (30-60 fps, subjective), then I'd rather see features and capabilities that improve the images. Of course, once we achieve true photorealism, where else is there to go?


One other reason I can think of is the fact that when your card can render frames at a faster rate, it's less likely to drop below that magic threshold where you can see the flicker. (You did mention the minimum threshold; I'm just looking at it from a different angle than I think you were.)
If I had a video card that ran Oblivion indoors at 500 FPS, maybe, just maybe, I wouldn't see jitter outdoors either. :p 
June 19, 2007 1:25:26 PM

There is a lot of reading to do in this thread and I don't have the time right now so this may already have been said.

The eye does not see FPS. We see what we see, and we are very good at seeing movement; changes at over 100 FPS have been picked up in scientific tests by good athletes, fighter pilots, and gamers.

I don't have links for this any more - the computer I had the article bookmarked on was lost in a house fire - but the article was from almost 3 years ago.
June 19, 2007 2:18:02 PM

Quote:
It's all about keeping your minimum frame rates above a certain point, to keep your in-game performance as fluid as possible.


That is it in a nutshell. It's not that we need 200 FPS, because our eyes won't notice a difference after a certain point. What gamers do want is a system that can play a game above a certain FPS all the time.

Different parts of a game require more power from your system. Let's say you are walking down an empty hallway in Half-Life 2. Nothing is happening, so your system doesn't have to work as hard and you get, oh, 150 FPS. Suddenly ten guys show up and start shooting you. Your FPS dips down to 70, but most likely your eyes won't notice the difference.

That is why high FPS is helpful. Like Kaotao said, it's all about keeping things above a minimum FPS.
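(Here's a toy Python sketch of that headroom argument, using invented frame times for a quiet hallway versus a firefight; the numbers are assumptions for illustration, not measurements.)

```python
# Invented frame times (in ms) for two scenes on the same hypothetical
# system. The point: a big average can hide dips during heavy action, so
# headroom is what keeps the *minimum* FPS above your comfort line.
quiet_hallway = [6.7] * 100                 # ~150 fps when nothing happens
firefight = [14.3] * 90 + [25.0] * 10       # mostly ~70 fps, some ~40 fps dips

def fps_stats(frame_times_ms):
    average = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    minimum = 1000.0 / max(frame_times_ms)
    return average, minimum

for name, times in (("hallway", quiet_hallway), ("firefight", firefight)):
    avg, low = fps_stats(times)
    print(f"{name:9s}: average {avg:5.1f} fps, minimum {low:5.1f} fps")
```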
June 19, 2007 6:05:19 PM

Quote:
Did I miss anything? There are no other reasons?


One clear reason is how the 30fps or 100 fps or whatever you're seeing is actually delivered.

If you are running at 30fps, especially in a dynamic game, each frame is not necessarily perfectly spaced out within the second, i.e. displayed every 1/30 of a second. You'd certainly notice it if 29 frames were displayed in the first 0.5 second and the last frame took the remaining 0.5 second.

This is one reason why people would also often want to have as high a frame rate as possible, because it's better to temporarily drop 'down' to 60fps from 200fps rather than from 60fps to 20fps, which would be very noticeable.
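(To illustrate the pacing point, a toy Python comparison of two captures that both report about 30 fps on average; the frame times are invented purely for illustration.)

```python
# Two one-second captures that both average ~30 fps, but only one of them
# feels smooth: the second has a half-second stall hiding inside the average.
even_pacing = [33.3] * 30                # every frame ~1/30 s apart
lumpy_pacing = [17.2] * 29 + [500.0]     # 29 quick frames, then a 0.5 s stall

for name, times in (("even", even_pacing), ("lumpy", lumpy_pacing)):
    avg_fps = 1000.0 * len(times) / sum(times)
    worst_gap_ms = max(times)
    print(f"{name:5s}: average {avg_fps:.0f} fps, longest gap {worst_gap_ms:.0f} ms")
# Both report ~30 fps, which is why minimum frame times (or gaps) matter
# more than averages.
```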
June 19, 2007 7:22:05 PM

The problem with getting 30fps is that it isn't always there. Being at 30fps means that in heavy action scenes you'll most probably get no more than 10 fps, and THAT sucks!

This is the only reason you probably will also want to get at least 60fps. The more the better.

On the other side, as soon as I'm at more than 30fps, I honestly can't tell the difference... until it goes below that level :wink:
June 19, 2007 7:36:03 PM

You also want to know that your video card isn't a piece of junk or defective.
June 19, 2007 8:03:25 PM

From my experience you also want a constant fps for better mouse response and sensitivity. Anyone who is good at video games in general realizes this. If your fps is bouncing between 25 and 60, mouse movements can become jerky due to the varying frame update speed.
June 21, 2007 6:51:22 PM

Quote:
If you are running at 30fps, especially in a dynamic game, each frame is not necessarily perfectly spaced out within the second, i.e. displayed every 1/30 of a second. You'd certainly notice it if 29 frames were displayed in the first 0.5 second and the last frame took the remaining 0.5 second.

On the other hand, consider TV and film. The US uses the NTSC standard at 30 frames per second. Europe uses PAL at 25 frames per second. Film runs at 24 frames per second. When these formats are converted in order to be broadcast in a different country or medium, I believe it is common practice to either drop or double up a frame at regular intervals, which in effect also results in unevenly spaced frames. Clearly some of this is tolerable, although I agree that if the framerate is only measured by sampling every second, your scenario of one 0.5 second frame could happen, yet we might never see it from the specs.
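(For the curious, a toy Python sketch of the classic 3:2 pulldown used for film-to-NTSC conversion, simplified to show how frames get repeated; real pulldown works on interlaced fields, so treat this as an assumption-laden illustration.)

```python
# Film (24 fps) to NTSC (~30 fps): every four film frames are stretched
# over ten video fields (five frames) by alternating 3 and 2 repeats.
def pulldown_3_2(film_frames):
    video_fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        video_fields.extend([frame] * repeats)
    return video_fields

print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -- four film frames
# become ten fields, so 24 fps maps onto ~30 fps, with some frames shown
# longer than others (the uneven spacing mentioned above).
```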
June 21, 2007 7:05:20 PM

Quote:
I don't know if there are uber-twitch gamers out there who can truly resolve down to a few milliseconds of visual input and respond in kind, but I'm sure that there are more people who THINK they can. I'd like to see an unbiased study of this.

There was a TV programme here a while ago posing the question of how racing drivers deal with high speeds. It contended that the reaction time to a visual impulse is more or less the same for every human being, namely about 200ms to my recollection (auditory impulses are processed more quickly). To prove the point they played a game with Michael Schumacher whereby the interviewer held up a ruler vertically, which he would let go at a random moment. Michael had to catch the ruler between the palms of his hands. By measuring the distance the ruler has fallen you can work out the reaction time. The upshot of the programme was that racing drivers know the track and anticipate what is going to happen several corners ahead. This is how they appear able to react quicker than others.
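(For reference, the falling-ruler trick converts drop distance into reaction time with the standard free-fall formula t = sqrt(2d/g); a tiny Python sketch, assuming g = 9.81 m/s².)

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

# The ruler falls d = 0.5 * g * t^2 during the reaction time t,
# so t = sqrt(2 * d / g).
def reaction_time_ms(drop_cm):
    drop_m = drop_cm / 100.0
    return math.sqrt(2.0 * drop_m / G) * 1000.0

for cm in (10, 15, 20):
    print(f"ruler falls {cm} cm -> reaction time ~{reaction_time_ms(cm):.0f} ms")
# A ~20 cm drop corresponds to roughly 200 ms, in line with the figure
# quoted in the programme.
```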
June 21, 2007 10:35:58 PM

Even if it takes someone 200ms to react to a visual impulse, how much of that is time for the brain to process and how much of that is the eye?

If the discrepancy between trials was low, then you know the eye is reacting very quickly and the brain and muscles are the delay. If the variation was very high, that would point to the eye only seeing at low frame rates.