VSYNC....Why? How? When?

February 2, 2013 12:50:35 AM

Hey guys,

So recently I was wondering why my 670 couldn't run Far Cry 3 and the Crysis 3 beta at frame rates much higher than 55 fps, and why they were both locked at 60 fps.

It turns out I had completely overlooked vsync! I disabled it in both games, and am now averaging 70 fps in FC3 and 60 fps in the Crysis 3 beta (where it used to drop to 40).

I do not notice screen tearing with it disabled, so I guess I have no real reason to use it...

Why does vsync cut down the fps so much? I literally had a 20+ fps increase by disabling it.

Does it lock your fps at a certain limit?

Does it try and keep up with your monitor's refresh rate?

Thanks!


February 2, 2013 12:59:30 AM

Vsync allows your game's frame rate to sync with your monitor's refresh rate, and in doing so provides more stability in game and reduces artifacting and screen flickering (though your screen can still flicker).

So yes, vsync does cap your frame rate, because it doesn't let your frame rate exceed your monitor's refresh rate. If you can get 60-70 frames that's optimal, because our eyes can't detect differences beyond 70 FPS anyway.
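In code terms, the cap works out to something like this tiny sketch (purely illustrative; `displayed_fps` is a made-up helper, not any real graphics API):

```python
# Illustrative sketch only -- the function and numbers are invented to
# show the idea, not taken from a real driver.

def displayed_fps(render_fps: float, refresh_hz: float, vsync: bool) -> float:
    """Frames per second that actually reach the screen."""
    if vsync:
        # With vsync on, the GPU waits for the next refresh,
        # so output can never exceed the refresh rate.
        return min(render_fps, refresh_hz)
    # With vsync off, every rendered frame is pushed out, but frames that
    # arrive mid-refresh show up as tearing, not extra smoothness.
    return render_fps

print(displayed_fps(90, 60, vsync=True))   # -> 60 (capped at the refresh rate)
print(displayed_fps(90, 60, vsync=False))  # -> 90 (with possible tearing)
```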
February 2, 2013 1:01:22 AM


Does it lock your fps at a certain limit?

Does it try and keep up with your monitor's refresh rate?


Yes.

Disable it if you get no screen tearing.

But you may play a different game and get tearing, so you'd need to re-enable it.

I always leave it off, as I have a 120Hz monitor.
February 2, 2013 3:18:14 AM

I have a 120Hz monitor.

Vsync will attempt to cap it at 120 fps?
February 2, 2013 3:20:18 AM

StefanSS123 said:
I have a 120Hz monitor.

Vsync will attempt to cap it at 120 fps?


Correct, given that your computer can produce 120+ FPS.
February 2, 2013 3:33:01 AM

calmstateofmind said:
Correct, given that your computer can produce 120+ FPS.


Of course, a single 670 would not be able to make 120 fps out of Far Cry 3 on High/Very High settings with 2x MSAA.

The thing that confuses me, however, is why it was locked at 60 fps when I had vsync on. It would never go past 60.1 fps (according to EVGA PRECISION X Software) unless I looked at the sky (which would increase to 75 fps). However, it would gladly go down to the 40 fps range.

When I turned vsync off, not only did my fps jump up to approx 65-70 fps, it would not dip below 55, ever.

Does vsync bring down your fps when it is enabled?
February 2, 2013 3:47:57 AM

Well I don't know what other games you're playing...and a computer still has a frame rate when idle on the desktop or browsing the web, so 120 FPS is easily achievable.

It sounds to me like your computer isn't powerful enough to run past 60 FPS with vsync on (unless you're looking at the sky). Vsync is taxing because it double-buffers each frame and has to decide which one to present to the screen, which calls for more processing power.

Try lowering your graphics settings and see what kind of frame rate you get then, with vsync enabled.

Edit: Like I said though, your eyes reach the point where they can't tell the difference in frame rate at around 70 FPS. And with a 120Hz monitor you shouldn't be getting much tearing at all anyway, so vsync shouldn't be needed.
February 2, 2013 4:12:16 AM

StefanSS123 said:
Of course, a single 670 would not be able to make 120 fps out of Far Cry 3 on High/Very High settings with 2x MSAA.

The thing that confuses me, however, is why it was locked at 60 fps when I had vsync on. It would never go past 60.1 fps (according to EVGA PRECISION X Software) unless I looked at the sky (which would increase to 75 fps). However, it would gladly go down to the 40 fps range.

When I turned vsync off, not only did my fps jump up to approx 65-70 fps, it would not dip below 55, ever.

Does vsync bring down your fps when it is enabled?

Sometimes games don't properly recognize that you have a 120hz monitor. It is also possible you have not set your monitor to refresh at 120hz. Right click the desktop, click screen resolution, go to advanced settings, and look under the monitor tab to make sure it is set to 120hz.

Also, under Nvidia control panel, under 3D settings, you can set the preferred refresh rate for games that don't automatically set it to 120hz. This feature is not available if you have 3D Vision drivers installed.

In one case, in Dirt 2, v-sync was forcing a 60Hz cap even though my monitor is 120Hz, but that is the only game I've played where that was the case.

Note: it isn't common for a game to not use 120hz by default, but it does happen in some.
February 2, 2013 4:18:27 AM

calmstateofmind said:
Vsync allows your game's frame rate to sync with your monitor's refresh rate, and in doing so provides more stability in game and reduces artifacting and screen flickering (though your screen can still flicker).

So yes, vsync does cap your frame rate, because it doesn't let your frame rate exceed your monitor's refresh rate. If you can get 60-70 frames that's optimal, because our eyes can't detect differences beyond 70 FPS anyway.


Mind finding an article that says that? That is a myth that has spread through the internet, but all studies show that our eyes don't work that way. We don't see in FPS, we recognize change. The more change there is, the more we recognize it. The only truth to what you say would be that most people have monitors which have 60hz refresh rates. That means that no matter what your FPS are, the monitor only updates 60 times, so people won't see a difference other than tearing, but with a 120hz monitor, there is a difference, though it is less and less obvious as you go up in FPS.

There are also latency issues that change as your FPS go up, which for me are more obvious, as I get nauseous the lower my FPS go. At 30 I get sick within a few minutes, at 60 FPS I can last 30-60 minutes, and at 80+ FPS I can go all day without nausea.
February 2, 2013 4:29:00 AM

To answer the OP's question. The reason to use v-sync is to remove screen tearing. For many, when your FPS are below your refresh rate, screen tearing isn't as obvious, but it still happens. I personally notice it quite obviously, but there are people here that don't notice it. When your FPS are higher than your refresh rate, you get more tearing as you go higher and higher, making it painfully obvious to most.

V-sync also prevents your GPU(s) from working harder than they need to, when you are getting higher FPS than your monitor can display.

What v-sync does is restrict the GPU from accessing the frame buffer while the screen is being updated, which happens at the rate of your monitor's refresh rate. 60Hz monitors do this 60 times per second; 120Hz monitors do it 120 times a second. This prevents the frame buffer from changing during the update to the screen, which would result in partial images being displayed.

The reason you lose FPS with v-sync on is that the monitor will force your GPU to wait occasionally. V-sync also prevents you from having FPS higher than your monitor can display, so you can never go higher than that. You may also find that the further away from an even divisor of your refresh rate you are, the more often your GPU has to wait, reducing FPS further.
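That waiting can be sketched with a toy model of classic double-buffered vsync (an idealized assumption; `vsync_fps` is a made-up helper, and real drivers with triple buffering or adaptive vsync behave differently, which is one reason games can still report in-between rates like 40 FPS):

```python
import math

# Toy model: with double-buffered vsync, a finished frame waits for the
# next refresh, so the effective rate snaps to refresh_hz / n for whole
# numbers n. This is a sketch of the principle, not real driver behavior.

def vsync_fps(render_fps: float, refresh_hz: float) -> float:
    frame_time = 1.0 / render_fps        # seconds the GPU needs per frame
    refresh_interval = 1.0 / refresh_hz  # seconds between screen refreshes
    # Each frame is shown on the first refresh after it finishes rendering;
    # the small epsilon keeps exact multiples from rounding up.
    waits = math.ceil(frame_time / refresh_interval - 1e-9)
    return refresh_hz / waits

print(vsync_fps(90, 60))  # -> 60.0: faster than the monitor, so capped
print(vsync_fps(55, 60))  # -> 30.0: just misses a refresh, so every frame waits a full extra cycle
```

This is why a card that can "almost" hit the refresh rate can lose far more than a few FPS with vsync on.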
February 2, 2013 4:50:06 AM

It's not a myth, and it has nothing to do with optics. I learned about this in a neuroscience course I took a few years ago. Our brain processes some 400-500 billion bits of information every second, but we're only aware of a few thousand (our conscious). Due to the way we humans are built, survival has taught us to focus our attention on specific instances, usually those that are active in the world, as those are ones that either pose a threat to us (tiger jumping at us/snake slithering on the ground), or provide us with reward (spotting a fish in the river/finding brightly colored easter eggs around the yard). But in return, we ignore patterns that are constant and not interesting to us, and those patterns are ignored by our conscious.

Because the computer renders a given amount of information at a given rate, and our minds are continuously watching this process occur, we aren't processing this through consistency; instead we look for inconsistency (lag). Our eyes and mind are still processing the world outside the monitor at a fixed rate, and it's the difference, not only from frame to frame within the monitor but between the monitor and the real world, that causes us to notice a difference in frame rates. And like I said, at around 70 FPS, our conscious self can't process information fast enough to notice a difference between the monitor's information stream and our real-world information stream.

This is the same as watching a still pond and not being able to notice that the water, though it looks still, is constantly moving, and it's not until someone throws a pebble or the wind blows that you see the water moving.

The restraint comes from our mind, not our eyes.
February 2, 2013 5:09:45 AM

bystander said:
Mind finding an article that says that? That is a myth that has spread through the internet, but all studies show that our eyes don't work that way.


http://www.germaninnovation.org/shared/content/document...

Briefly touches on how we process visual information.

http://www.eurekalert.org/pub_releases/2006-07/uops-prc...

Defines an approximate amount of information the retina sends to the brain; some 10 million bits per second. And even of that 10 million bits, we're probably only conscious of 20-40% at any given time (closer to 50% when actively playing a video game like CoD).

720/1080p consists of billions of bits of information per second, yet we're only processing about 5-6 million at best. See where I'm coming from now?
February 2, 2013 2:15:38 PM

calmstateofmind said:
http://www.germaninnovation.org/shared/content/document...

Briefly touches on how we process visual information.

http://www.eurekalert.org/pub_releases/2006-07/uops-prc...

Defines an approximate amount of information the retina sends to the brain; some 10 million bits per second. And even of that 10 million bits, we're probably only conscious of 20-40% at any given time (closer to 50% when actively playing a video game like CoD).

720/1080p consists of billions of bits of information per second, yet we're only processing about 5-6 million at best. See where I'm coming from now?


But you are making an assumption that our brain sends what we see in FPS, they don't. The rods and cones in our eyes are always receiving info, and sending them as fast as they can to the brain, and they do not receive and send them all at the same time or rate. So while some will pickup colors from one frame, others pick them up from another frame.

Think about it: what happens with 4K resolution screens? Do you think we can only see less than 18 FPS because they have four times the pixels? Your theory doesn't work.
February 2, 2013 3:52:09 PM

bystander said:
But you are making an assumption that our brain sends what we see in FPS, they don't.


No, I'm not. I said that we process the inconsistency in frame rates in comparison to the rate at which we process the real world. Photons are continuously bombarding our eyes, so there's no such thing as rendering individual "frames". A computer, however, does render an entire frame before it is displayed, and by rapidly and continuously displaying images on the screen, it tries to simulate the same phenomenon that our eyes undergo, thus being on par with the rate at which our eyes take in information from the real world.

bystander said:
Think about it, what happens with the 4k resolution screens. Do you think that we can only see less 18 FPS, because they have 4 times the pixels? Your theory doesn't work.


Photons are still being diffracted off the monitor and we still process the monitor at the real world rate. Like I said...

Quote:
Our eyes and mind are still processing the world outside the monitor at a fixed rate, and it's the difference, not only between frame to frame within the monitor, but then the monitor to real world, that causes us to notice a difference in frame rates.


Meaning, if a computer is streaming information so rapidly and consistently that it blends in with the real world, we see no difference in the change of frames being presented on screen. It's only when the frame rate drops below the rate at which we take in and process the real world that we notice a difference, and thus can detect a lag in frame rates. This happens roughly at 70 FPS (and lower).

Taking it a step further, our mind has to then also consciously process the occurrence of this dip in information streaming, otherwise we won't notice it and in our reality a lag in frame rate never happened. This could apply to a small group of pixels on the screen, or the entire frame.

Our eyes themselves don't have the capability to detect minuscule differences beyond these rates. And furthermore, the nature of the retina is to focus on a specific point in space and we process that information accordingly. That doesn't mean we don't also process other information taken in by the eye (peripheral vision), but that what the retina focuses on is what we are consciously processing. We have to consciously be aware of the lag if we're to claim that it occurred, and this is obtained by redirecting the orientation of the retina to focus on the active subject. This is why when we see something out of the corner of our eye it's natural for our eyes to direct their focus (the retina) to it (and this is the 20-40% I was referring to, of which we do process 90-100% of this 20-40%).

http://webvision.med.utah.edu/book/part-i-foundations/s...

I was a physics major before I switched and graduated with a computer science and biology degree, though I do still study physics and get good debates since my roommate is a physics grad student; I'm very aware of the basic processes between physics and biology. ;) 
February 2, 2013 5:28:56 PM

Nowhere in your articles did they mention a rate at which you process FPS. You are incorrectly interpreting what was written. That is also a single study; there are numerous other studies with different data gathered and different interpretations.
February 2, 2013 5:42:48 PM

bystander said:
Nowhere in your articles did they mention a rate at which you process FPS. You are incorrectly interpreting what was written. That is also a single study; there are numerous other studies with different data gathered and different interpretations.


Like I said, I took a neuroscience course on communication biology and this is where I've gotten my information. Those articles are just random sites I've found to counter your argument and support mine. I'm not concluding anything from those articles because I'm already deeply familiar with their content, plus a lot, lot more.

You could try to learn something, as opposed to clearly showing you don't know the technicalities of what you're talking about (and I do), and making an argument based off personal opinion, a few internet articles and "internet myths"...

All the best.
February 2, 2013 5:52:22 PM

Well, if you can find anywhere, where it says that we only see up to 70 FPS, I'll listen, but there are numerous sites you can search that say otherwise, with studies to back it up.

You also must be aware that science is constantly evolving. What is written by one study 30 years ago, may no longer be valid.
February 2, 2013 5:59:32 PM

bystander said:
Well, if you can find anywhere, where it says that we only see up to 70 FPS, I'll listen, but there are numerous sites you can search that say otherwise, with studies to back it up.


Why do you need a bunch of articles? Do you not have a pair of your own eyes? Try it yourself...with your own eyes, and your own computer screen, and see if you can notice differences beyond ~70 FPS. If you aren't able to consciously process it, you can't detect it.

"Tell me and I'll remember. Show me and I'll know. Involve me and I'll understand." - Chinese proverb

Edit: The course material was current, we had several speakers visit and there was also a lab where we did multiple experiments and eventually tested all senses, not just vision. I took this course fall of '11. And you want to look for publications, not articles. Anyone can write an article or do a study, but publications are what really speak truth. I'll see if I can scrounge something up for you...
February 2, 2013 6:11:35 PM

calmstateofmind said:
Why do you need a bunch of articles? Do you not have a pair of your own eyes? Try it yourself...with your own eyes, and your own computer screen, and see if you can notice differences beyond ~70 FPS. If you aren't able to consciously process it, you can't detect it.

"Tell me and I'll remember. Show me and I'll know. Involve me and I'll understand." - Chinese proverb


Most people can't test this properly, which may be where you are going wrong on this. Your hardware limits you to seeing 60 FPS, unless you are one of the rare exceptions who has a monitor higher than 60hz. Your monitor will only display 60 FPS, so that is all you can see a difference up to.

My monitor is a 120hz, and there is a difference as you go up. Though after 80 FPS, the differences are rarely noticeable except when I turn my view rapidly in a 1st person game.

The problem with the topic is that no study can say how many FPS we can see. In some conditions, studies show we can notice an image flashed at us for 1/440 of a second, and we can recognize the image. You may also remember CRTs, which flashed images to the screen: people could see the flickering at 60Hz, and the flickering was noticeably lower up to 100Hz or higher. Yet take a video of slow-moving fog and you cannot tell the difference between 1Hz and 60Hz.

Here is one that gives you tests you can do by yourself to see that we do see past 70 FPS.
http://www.100fps.com/how_many_frames_can_humans_see.ht...

http://amo.net/NT/02-21-01FPS.html
February 2, 2013 6:35:24 PM

You can still notice differences with higher refresh rates because you're changing the behavior of the monitor altogether... 120Hz rendering and 60Hz rendering are completely different in how they convert the original source information for display on screen, thus producing a different flow of information (which is why 120Hz can reduce blurring, tearing and jitter - it uses more advanced techniques/technologies).

It's like comparing a curveball to a fastball. The platform is the same (space-time/monitor), the content is the same (baseball/frames), but the way in which the content gets from point A to point B isn't the same (pitcher to catcher / data to screen), and so the end result isn't the same. They're just not comparable in the context that we're talking about.

This might help: http://www.pcmag.com/article2/0,2817,2379206,00.asp
February 2, 2013 7:05:46 PM

calmstateofmind said:
You can still notice differences with higher refresh rates because you're changing the behavior of the monitor all together...120Hz rendering and 60Hz rendering are completely different in how they convert the original source information to display on screen, and thus producing a different flow pattern of information (which is why 120Hz can reduce blurring, tearing and jitter - it's more advanced and uses advanced techniques/technologies).

It's like comparing a curveball to a fastball. The platform is the same (space-time/monitor), the content is the same (baseball/frames), but the way in which the content gets from point A to point B aren't the same (pitcher to catcher/data to printed on screen) and so the end result isn't the same. They're just not comparable in the context that we're talking about.

This might help: http://www.pcmag.com/article2/0,2817,2379206,00.asp


Man, you are a wealth of misconceptions. I'm talking about monitors, not HDTVs. There is a HUGE difference.

A 60Hz monitor not only updates its image 60 times a second and no more, it accepts frames at 60Hz. That is the same with a TV. A 120Hz monitor also accepts frames at 120Hz (or 120 FPS); an HDTV does not.

A 60Hz monitor cannot display more than 60 FPS. Anything extra is either ignored or results in two partial images on the monitor. You cannot test the difference past 60 FPS, because your monitor won't let you. And PCs don't get to use HDTVs' refresh rates beyond 60Hz either.
February 2, 2013 8:30:57 PM

This is exactly the reason why I made this video!

http://www.youtube.com/watch?v=dEYf_yUvwhQ

Check it out. I compare normal vsync, adaptive vsync and no vsync at all on the very same GPU as yours. I'm showing screen tearing too. I hope you can use it. :) 
February 2, 2013 8:39:40 PM

bystander said:
Man, you are a wealth of misconceptions. I'm talking about monitors, not HDTVs. There is a HUGE difference.

A 60Hz monitor not only updates its image 60 times a second and no more, it accepts frames at 60Hz. That is the same with a TV. A 120Hz monitor also accepts frames at 120Hz (or 120 FPS); an HDTV does not.

A 60Hz monitor cannot display more than 60 FPS. Anything extra is either ignored or results in two partial images on the monitor. You cannot test the difference past 60 FPS, because your monitor won't let you. And PCs don't get to use HDTVs' refresh rates beyond 60Hz either.


Okay, so the number of frames rendered per cycle (Hz) at 120Hz differs from that at 60Hz; up to 2x more, to be accurate. Correct? So then, how exactly are those two valid subjects for testing what the eye can and cannot detect on its own? The difference between the monitors' refresh rates (60Hz vs 120Hz) and the number of frames rendered per cycle, as well as the different techniques/technologies used to render the frames, automatically results in a difference in display. And given that you have a healthy eye, this will always result in a different visual experience. There have to be independent variables to test a dependent variable... what you're trying to do, though, is test a dependent variable (the eye's performance) with other dependent variables (60Hz display, 120Hz display), and that's not valid.

The nature of what's being constantly displayed, both at 60Hz and 120Hz, depends on the way in which the monitor renders the information and prints it to screen. You yourself just stated that 60Hz and 120Hz differ in this aspect, with 120Hz rendering up to twice the amount of frames per cycle than 60Hz (and I'm sure there's a lot more to it than just doubling frames). Thus, the visual display of a monitor is directly dependent upon the number of frames rendered per second with respect to the monitor's refresh rate and techniques used. You see?

And so, the ability of the eye to detect differences in the visual display of the monitor is based on the monitor's continuous behavior, which in turn is based on the refresh rate, frames rendered per cycle, techniques, etc. If, then, you have multiple dependent variables (the way a screen looks at 60Hz, and then at 120Hz), you aren't testing the relationship between the dependent variable (which should be the eyes) and independent variables (the monitor's display) anymore, but instead are testing the performance of one independent variable (60Hz rendering) against another independent variable (120Hz rendering).

Due to the fact that both are independent variables, it's a given that they're going to behave differently, and the ability of the eye to detect differences is completely made irrelevant, and as a result the eye's performance can't be tested.

You have to set up an experiment with proper parameters if valid conclusions are to be made. And you also just supported my argument, so...
February 2, 2013 8:52:10 PM

A 120hz monitor (not HDTV) does not display information differently than a 60hz monitor. The only difference between the two is the rate at which they display the information.

If you can see a difference between 60hz and a 120hz monitor and FPS to match those hz, with v-sync on, you can see the difference between 60 FPS and 120 FPS.

Again, the only difference between 60hz monitor and 120hz monitors is how many FPS they display. The GPU also renders frames, so if the GPU isn't keeping up, the same frame gets repeated.

And if you read the article about HDTVs, the only difference between the two is that they create and insert additional frames. So if you can tell a difference, you do pick up on the additional frames. Of course that does not work with PCs, as HDTVs turn that feature off when hooked up to a PC.

EDIT: one confusing issue here is FPS is most often referred to how many frames the GPU creates, not how many are displayed. So for most users, 60 FPS is all their hardware CAN display. That isn't saying we could not see more, only the hardware does not display more. If you have the hardware to display more, you can see a difference. The OP has such a monitor, so his hardware can display those extra frames.
February 2, 2013 9:00:59 PM

bystander said:
The only difference between the two is the rate at which they display the information.


That's still a difference... you would want to keep the refresh rate the same (120Hz in this case, since 60Hz can't accommodate higher FPS) and change only the FPS, if you wanted to effectively test the eye's ability to detect differences in the monitor's display.

Did you not read what I just wrote? :pfff: 
February 2, 2013 9:05:19 PM

calmstateofmind said:
That's still a difference...did you not read what I just wrote? :pfff: 


Isn't that the difference between seeing more than 70 FPS and not? It is just the difference between how many cycles are displayed per second.

If you see that as something different, then god help me, what the hell are you talking about?
February 2, 2013 9:09:39 PM

I don't think you understand the role the monitor's refresh rate plays in displaying images.
February 2, 2013 9:40:06 PM

bystander said:
I don't think you understand the role the monitor's refresh rate plays in displaying images.


Because the refresh rate determines how many frames per cycle (per second) are rendered, no? This change in refresh rate effectively changes the fluidity of the monitor's continuous display and causes a different visual experience, no?

bystander said:
A 60hz monitor not only updates its image 60 times a second, and no more, it accepts 60hz of frames.
...
While a 120hz monitor also accepts 120hz of frames (or 120 FPS)
...
A 60hz monitor cannot display more than 60 FPS.


So in this experiment, the control variable would be the refresh rate. That must remain constant if we're attempting to test the relationship between the eye's performance with respect to the number of FPS displayed. The dependent variable is the eye, with its visual experience depending on the independent variable, the number of FPS issued from the GPU to the monitor (what we change to obtain varying results from the eye). These GPU issued frame rates would be set by altering the graphic settings, so we could test the eye's performance at 5 FPS, 15 FPS, 30 FPS, 45 FPS, 60 FPS, 80 FPS, 100 FPS, and so on (but not be influenced by the refresh rate).

It's not valid though to change the refresh rate from 60Hz to 120Hz and still make conclusions of the eye's performance in respect to the frames issued from GPU to display per second, because the refresh rate controls how many frames are displayed and that changes the nature of the display. Like you said, there could be some 300 frames given to the monitor, but if it's at 60Hz, only 60 FPS are being displayed, whereas if it were 120/240Hz+, you would get more FPS. Again, the FPS would be manipulated by the graphics settings, not the refresh rate.

Refresh rate is what needs to remain constant if the experiment between the eye's performance and the number of frames given to the display is to stay valid. And so that takes us back to the original debate: what's the threshold of FPS beyond which the eye can't detect differences when looking at a computer monitor? From what I've experienced and been taught, that number is ~70 FPS.

I hope this makes things clearer...? :??:  :) 

Edit: This brief definition of variable roles might also better help visualize how the experiment would be laid out.

http://answers.yahoo.com/question/index?qid=20070913172...
February 2, 2013 11:04:20 PM

You see, that is my point. You cannot say the human eye can't see more than 70 FPS, when testing on a 60hz monitor. It is not possible for that system to display more than 60 FPS regardless of how many frames are created.

You MUST test on a 120hz monitor, or all data is irrelevant.

And it is possible to test 60FPS on a 60hz monitor and 120FPS on a 120hz monitor and have it a fair comparison. That is a pure 60 FPS to 120 FPS comparison. Of course you can do the same on a 120hz monitor with a 60 FPS cap and v-sync on vs 120 FPS on a 120hz monitor.

There is zero difference between 60 FPS on a 120Hz monitor and 60 FPS on a 60Hz monitor. If the contents of the frame buffer are the same, nothing changes on the screen. LCD/LED screens are solid state, meaning they just turn the pixels on and they stay on.
February 2, 2013 11:11:43 PM

Though to be honest, the best test would be to play something that is 1st person, and can maintain 120 FPS, then just change the hz of your monitor from 60, 75, 85, 100, 110, and 120hz on the same monitor.

That will get you purer results. 90 FPS on a 120Hz monitor results in some frames being shown for two refreshes while others are shown for one; it is not a smooth 90 FPS.
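A toy simulation makes that uneven pacing visible (idealized constant render times assumed; `refresh_counts` is a made-up helper, not a real measurement tool):

```python
import math

# Idealized simulation: frames finish every 1/render_fps seconds but can
# only appear on 1/refresh_hz refresh boundaries, so at 90 FPS on 120 Hz
# some frames stay on screen for two refreshes and others for one.

def refresh_counts(render_fps: int, refresh_hz: int, frames: int) -> list:
    # Index of the refresh on which each frame first appears.
    first = [math.ceil(i * refresh_hz / render_fps) for i in range(frames + 1)]
    # Number of refreshes each frame stays on screen.
    return [first[i + 1] - first[i] for i in range(frames)]

print(refresh_counts(90, 120, 6))   # -> [2, 1, 1, 2, 1, 1]: uneven pacing (judder)
print(refresh_counts(120, 120, 6))  # -> [1, 1, 1, 1, 1, 1]: perfectly even
```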
February 3, 2013 12:05:00 AM

bystander said:
You see, that is my point. You cannot say the human eye can't see more than 70 FPS, when testing on a 60hz monitor. It is not possible for that system to display more than 60 FPS regardless of how many frames are created.


I'm sorry, but that's the first time you've made this claim. And that would be an accurate statement to make. If the screen only goes to 60Hz, how could the subject experience 70 FPS?

bystander said:

You MUST test on a 120hz monitor, or all data is irrelevant.


Correct, as I just stated in my last post.

bystander said:
And it is possible to test 60FPS on a 60hz monitor and 120FPS on a 120hz monitor and have it a fair comparison. That is a pure 60 FPS to 120 FPS comparison. Of course you can do the same on a 120hz monitor with a 60 FPS cap and v-sync on vs 120 FPS on a 120hz monitor.


Okay, well you just changed what you're talking about altogether. Initially you started out talking about testing the performance of the eyes (and mind) with respect to FPS, and now you're just testing refresh rate performance between two monitors... those are totally different topics.

bystander said:
There is zero difference between 60 FPS on a 120Hz monitor and 60 FPS on a 60Hz monitor. If the contents of the frame buffer are the same, nothing changes on the screen. LCD/LED screens are solid state, meaning they just turn the pixels on and they stay on.


From a scientific standpoint, one that's taken by every article and study that you so dearly cling to, the two monitors with varying refresh rates used in an experiment to test the eye's performance, and our conscious capacity, to detect and become aware of inconsistencies in monitor frame rates, would not be valid; at all.

Like I said, all the best.
February 3, 2013 12:13:31 AM

calmstateofmind said:

From a scientific standpoint, one that's taken by every article and study that you so dearly cling to, the two monitors with varying refresh rates used in an experiment to test the eye's performance, and our conscious capacity, to detect and become aware of inconsistencies in monitor frame rates, would not be valid; at all.

Like I said, all the best.


Like I said, you don't understand how LCDs/LEDs work. The articles you've posted and I've posted were from the CRT era. CRTs flash images to the screen at the rate of the refresh rate. LCDs do not. LCDs turn a pixel on and leave it on. The refresh rate is how often the monitor "can" update its pixels. On an LCD/LED monitor there is no difference between two refreshes and one in the same time period if the pixels do not change, because the pixels stay untouched for the same time. There are no flashes and no altering of the pixels; they are solid state, and do nothing until told to change.

You also fail to see that FPS, as we see it, is the monitor refreshing when the data in the frame buffer changes. The perfect display of FPS is when the FPS and Hz match, giving you even time periods between every frame. It may not be easy to test, but if you want to see what the eye can notice, you ultimately want to change the Hz to match the FPS, as it is the refreshes we see when the data changes. A data change does nothing until the refresh happens.

My initial dispute was about how the human eye can't see more than 70 FPS. You never did find anything to back up your claim. I also disputed that it is even possible to test on a 60hz monitor.