Closed

120Hz monitor?

Last response: in Graphics & Displays
November 15, 2011 11:35:04 PM

stewood said:
Will a 120Hz monitor hurt gaming performance? I found a really good deal on this monitor:
http://www.tigerdirect.com/applications/searchtools/ite...
Should I get this, or is there a better $300 120Hz monitor?

Also, how much better does a 120Hz monitor look? I've heard it looks MUCH smoother, but there seem to be skeptics of this notion.
November 15, 2011 11:49:44 PM

120Hz is always better than 60Hz or thereabouts, and it's even more important for 3D.
Specs you should look for when choosing a monitor:
120Hz refresh rate
2ms response time
LED backlight
1920x1080 resolution (at minimum, for any monitor size)
November 15, 2011 11:50:46 PM

lol no, a 120Hz monitor is ideal for gaming. On a normal 60Hz monitor, only 60 frames per second can be displayed; if you render more than 60fps at any point, the extra frames are never shown. So if you normally get over 60fps in games, a 120Hz monitor will look much smoother. Response time is just as important, and that one has 2ms, so it's a fantastic gaming monitor. I'm not a big fan of Acer, but it's a decent monitor.
November 15, 2011 11:57:28 PM

Most monitors run between 60Hz and 75Hz. 120Hz is typically used for 3D gaming, but if you're not into that, then it's really just smoothness that you'll gain. The simplest way to see the difference is to move your mouse cursor or a browser window around; you'll see much clearer, smoother motion.

The in-game benefit you'll see depends heavily on what kind of frame rates you're getting. 120Hz indicates how often a monitor refreshes the image, a sort of fps for your monitor. If you're looking to play modern games at high settings, you almost certainly won't sustain frame rates high enough to see the smoothness of 120Hz, rendering it useless. Here are some quick examples to illustrate the point.

Monitor: 120Hz
In-game FPS: 50fps
Result: The game will look the same as it would on a 60Hz monitor.

Monitor: 60Hz
In-game FPS: 100fps
Result: The game will look like it's running at 60fps.

Monitor: 120Hz
In-game FPS: 120fps
Result: The game will look really smooth; you'll finally take advantage of the 120Hz monitor.
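The examples above reduce to a simple rule: the number of distinct new images you actually see per second is capped by both the refresh rate and the game's frame rate. A minimal sketch (my own illustration, ignoring tearing and vsync details):

```python
# Sketch (not from the thread): distinct new images per second are capped
# by BOTH the monitor's refresh rate and the game's frame rate.

def perceived_updates(monitor_hz: int, game_fps: int) -> int:
    """Distinct new images shown per second, ignoring tearing/vsync details."""
    return min(monitor_hz, game_fps)

examples = [
    (120, 50),   # 120Hz monitor at 50fps -> 50, same as it would look on 60Hz
    (60, 100),   # 60Hz monitor at 100fps -> capped at 60
    (120, 120),  # 120Hz monitor at 120fps -> the full benefit
]
for hz, fps in examples:
    print(f"{hz}Hz monitor, {fps}fps game -> ~{perceived_updates(hz, fps)} updates/s")
```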

People will argue that the human eye cannot perceive anything greater than 30fps, but if you do the simple test I mentioned above, the difference is very real.

Bottom line: 120Hz monitors are geared towards fast, twitch-based FPS games like Counter-Strike or Unreal Tournament; hardcore, competitive gamers prefer 120Hz. If you want to play modern games at high/ultra settings, you're better off getting a bigger monitor with a lower refresh rate for the same price.
November 16, 2011 12:40:37 AM

^Very debatable. Movies can run at 24fps and you won't notice, because of steady camera movement and motion blur.

In gaming, I can notice a difference up to around 40-50fps, but after that it gets butter smooth. The thing is, I think the only time you'll notice 120Hz is when you turn and move around rapidly, so that the screen changes entirely. That's where you may notice it, but in terms of *smoothness*, it's not much of a jump, if any, I would say.

I could be wrong though, just my observation.
November 16, 2011 12:51:10 AM

patapat said:
Most monitors run between 60hz-75hz. 120hz is typically used for 3D gaming [...]

It's actually 60 frames a second that the human eye can perceive, not 30.
November 16, 2011 2:36:08 AM

On Newegg it's $40 more, so that could be $40 back in your pocket. Newegg offers free shipping and, depending on your location, no tax. I don't know about Tiger Direct's shipping or tax, so you should check whether you are really saving.

I don't like the stand, but the monitor seems right for the price.
November 16, 2011 3:03:39 AM

I have a 27-inch 120Hz LED 3D monitor; I personally got it for the size and the 120Hz (3D is just lame). Since switching to it, and of course selecting 120Hz in games, gaming has been more enjoyable with less strain on the eyes, not to mention that tracking on-screen movement is a lot easier. :)

I purchased a Samsung 27A950 and love playing games on it at 120Hz.
November 16, 2011 3:38:02 AM

I have that monitor, it's really nice for 3D gaming.

As for the whole 50-60fps vs. 60fps "gaming" argument: it's just BS. The human eye receives a continuous stream of light and uses a chemical sensor to translate it into electrical signals for your brain. It's the brain, not the eyes, that does the processing and determines what you do and don't see.

Since the eye is constantly streaming electrical signals to the brain, you could say the eye "sees" at 1000fps, or whatever number you want. It's in your brain that the limit on individual frames exists, and it's MUCH lower than what your eye is chemically capable of sensing. Your brain can't process more than 20-24 fully distinct images per second; anything higher and the images start to blur together. The brain is designed to look for differences in light patterns, not strictly the patterns themselves, so high-contrast frames are noticeable where low-contrast frames are not. If you flashed a queen of hearts at 60fps and on the 49th frame slipped in a jack of spades, very few humans (virtually none) would be able to tell you what was flashed, only that ~something~ was different in the picture. Now take a 60fps video of the same queen, but this time moving around, and flash the same jack of spades on the 49th frame: an even smaller portion of the population will know anything was different. Even though there are 60 frames, 59 of them are identical, so what you really have is 2fps: one frame lasting 983.4ms and another lasting 16.6ms. Once things start moving, the frames blend together and the brain can't keep up with such a small difference. This gets even more evident if, instead of high-contrast white/black or red/black, you move to low-contrast forest green/lime green: the difference becomes imperceptible to the human brain and the change doesn't even register.

So in the end, the brain can easily detect high-contrast changes, or large changes (fast motion) in light patterns, at speeds in excess of 50-60 per second (a 16.6ms refresh). That same brain won't be able to say what the difference was, only that there ~was~ a difference. Even at 42ms the brain still can't tell exactly what changed, only that there are changes. Move to subtle changes and the brain has an even harder time telling that anything changed at all.

Conclusion: the brain doesn't use "FPS" as a metric; different light patterns are perceived differently and at different "speeds". In the context of monitors, a 60Hz display gives you a frame time of 16.6ms, and a 100Hz monitor gives you 10ms. To put that in perspective, 16.6ms is about 0.017 of a second, and 10ms is 10/1000, or 0.01 of a second. No human reflexes on the planet are that fast. It takes longer for the signal to go from the server to your PC, through your NIC, CPU, and software, then to your eyes, through your brain and central reasoning system, than the difference between those frame times.

There is ~zero~ competitive advantage in 100Hz vs. 60Hz; anyone who tells you otherwise is just blowing smoke. Simply put, YOU, the human, and your reflexes are the bottleneck in gaming performance, not the screen's refresh rate.
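The frame-time arithmetic in the post above can be checked in a few lines. This is my own sketch; the ~200ms visual reaction time is an assumed ballpark figure, not a number stated in this thread:

```python
# Frame time is 1/refresh_rate: 60Hz -> ~16.7ms per frame, 120Hz -> ~8.3ms.
# The 200ms human reaction time below is an assumed ballpark for comparison.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 100, 120):
    print(f"{hz}Hz -> {frame_time_ms(hz):.1f}ms per frame")

reaction_ms = 200.0  # assumed ballpark for visual reaction time
gap_ms = frame_time_ms(60) - frame_time_ms(120)
print(f"60Hz vs 120Hz frame-time gap: {gap_ms:.1f}ms, vs ~{reaction_ms:.0f}ms reaction time")
```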
November 16, 2011 7:26:20 PM

digitalexplosives said:
I have a 120hz 27 inch LED 3D monitor, i personally got it for the size and 120hz (3D is just lame) [...]


Your eye strain improvement is just psychological. I can assure you it has not changed: if you went from one LCD monitor to another LCD monitor, the eye strain is the same; you just think it's better.

On LCD monitors, the refresh rate doesn't mean the image is redrawn on screen 60 times a second (in the case of 60Hz). Back in the days of CRT monitors, the electron gun would redraw the screen 60 times a second, which meant the screen literally went to black and back many times; if that rate was low, it caused huge eyestrain.

LCD monitors, however, display a constant image: crystals in front of a constantly lit backlight. The refresh rate is just how many rendered frames the monitor can display per second from your GPU. The LCD screen doesn't flash black and white; it just changes to the next image. The speed at which the crystals change to produce a different color for each pixel is called response time. But the screen does not flash, so if you had eye strain, it had nothing to do with refresh rate. It could have had something to do with contrast or brightness, but the increase in refresh rate is simply not relevant to your eye strain.

I do partially agree, however, that 120Hz may be a bit smoother, because in FPS games where the image constantly changes, 60fps may not be enough to capture everything, which you may notice.

palladin9479 said:
I have that monitor, it's really nice for 3D gaming. [...]


^Agreed, you get zero advantage, but 60 vs. 100fps might be noticeable in an extremely fast-changing environment. 60fps is usually about as far as you can see, though it's debatable whether you'd notice more. If you put two screens side by side and ran a test, the possibility that somebody would notice less visual information being lost at 100fps may not be as unrealistic as you might think.

However, as you said, 100Hz gives you zero competitive advantage or noticeable difference while playing.
November 16, 2011 7:46:31 PM

While gaming there is a big difference between playing on a 60Hz monitor vs. a 120Hz monitor, at least for me. I can't make logical sense of it, but I know that when I went from a 90Hz CRT to a 60Hz LCD there was severe image tearing that was not present with the old CRT. Some games were better than others, and games with motion blur hid the tearing well. For example, Counter-Strike 1.6 was terrible on a 60Hz monitor; it was almost unplayable, the tearing was so bad. In other games where I got 100+ fps, such as Quake Live, it was not very noticeable.

It really depends on the person, too; hardcore gamers will notice these subtle things more than casual ones. I guess I am more sensitive than most to things like this, because when I saw a plasma TV for the first time I noticed a subtle flickering at all times, like an old 60Hz CRT would flicker. I shouldn't notice this, since plasma runs at a 600Hz refresh rate, so it's strange. Also, when you compare a 120Hz TV to a 60Hz one you can definitely notice the smoothness of the 120Hz; comparing a 120Hz TV to a 240Hz one, it's very hard to tell any difference at all.

Anyway, those are just my experiences. I won't bother trying to back it up scientifically, but I promise I can notice a difference between the two, especially while playing a fast-paced first-person shooter.
November 16, 2011 9:10:41 PM

I have a 120hz monitor. When I first started using the 120hz monitor, the first thing I noticed was the image was more crisp. This is most likely a result of a higher quality monitor in comparison to my old one.

Then I went into gaming. I didn't notice a large difference. It was definitely smoother at times, especially when moving fast, but I was trying to tell if it was just myself wanting it to seem smoother or not.

Later on, after gaming with it a while (and in 3D), I played a game that was giving me about 60 FPS average and all of a sudden it hit me like a truck. The game just didn't feel fluid anymore. I lowered the graphical settings and it jumped up to 80-90 FPS and all of a sudden it felt a lot better.

One of the reasons it felt so unsettling at the lower FPS is partly that it wasn't as smooth, and partly a little extra latency that wasn't present at higher FPS. I experience "simulator sickness" severely below 40 FPS, and it appears the higher I go, the less I experience it. Simulator sickness is known for causing motion-sickness-type symptoms; for me that means nausea, while others experience eye strain.

While the difference may not be huge when you first start using 120hz, it is pretty big when you go back to 60hz.
November 18, 2011 12:21:50 AM

gmcizzle said:
While gaming there is a big difference between playing with a 60hz monitor vs. a 120hz monitor, at least for me it is. [...]



It's self-reinforcement, otherwise known as the placebo effect.

There might be a difference in the quality/contrast ratios of the screens involved, but your gaming skill does not go up from a 60 to a 75+ refresh rate. As I've stated before, we're talking about the difference between roughly 0.017 and 0.01 seconds.

The only difference would be in extremely fast-moving, high-contrast images, and even then it's not something you'd consciously discern.
November 18, 2011 12:23:57 AM

If I saw a 60Hz and a 120Hz monitor side by side, I could tell which is which instantly, I guarantee you.
November 18, 2011 12:29:10 AM

gmcizzle said:
If I saw two monitors side-by-side of 60hz and 120hz I could tell which is which instantly, I guarantee you.


If I saw a Lamborghini and a Honda side by side I could tell which is which instantly, I guarantee you.

120hz monitors tend to be significantly higher quality with more vibrant colors and sharper contrasts. You would get the same result by using a high end 60hz monitor.

For example, I could take two of those Acers and put them in front of you, running the same demo, one at 60Hz and the other at 120Hz, and you would be incapable of discerning which is which unless I told you.

http://en.wikipedia.org/wiki/Confirmation_bias
November 18, 2011 12:38:10 AM

http://en.wikipedia.org/wiki/Screen_tearing

This is the biggest problem I have with 60hz monitors for gaming. Screen/image tearing is very real and it's very easy, again at least for me, to notice it.
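Tearing happens when a new frame is swapped in while the monitor is partway through scanning out the previous one, so the top and bottom of the screen show different frames. A toy model of where the tear line lands (my own sketch, not from the linked article):

```python
# Toy model: a tear appears at the fraction of the screen already scanned
# out at the moment the new frame was swapped in.

def tear_line_fraction(swap_time_ms: float, refresh_hz: float) -> float:
    """Fraction of screen height (0..1) at which a tear appears, given the
    time since the start of the current refresh cycle at which the swap
    happened."""
    scanout_ms = 1000.0 / refresh_hz          # one full refresh cycle
    return (swap_time_ms % scanout_ms) / scanout_ms

# A frame swapped 10ms into a 60Hz scanout (16.7ms cycle) tears 60% down.
print(f"{tear_line_fraction(10.0, 60):.2f}")
```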
November 18, 2011 3:31:14 AM

gmcizzle said:
http://en.wikipedia.org/wiki/Screen_tearing

This is the biggest problem I have with 60hz monitors for gaming. Screen/image tearing is very real and it's very easy, again at least for me, to notice it.



Umm, they have this new technology... you know... where it syncs frame renders with your screen's refresh cycle... what's its name... oh, V-Sync. Yeah.
November 18, 2011 3:45:13 AM

Vsync works for casual games, but for fast, hardcore, competitive FPSes vsync is a no-go, even with triple buffering, as it adds lag. That's why the hardcore gamers go for 120Hz monitors, and 60Hz works for everyone else.
November 20, 2011 11:09:13 PM

gmcizzle said:
Vsync works for casual games, for fast hardcore competitve FPSes vsync is a no-go, even with triple buffering, as it adds lag. That's why the hardcore gamers go for the 120hz monitors, and the 60hz works for everyone else.



Umm no. Just no.

All VSync does is sync framebuffer renders to the screen's refreshes; your video card outputs as fast as it can either way. You could have 1000 "FPS", but you're always limited by your screen's physical drawing speed. So while you think V-Sync is "lowering" your FPS, it really isn't: those additional renders were just overwriting each other in memory. And as has been stated numerous times, you are not a cyborg; your brain will not notice the difference between 0.016s and 0.008s frame times. You will not see the bad guy faster, you will not have faster reflexes, and it will not give you a competitive advantage. There will be absolutely zero difference between 60Hz and 120Hz on the same screen. The slowest component of the HMI is not the screen but you, the human being. Making your monitor draw faster will not make you think faster.

Seriously... 1/60 vs. 1/120 is a ridiculously small difference.
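For reference, classic double-buffered vsync can be sketched numerically. This is my own model, not a claim by either poster: completed frames are presented only on refresh boundaries, so when the renderer can't finish within one refresh interval, the displayed rate snaps down to an integer fraction of the refresh rate, which is where the disputed stutter and lag come from.

```python
import math

# Sketch of strict double-buffered vsync (an assumption for illustration):
# frames are presented only on refresh boundaries.

def vsync_display_fps(render_fps: float, refresh_hz: float) -> float:
    """Frames per second actually shown on screen with strict vsync."""
    if render_fps >= refresh_hz:
        return refresh_hz  # extra renders never reach the screen
    cycles = math.ceil(refresh_hz / render_fps)  # refresh cycles per frame
    return refresh_hz / cycles

print(vsync_display_fps(1000, 60))  # capped at 60.0
print(vsync_display_fps(50, 60))    # snaps down to 30.0
```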
November 20, 2011 11:40:47 PM

palladin9479 said:
Umm no. Just no. [...]

Right, but what I'm saying is that syncing with vsync introduces lag between mouse movement and frame rendering.
November 21, 2011 5:53:37 AM

gmcizzle said:
Right, but what I'm saying is that the syncing using vsync introduces lag between mouse movement and frame rendering.


How in the hell would that happen? Seriously, mouse input has ZERO to do with screen rendering; the CPU processes the movement at the same speed regardless of your refresh rate.

You're repeating all these myths; you might as well bring in the rabbit's foot, throw salt over your shoulder, and offer pagan sacrifices over your SCSI chain.

Simple math: 1/60 = 0.0166, or one frame per 0.0166 seconds; 1/120 = 0.0083, or one frame per 0.0083 seconds. Your brain cannot tell the difference between normal light patterns at that speed. You need extremely high contrast, black and bright white in sequential frames, for your brain to know something is different.

So unless you're an alien or a human cyborg, none of the above makes any difference at all.
November 21, 2011 12:10:54 PM

I tested this out myself before posting. I turned VSync on and enabled triple buffering and there was definitely noticeable input lag. I tried it on CoD BO where I normally get 90fps constant. Try it for yourself if you don't believe me.
November 21, 2011 11:22:55 PM

gmcizzle said:
I tested this out myself before posting. I turned VSync on and enabled triple buffering and there was definitely noticeable input lag. I tried it on CoD BO where I normally get 90fps constant. Try it for yourself if you don't believe me.


Umm... ok, not even touching this one. You're under the placebo effect.
November 22, 2011 6:35:32 AM

I think some of you might have misunderstood. We all know 120Hz is needed for 3D gaming, for the shits and giggles, so let's skip that.

You might say the human eye can only see 60Hz, but there is a huge difference between 60Hz and 100Hz that you will definitely feel. If you are not that serious about FPS games, 35+ fps is good enough for you, which means an OK video card and a 60Hz monitor.

If you are serious about FPS gaming, then buy a top-of-the-line video card and tweak your settings until you get 100+ fps constantly; here's where your 120Hz LCD comes in.

When you swipe your mouse quickly across the screen as you turn fast in game, your eyes will be able to pick up more than on a 60Hz LCD, meaning you will spot people a bit more clearly. That's if you are really that good and have the reflexes. I had a Samsung 226BW, then upgraded to an Acer GD235. Sure, an LCD is an LCD, and it's only slightly better, but when you are at that level of gaming, every little millisecond counts. So to me, 120Hz with 100+ fps in game is a huge difference; to some others, maybe not as much.

I find BF3 unplayable at 60fps; with a GTX 590 and some lowered settings I can get 100fps minimum, since the graphics don't matter to me. But some people think running it at 30fps is pretty good, so I guess I can tell you there is a difference. Is it worth the money? That's really up to you.
November 22, 2011 7:02:49 AM

Sorry, forgot to mention one thing: the Acer 120Hz LCD is really nice and I love it, but that was almost a year ago when I bought it, and at that time what it lacked was size. With Nvidia 3D Vision 2, however, there are now bigger 120Hz LCDs, like 27-inch models. I don't know if that size is good for gaming or not, since I've never had one.
November 22, 2011 7:21:44 AM

ibanezrg2550 said:
i think some of you might have misunderstood [...]



Already been disproved.

"60Hz" is one frame per ~0.0166 seconds; "120Hz" is one frame per ~0.00833 seconds. Your brain doesn't see things in frames; it analyzes light patterns and does object recognition. You're not a cyborg; your brain simply isn't capable of doing that faster than 20 to 30 times per second. Any sequence of still light patterns faster than about 20 per second is perceived as motion, and at 50 per second you're at fluid motion; anything faster won't change anything. Thus 50/60/120/1,000/10,000 images per second are all perceived the same by the human brain. Not only that, but human reaction time is such that it takes you much longer than 0.0166 seconds to do the object recognition needed to even know what you're looking at.

The whole "100Hz for gamers!!!" thing is just a myth that got spread around the internet. People noticed their shiny new 120Hz monitors looked cleaner and better than their old, dirty 60Hz monitors. The fact that 120Hz screens are usually better quality than 60Hz ones (other than professional displays) never crossed their minds. I could take two Acer 120Hz screens and put them in front of you, one at 60Hz and one at 120Hz, both running the same game/demo/video, and you would be unable to tell them apart.
November 22, 2011 9:09:22 AM

A friend has this monitor and really likes it. I have read it has pretty poor input lag compared to other, more expensive 120Hz monitors, and about equivalent input lag to a decent 60Hz one. Unless your framerate is usually above 80 or so, I don't think it would be worthwhile.

By the way, it is easy to tell when my games slow down from the 60fps limit to 50fps, so it's total BS that you cannot tell a difference above 50. Just because your brain perceives motion does not mean all motion is perceived the same. You are just wrong, and small amounts of input lag are noticeable.
November 22, 2011 1:46:01 PM

palladin9479 said:
Already been disproved. [...]


Read this: http://www.100fps.com/how_many_frames_can_humans_see.ht...

The eye certainly sees beyond 20-30 FPS.

The brain certainly feels a difference as well. How else can you explain why games below 50 FPS give me motion sickness and games above 60 FPS do not?
Score
0
November 22, 2011 2:59:23 PM

palladin9479 said:
Already been disproved.

"60Hz" is one frame per 0.0166~ seconds; "120Hz" is one frame per 0.00833~ seconds. Your human brain doesn't see things in frames; it analyzes light patterns and does object recognition. You're not a cyborg; your brain simply isn't capable of doing that faster than 20 to 30 times per second. Thus any sequence of still light patterns shown faster than 20-ish times per second is perceived as motion. At 50 times per second you're at fluid motion, and anything faster won't change anything. Thus 50/60/120/1,000/10,000 images per second are perceived the same by the human brain. Not only that, but human reaction time is such that it takes you much longer than 0.0166 seconds to do object recognition and even know what you're looking at.

The whole "100hz for gamers!!!" thing is just a myth that started getting spread around the internet. People noticed their shiny new 120Hz monitors looked cleaner and better than their old dirty 60Hz monitors. The fact that 120Hz screens are usually better quality than 60Hz ones (other than professional displays) never crossed their minds. I could take two Acer 120Hz screens and put them in front of you, one at 60Hz and one at 120Hz, both running the same game / demo / video, and you would be unable to tell them apart.



If that's what you think, I wonder if the people who did those experiments really ran them on a computer, with a computer game, and with a pro gamer able to distinguish the difference between 60 and 120Hz. If I'm playing BF3 at 100+ FPS, I can run and jump around as if I'm playing CS 1.6, even with the much more detailed map; spotting enemies in a given direction doesn't take much more than half a second, because all the images coming into my brain are crisp and clear. At 60Hz I feel stranded, a feeling of unease, as it takes longer to spot people, and during a game one second is already way too long. But like I said, the difference is subtle.
You said to put a 120Hz and a 60Hz beside each other and compare. It's true that your eyes might not notice the difference if you're just standing there watching; the way the comparison should be done is to give a person who's been using 120Hz for years a 60Hz LCD and ask him for the difference.
So just because someone else's brain can't feel/see the difference doesn't mean no one can. Humans are all different; if you do something for a long time, you get better at it, and your body and mind will adapt to it...

http://www.youtube.com/watch?v=l57UtSplDRU
This was done at around 50-60 FPS because of Fraps, and I've been looking for a recording solution that doesn't cost FPS, because when I have 100+ FPS I can do way better.

Can you hear the beating when an E string and an A string are slightly out of tune on a guitar? To a musician, yes; to someone else, probably not.
The best example would be watching someone play live and adjust the tuning knob of a string mid-song because it was very slightly out of tune, while to most of the audience it makes no difference at all...
Score
0
a c 216 U Graphics card
a c 128 C Monitor
a b 4 Gaming
November 22, 2011 3:38:50 PM

You did make a good point about change and how our brain and body get used to different frame rates.

For example, recently I was playing Dragon Age 2 at 90+ FPS on a 120Hz monitor. All of a sudden, I noticed everything looked a little bit choppy, as if it were a (mild) slide show. I looked at my monitor's info and noticed it had dropped out of 120Hz and was at 60Hz with 60 FPS.

The difference between the two was very apparent. Another odd thing was 3D Vision vs. normal gaming. After using 3D Vision for a few months, when I played games in normal mode that I had previously played in 3D Vision, my eyes felt strained. I realized that anytime I looked at an object at a different depth, my eyes would refocus, because in 3D Vision objects at different depths require you to refocus. My mind got used to 3D Vision, and changing back forced it to readjust.

I know when I first started using 120hz, I didn't notice much of a difference. It just kind of felt smoother, but nothing major. However, going from 120hz down to 60hz is very different. I notice huge differences in smoothness.

One last thing. A few years ago, I used to game at 30-40 FPS, mostly because my machine could only manage that many. I always felt motion sickness when I played, but as I got more used to a game, it would lessen.

Later, after having put together a better machine, I found that higher FPS reduced the effects of motion sickness. It also continues to lessen as I go beyond 60 FPS.

After doing some reading, one of the probable causes of this motion sickness, otherwise known as simulator sickness, is input lag. As your FPS goes higher, input lag is generally reduced. I don't know if it's the increased FPS or the reduced input lag, but the difference between 60 and 90 FPS, on a 120hz monitor at least, is very noticeable when it comes to simulator sickness (others feel eye strain; I usually feel nausea).
Score
0
November 22, 2011 7:41:18 PM

Ahhh, so many posts above; sorry, I didn't read 'em...

I just bought a 120 hz monitor and here is my honest opinion:


120 HZ...
People will say various things about what the human eye can perceive. I notice increased smoothness up to about 80 FPS, and I will sometimes turn my settings down from Ultra if it means getting an average FPS of 80+.

Anything below 40 FPS is EXTREMELY noticeable, and uncomfortable for me to play.



Also for those thinking about 3D: I'm still trying to get into it, it does make me feel sick after 10-15 minutes. That being said I had a couple mind-blowing moments playing it, and hope I can adjust to it.
Score
0
November 25, 2011 6:53:21 PM

I'll tell you what: I wouldn't have spent $700 on a monitor while making $10 an hour if it wasn't a definitive improvement over 60Hz. The difference between Quake Live at 120 FPS on a 60Hz monitor and 120 FPS on a 120Hz monitor is unequivocally pronounced.
Score
0
November 26, 2011 7:56:03 AM

I'm currently reading this on a Viewsonic P225f 20" (viewable) 4:3 CRT monitor at 1024x768 @ 120Hz. I bought it used for $50 (last year) after my $1000 (in 2002) 20" CRT monitor exploded.

A long time ago I lowered my resolution to make things bigger (bad eyesight) and upped the refresh rate from 60 to 120 to reduce eye strain.

I play older games because I'm more into fun/fast gameplay over fancy graphics. I synced up my television (on my DVI out) to watch a movie once, and when I went back to play a game, even though the monitor was still at 120Hz it was only showing 60 FPS, and I could totally tell the difference.

The biggest difference was when moving, and especially turning. There was a horrible choppiness when turning fast. Mind you, this is the EXACT SAME MONITOR; just something in the vsync was misconfigured.

I then configured everything correctly so that my monitor was at 120Hz while my 720p television was at its max of 60Hz in clone mode (they are side by side), and watched while playing a game demo; the difference was totally apparent. This is not some subjective psychological thing, it's REAL.

My problem is that I've been looking for a new flat-panel monitor that can do REAL 120Hz (not TV interpolation), and I still can't figure out what's what. Also, I'm pretty sure some of the interfaces used (DVI? HDMI? DisplayPort?) can't actually support over about 72Hz even at really low resolutions.

Until then I guess I'll just cruise Craigslist for old CRTs, because 60Hz at any resolution is just unacceptable for gaming. :( 
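On the interface question above: whether a link can carry 1920x1080 at 120Hz comes down to pixel clock. Here is a rough sanity check; the 165/330 MHz figures are the nominal single/dual-link DVI limits, and the 25% blanking overhead is an approximation (real 120Hz modes use reduced-blanking timings with less overhead):

```python
def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    # Pixels per second, inflated by horizontal/vertical blanking intervals.
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

clk = required_pixel_clock_mhz(1920, 1080, 120)
print(f"~{clk:.0f} MHz needed")                    # ~311 MHz with this estimate
print("fits single-link DVI (165 MHz):", clk <= 165)
print("fits dual-link DVI (330 MHz):", clk <= 330)
```

This is why 1080p@120Hz monitors need dual-link DVI or DisplayPort: single-link DVI tops out around 60Hz at that resolution.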
Score
0
a c 216 U Graphics card
a c 128 C Monitor
a b 4 Gaming
November 26, 2011 1:57:53 PM

Guest3 said:
I'm currently reading this on a Viewsonic P225f 20" (viewable) 4:3 CRT monitor at 1024x768 @ 120Hz. I bought it used for $50 (last year) after my $1000 (in 2002) 20" CRT monitor exploded.

A long time ago I lowered my resolution to make things bigger (bad eyesight) and upped the refresh rate from 60 to 120 to reduce eye strain.

I play older games because I'm more into fun/fast gameplay over fancy graphics. I synced up my television (on my DVI out) to watch a movie once, and when I went back to play a game, even though the monitor was still at 120Hz it was only showing 60 FPS, and I could totally tell the difference.

The biggest difference was when moving, and especially turning. There was a horrible choppiness when turning fast. Mind you, this is the EXACT SAME MONITOR; just something in the vsync was misconfigured.

I then configured everything correctly so that my monitor was at 120Hz while my 720p television was at its max of 60Hz in clone mode (they are side by side), and watched while playing a game demo; the difference was totally apparent. This is not some subjective psychological thing, it's REAL.

My problem is that I've been looking for a new flat-panel monitor that can do REAL 120Hz (not TV interpolation), and I still can't figure out what's what. Also, I'm pretty sure some of the interfaces used (DVI? HDMI? DisplayPort?) can't actually support over about 72Hz even at really low resolutions.

Until then I guess I'll just cruise Craigslist for old CRTs, because 60Hz at any resolution is just unacceptable for gaming. :( 


I notice the difference too, in the same ways you do, but it is much more noticeable going back to 60Hz after getting used to 120Hz.

As far as monitors go, I don't know of any interpolated versions; that's a TV thing. Also, to drive it at 120Hz, the monitor must have either a dual-link DVI connection or a DisplayPort connection; otherwise you will be limited to 720p or lower resolutions.
Score
0
February 20, 2012 7:43:40 PM

@palladin9479

As good as some of your information sounds, it's hard to believe any of your facts when you don't seem to know about vsync, the input lag it causes, and screen tearing.

Seriously, how can you spout all the information you're spouting as fact when you don't even know about the input lag vsync causes?

Like gmcizzle said, vsync off is preferred to avoid input lag, but that can result in screen tearing; 120Hz helps hide tearing, so you can play without visible tearing AND with vsync off.

I'm not even going to touch the whole brain-recognizing-60-vs-120 debate, because I think the main advantage of 120Hz is vsync off + no tearing.
Score
0
February 20, 2012 11:53:53 PM

crateria said:
@palladin9479

As good as some of your information sounds, it's hard to believe any of your facts when you don't seem to know about vsync, the input lag it causes, and screen tearing.

Seriously, how can you spout all the information you're spouting as fact when you don't even know about the input lag vsync causes?

Like gmcizzle said, vsync off is preferred to avoid input lag, but that can result in screen tearing; 120Hz helps hide tearing, so you can play without visible tearing AND with vsync off.

I'm not even going to touch the whole brain-recognizing-60-vs-120 debate, because I think the main advantage of 120Hz is vsync off + no tearing.


Except ... mouse inputs have nothing to do with the GPU driver. What you're describing is bad programming: the game itself is not processing your mouse input until after it renders a frame. A common practice in console games is to limit input processing to happen only between screen refreshes. It sounds like programmers are porting this habit to PC games.

The biological fact remains: your human brain can't discern between light patterns at 0.016s vs. 0.0083s. There is no getting around this part.

All vsync does is lock screen draws to monitor refreshes, nothing else; this is to prevent a subsequent frame from being overlaid on a frame mid-draw.

I've only ever seen the "input lag" you're talking about in console ports or poorly programmed games. Everything else I play doesn't show it. The game is waiting on multiple frames to be processed before updating your actions, so while your Fraps display says 60, you're actually only getting ~30 FPS.

I have 2 x GTX 580 Hydros to go with my Acer 1920x1080 120Hz screen. I've tried BF3 at 120Hz vs. 60Hz and there is no difference.
Score
0
February 22, 2012 12:26:19 PM

Fact 1: With VSync ON you get mouse lag.
Fact 2: With VSync OFF you get screen tearing.

I don't know if tearing is fixed with 120Hz monitors because I haven't tried any. But the above has applied in every game I have tried for the last 12-13 years.
Score
0
a b U Graphics card
February 22, 2012 1:34:12 PM

^ Yeah, but triple buffering should mostly eliminate the input lag from vsync; it's not 1999 anymore.
Score
0
a c 216 U Graphics card
a c 128 C Monitor
a b 4 Gaming
February 22, 2012 2:18:40 PM

jjb8675309 said:
^yeah but triple buffering should mostly eliminate input lag from the vsync, its not 1999 anymore


Triple buffering adds to input lag and always has, because the image shown has been in the queue for one frame; all the tweak guides warn about it. Triple buffering has the advantage of smoothing out the FPS when your FPS is below your monitor's refresh rate. Without it, your FPS will always snap to a value that divides evenly into the refresh rate, i.e. a 60Hz monitor will run at 60, 30, 20, or 15 FPS without triple buffering.

That said, a 120Hz monitor does not eliminate tearing, although it is reduced. Also, vsync still increases input lag, but that too is reduced: the higher your FPS, the less time there is between queued frames.

EDIT: Theoretically, I'm not so sure there really is increased lag with triple buffering. Most tweak guides mention it, but if you consider the steps used to make triple buffering happen, I'm not sure why it would increase input lag any more than vsync already does.

In a triple-buffer system, you have the front buffer the monitor displays from, plus two back buffers the card alternately renders into, which get flipped to the front buffer when ready. All triple buffering adds over a normal double-buffer system is a buffer the card can start generating the next frame in, without stalling on the delay imposed by vsync.

So basically, I believe it should be noted that vsync adds input lag in part by lowering FPS, which triple buffering helps minimize.
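The "FPS snaps to refresh/N" behaviour described above is easy to sketch numerically. This is an illustration of double-buffered vsync quantization, not a model of any particular driver:

```python
import math

def vsynced_fps(render_ms, refresh_hz):
    """Effective FPS with double-buffered vsync: each frame must wait for the
    next refresh boundary, so FPS snaps to refresh_hz / N for integer N."""
    refresh_ms = 1000.0 / refresh_hz
    n = math.ceil(render_ms / refresh_ms)  # refresh intervals consumed per frame
    return refresh_hz / n

print(vsynced_fps(10, 60))   # renders within 16.7 ms      -> 60.0
print(vsynced_fps(20, 60))   # just misses one refresh     -> 30.0
print(vsynced_fps(40, 60))   # three refreshes per frame   -> 20.0
print(vsynced_fps(20, 120))  # same render time at 120 Hz  -> 40.0
```

Note the last line: the same 20 ms render time that gets quantized down to 30 FPS on a 60Hz panel yields 40 FPS at 120Hz, which is one reason vsync feels less punishing on a 120Hz monitor.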
Score
0
a b U Graphics card
February 22, 2012 2:37:20 PM

^ Really? I've never noticed or had a problem with it; maybe it's the perceived smoothness, but I run it 100% of the time and can't notice any problems.

Not that I'm an expert on triple buffering, but it seems to run just fine.

I understand what you're saying, but it was sold to me as counteracting the input lag of vsync, which I also always use.
Score
0
a c 216 U Graphics card
a c 128 C Monitor
a b 4 Gaming
February 22, 2012 2:39:30 PM

jjb8675309 said:
^ Really? I've never noticed or had a problem with it; maybe it's the perceived smoothness, but I run it 100% of the time and can't notice any problems.

Not that I'm an expert on triple buffering, but it seems to run just fine.

I understand what you're saying, but it was sold to me as counteracting the input lag of vsync, which I also always use.


I was updating my thoughts while you posted; I think you might find my edit makes more sense of what you are experiencing.

Understanding the mechanics does make me believe it probably reduces input lag compared to vsync without triple buffering. However, it likely still imposes more input lag than a system with vsync off entirely, if that system could push more FPS than the monitor can display.
Score
0
a b U Graphics card
February 22, 2012 2:41:23 PM

^
Yeah, I got your edit, and that is how I understand it.
Agreed, it likely helps vsync a bit, but overall there is always some input lag, yes.
Score
0
a b U Graphics card
February 22, 2012 4:06:56 PM

Perception of input lag, image tearing, refresh rates, etc. really depends on the person; some notice it more than others. I've played competitive FPS games for many years, so for me the input lag is extremely noticeable, but others may not feel it at all.
Score
0
February 22, 2012 9:26:19 PM

Hands down, yes, buy a 120Hz monitor. One of the best upgrades I've ever done. I guess I have sensitive eyes, but I notice a huge difference in smoothness; it's twice the refresh rate, and I notice it. I love it. Going back to 60Hz drives me crazy now. I'm not really convinced it'll help you in FPS games, maybe a tiny bit in really twitchy situations. I bought a BenQ XL2410T and love it. The colors could use a little work, but overall the nonexistent input lag with 120Hz is simply amazing. I play a lot of Left 4 Dead 2, BFBC2, and BF3.
Score
0
a b C Monitor
a b 4 Gaming
February 22, 2012 10:50:09 PM

legendkiller said:
120Hz is always better than 60 or something around it but it's way better for 3D lol...
Spec you should look for when choosing monitor:
120Hz
2ms
LED Backlit
1920x1080(At least for every monitor size)

The 2ms and 120Hz aren't always necessary. Not everyone plays FPS games competitively, so 60Hz won't be the end of the world. Input lag, aspect ratio, and contrast ratio are more important.

Here is why response time isn't crucial beyond a point:

Response times are not an exact science; they're more of a marketing ploy. Basically, the manufacturer tests the panels by displaying colors on the screen; the shortest time recorded to change from one color to another is the response time.

In the past, response times were measured in BWB: going from black to white and then back to black again. Fairly simple and straightforward, but it gave high response-time results. As a marketing ploy, response times switched to GTG, or grey-to-grey. GTG measures the smallest amount of time it takes the pixels to change from one shade of grey to another; as a result, you can report lower response times. So if company ABC makes a single monitor and then markets it as two different monitors, say Model A (20ms BWB response time) and Model B (2ms GTG response time), guess which one would be flying off the shelves.

Okay, so the lowest recorded times are used to advertise response times. What about the other recorded times? They are thrown out the window. It is quite possible to have the following situation:

Monitor ABC: Lowest time recorded = 2ms, highest time recorded 300ms
Monitor XYZ: Lowest time recorded = 5ms, highest time recorded 250ms

Monitor ABC seems to be faster because of the 2ms response time measured, but it also peaked at 300ms, which is a lot worse than Monitor XYZ. Since those high numbers are tossed out with the trash, consumers don't know any better.
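One way to frame the point above: the advertised best-case number tells you little; what matters is whether the worst-case transition finishes inside one refresh interval, since anything slower smears across frames as ghosting. A toy comparison using the made-up ABC/XYZ numbers from the post:

```python
def ghosts_at(refresh_hz, worst_case_response_ms):
    # A pixel transition slower than one refresh interval smears across frames.
    return worst_case_response_ms > 1000.0 / refresh_hz

panels = {
    "Monitor ABC (advertised 2 ms)": {"best": 2, "worst": 300},
    "Monitor XYZ (advertised 5 ms)": {"best": 5, "worst": 250},
}
for name, t in panels.items():
    # Both panels ghost badly at 60 Hz despite the impressive advertised numbers.
    print(name, "- worst-case ghosting at 60 Hz:", ghosts_at(60, t["worst"]))
```

At 60Hz the refresh interval is about 16.7 ms, so both hypothetical panels fail on their worst-case transitions even though their advertised figures look excellent.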
Score
0
February 22, 2012 10:56:23 PM

jjb8675309 said:
^
yeah I got your edit and that is how I understand it.
yeah agreed it likely helps vsync a bit but overall there is always some input lag, yes


The largest source of input lag would be the human.

I'm assuming the definition you're using is the delay from when you make a movement to when it's displayed on screen? That is a factor of the input drivers (keyboard / mouse) only. The only way your game is going to have any noticeable delay is if it delays input processing until after the current frame is rendered.

When people talk FPS, they really should use time notation: 60 FPS sounds small compared to 120 until you look at the intervals, 0.016s vs. 0.0083s. Inserting a one-frame delay doesn't add anything perceptible to humans. Also, render times are not static. It's really becoming a case of the placebo effect.
Score
0
a c 216 U Graphics card
a c 128 C Monitor
a b 4 Gaming
February 22, 2012 11:03:07 PM

palladin9479 said:
The largest source of input lag would be the human.

I'm assuming the definition you're using is the delay from when you make a movement to when it's displayed on screen? That is a factor of the input drivers (keyboard / mouse) only. The only way your game is going to have any noticeable delay is if it delays input processing until after the current frame is rendered.

When people talk FPS, they really should use time notation: 60 FPS sounds small compared to 120 until you look at the intervals, 0.016s vs. 0.0083s. Inserting a one-frame delay doesn't add anything perceptible to humans. Also, render times are not static. It's really becoming a case of the placebo effect.


For some of us, there is another factor that comes into play with FPS and input lag: simulator sickness. While it's hard to see a difference between 40 FPS and 75 FPS, there is a big difference in how my stomach handles it. At 40 FPS, I get motion sickness within 30 minutes or less; at 30 FPS, within about 5-10 minutes; at 75 FPS, I don't get motion sickness at all. It seems to scale with framerate, and going to about 75 to 90 FPS removes motion sickness from games entirely. That's not a placebo.
Score
0