LCD refresh rate in gaming - 60 Hz vs. 75 Hz?

In forum: Graphics & Displays
February 1, 2010 3:38:29 PM

Hello!

I just ordered myself the new 24" BenQ G2420HDBE Full HD monitor, which has received many good reviews from users. Got it for a real bargain price in Finland, only 165 € ~ $230.
It has a 5 ms response time and a 60 Hz refresh rate, while my old Samsung SyncMaster 940B has 8 ms and 75 Hz (no visible ghosting, though)!

So,

is there any notable difference - in gaming, movies, common usage, etc. - between the new 60 Hz and 75 Hz screens?


Thanks!
February 1, 2010 3:46:57 PM

Image-wise, not really, although if you look at a 75 Hz screen out of the corner of your eye, you won't see the flicker you get with a 60 Hz one.

Some people get tired eyes at 60Hz though for whatever reason and need a higher refresh rate.
February 1, 2010 3:47:14 PM

Not the 75 Hz - that only gives you a faster refresh rate.

The 5 ms response time is what gives you better, faster feedback in games.
February 1, 2010 5:08:43 PM

RealityRush:

I thought flickering was only an issue with the old CRT monitors, which use totally different technology? I do remember my eyes getting tired with my 10-year-old CRT monitor.

Upendra09:

Yeah, a 5 ms response time shouldn't leave any ghosting.


But still, what is the biggest difference between 60 Hz and 75 Hz LCD screens? I have heard that in motion, the higher the Hz, the better the picture? But is there any notable difference between 60 Hz and 75 Hz?
February 1, 2010 5:27:39 PM

You will notice the difference only if you have really trained eyes.
February 1, 2010 5:42:59 PM

Yes, there is no flicker with LCDs, or there shouldn't be. While the images still have to be sent to the monitor as they always have, LCDs are continuously lit, so unless the backlight flickers you shouldn't notice anything.

Also, refresh rates are useless here, as it is physically impossible for an LCD monitor to maintain a constant FPS even from one part of the screen to another, due to the response-time variance of the crystals themselves.
February 1, 2010 6:22:06 PM

Again, as mentioned, LCDs are solid state and never flicker. The difference between 60 Hz and 75 Hz is that you can have 75 frames displayed per second instead of a maximum of 60. More frames is generally smoother, but most won't notice the difference. Above 50 FPS it feels pretty comfortable.
February 1, 2010 6:26:27 PM

bystander said:
Again, as mentioned, LCDs are solid state and never flicker. The difference between 60 Hz and 75 Hz is that you can have 75 frames displayed per second instead of a maximum of 60. More frames is generally smoother, but most won't notice the difference. Above 50 FPS it feels pretty comfortable.

Well said :)
February 1, 2010 6:28:46 PM

olkka said:
I thought flickering was only an issue with the old CRT monitors, which use totally different technology? I do remember my eyes getting tired with my 10-year-old CRT monitor.


Yeah, sorry I wasn't specific: CRTs used to flicker, LCDs don't.

olkka said:
But still, what is the biggest difference between 60 Hz and 75 Hz LCD screens? I have heard that in motion, the higher the Hz, the better the picture? But is there any notable difference between 60 Hz and 75 Hz?


For the majority of the population there is no noticeable difference. 60 Hz is generally about the fastest people can see; beyond that it won't matter to their eyes. Some people, I'm sure, can see faster, but not by much - I doubt even close to a whole 15 Hz.

On moving pictures it makes a small difference, but barely any, because our eyes are used to blurring as we turn; 75 Hz would only make it slightly less blurry to our eyes.

120 Hz and 240 Hz TVs are advertised by Sony a lot for their "improved motion image", but honestly, the only reason to get a 120 Hz one is for 3D viewing. Not even sure why you would need 240 Hz... 5D viewing? :p
February 1, 2010 8:07:55 PM

You'll only notice 75 Hz if your graphics card can drive it or your videos are in 75 FPS.
February 1, 2010 9:00:17 PM

RealityRush -
The 120 Hz / 240 Hz thing is basically done to add extra, computer-generated frames to the picture, to smooth out moving images. It has nothing to do with 3D, BTW. Contrary to what you said, there is definitely a reason to get 120 Hz TVs; however, I think most people would be hard pressed to see the difference between 120 and 240.

Think of it the same way DVD upscaling works - the DVD is the same, but the chip in the player makes it "better" by using more resolution to increase the quality of the picture.
February 1, 2010 9:22:13 PM

festerovic said:
RealityRush -
The 120 Hz / 240 Hz thing is basically done to add extra, computer-generated frames to the picture, to smooth out moving images. It has nothing to do with 3D, BTW. Contrary to what you said, there is definitely a reason to get 120 Hz TVs; however, I think most people would be hard pressed to see the difference between 120 and 240.

Think of it the same way DVD upscaling works - the DVD is the same, but the chip in the player makes it "better" by using more resolution to increase the quality of the picture.


No, 120Hz is a requirement for 3D.

Because it is essentially drawing 2 frames for every 1 frame on a 60Hz picture.
February 1, 2010 9:23:26 PM

Except, of course, you need a monitor with a maximum response time of 8 ms.
February 1, 2010 9:23:36 PM

No, it is not a requirement for 3D. An 8400GS will run Crysis, just as a 60 Hz screen will run 3D - it is a recommendation.
February 1, 2010 9:25:01 PM

An 8400GS running Crysis? Blasphemy!!
February 1, 2010 9:26:29 PM

My 7300GT ran Crysis. 8400GS > 7300GT.
February 1, 2010 9:32:02 PM

Lies!!!! Lies!!!!

Only kidding, did you use it as a slide show presentation?
February 1, 2010 10:16:40 PM

sabot00 said:
No, it is not a requirement for 3D. An 8400GS will run Crysis, just as a 60 Hz screen will run 3D - it is a recommendation.


...are you really going to watch a 3D motion picture at 30 Hz? That'll look fantastic, I'm sure...

It's essentially a requirement unless you want to watch Avatar as a slide show instead of real-time.
February 1, 2010 11:06:20 PM

There are a lot of misconceptions about how many frames per second (FPS) the human eye can perceive. I've seen arguments in these forums and elsewhere where people say anything above 30 or 50 or even 60 FPS is a "waste", or isn't really noticeable. I guess these are all ways of justifying low framerates in games, but I found an article, Human Eye Frames Per Second, which explains this a bit more logically and factually. It's worth a read to clear up all the misconceptions.
Quote:
The overwhelming solution to a more realistic game play, or computer video has been to push the human eye past the misconception of only being able to perceive 30 FPS. Pushing the Human Eye past 30 FPS to 60 FPS and even 120 FPS is possible, ask the video card manufacturers, an eye doctor, or a Physiologist. We as humans CAN and DO see more than 60 frames a second.

Thus, the big misconception that our eyes can only see 30 frames or 60 frames per second is purely due to the fact that the mainstream displays can only show this, not that our eyes can't see more. For the time being, the frames per second capable of any display device isn't even close to the phrase "more than meets the eye".

In terms of practical things we can do, I recommend that you:

- Make sure you use a refresh rate fix in WinXP so that your monitor runs at its maximum refresh rate and not 60Hz (which is terrible for the eyes). A good one for both Nvidia and ATI cards is Refresh Force.

- Set Vsync on. When Vsync is off you may gain a few FPS, but the tearing is noticeable even on the best displays. This is because the monitor is limited in how many frames it can display at a given resolution, so anything higher means you're really seeing parts of images at, say, 90 FPS, not whole images.

- If there is a MaxFPS line in the .ini file for your game, set it to your monitor's refresh rate. This seems to help reduce FPS spikes and give much less jerky/stuttery gameplay. By capping your FPS in the game engine to your refresh rate, along with Vsync on, you get less tearing and, more importantly, smoother FPS. Setting a high MaxFPS is a placebo: it doesn't seem to improve performance as is often thought, and in fact often results in more stuttering not caused by disk activity.
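The MaxFPS advice above boils down to frame pacing: do the frame's work, then sleep off whatever is left of the frame budget. A minimal Python sketch of the idea (the 60 Hz target and the `render_frame` placeholder are assumptions for illustration, not any particular game engine's API):

```python
import time

REFRESH_HZ = 60                    # assumed monitor refresh rate
FRAME_BUDGET = 1.0 / REFRESH_HZ    # ~16.7 ms per frame at 60 Hz

def render_frame():
    """Placeholder for the game's real per-frame work."""
    pass

def run_capped(num_frames=60):
    """Render num_frames, sleeping off the leftover frame budget
    each iteration so the loop never exceeds REFRESH_HZ FPS."""
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render_frame()
        leftover = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
    elapsed = time.perf_counter() - start
    return num_frames / elapsed

print(f"effective FPS: {run_capped():.0f}")
```

Because the loop only ever sleeps, the cap is an upper bound: a slow frame simply skips its sleep, which is why capping smooths spikes rather than adding them.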

Link to the source:

http://forums.overclockers.com.au/showthread.php?t=2071...

The link to the second article:

http://amo.net/NT/02-21-01FPS.html
February 1, 2010 11:22:22 PM

Just a note about your wall of text: we are referring to LCDs, so the flickering and strain at 60 Hz is not present, as was posted.

Also keep in mind, your video card has to be able to deliver 75 FPS in order for you to take advantage of the 75Hz.

But yeah, we humans see far more than 60 FPS; depending on the situation, our conscious mind may not be able to notice the difference, but other times it will.

For most people, having more than 40 fps is comfortable. I prefer 50+.
February 1, 2010 11:28:25 PM

It's an informational post. No need to bomb on it.
February 2, 2010 12:04:21 AM

kylelively said:
It's an informational post. No need to bomb on it.


I just wanted to make sure that the difference between the CRT and LCD was noted. They have a very different effect on the eyes.
Anonymous
February 7, 2011 6:44:32 AM

Yeah, the difference is that an LCD will tire your eyes more easily than a CRT. Just ask any eye doctor; there are more people with eye problems now with LCDs than there were with CRTs.
February 7, 2011 1:20:38 PM

RealityRush said:
No, 120Hz is a requirement for 3D.

Because it is essentially drawing 2 frames for every 1 frame on a 60Hz picture.


TVs use interpolation to create 120 frames from a 60-frame source; most are not capable of receiving the 120 discrete frames needed by current 3D implementations [3D Vision, etc.].

And no, 120 Hz certainly isn't a necessity; you could do 3D using 30 Hz for each eye instead if you really wanted to...
February 7, 2011 2:04:50 PM

olkka said:
Hello!

I just ordered myself the new 24" BenQ G2420HDBE Full HD monitor, which has received many good reviews from users. Got it for a real bargain price in Finland, only 165 € ~ $230.
It has a 5 ms response time and a 60 Hz refresh rate, while my old Samsung SyncMaster 940B has 8 ms and 75 Hz (no visible ghosting, though)!

So,

is there any notable difference - in gaming, movies, common usage, etc. - between the new 60 Hz and 75 Hz screens?


Thanks!



The whole thing, as with most things concerning displays, is going to be 100% subjective. Everything from brightness to colour depth to richness of blacks, etc. - really, the list is never-ending - but each person will have a different preference. Some will agree, some will argue that something matters, and others will call them silly for thinking so.

Technically, I wouldn't say you have a thing to worry about.

Mactronix :) 
February 7, 2011 2:21:40 PM

The only notable thing about refresh rates is the impact they have on your eyes. Gaming-wise, or image-quality-wise, they have nothing to do with it.

A short explanation: a 75 Hz refresh rate generates more images in one second than a 60 Hz refresh rate. This means that while your eye might notice a flicker at 60 Hz (the flicker itself being the transition from one image to the next), at 75 Hz you won't be able to notice it. The image has exactly the same quality in both cases, but one monitor can draw images faster than the other.
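To put numbers on that, the interval between successive images is just the reciprocal of the refresh rate - a quick back-of-the-envelope check, nothing monitor-specific:

```python
# Time between two successive images the monitor draws,
# for the two refresh rates discussed in this thread.
for hz in (60, 75):
    interval_ms = 1000.0 / hz
    print(f"{hz} Hz -> a new image every {interval_ms:.1f} ms")
# 60 Hz -> a new image every 16.7 ms
# 75 Hz -> a new image every 13.3 ms
```

So the 75 Hz panel swaps images about 3.3 ms sooner each frame - a small gap, which is why most people can't tell the two apart.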

It is assumed that a 75 Hz refresh rate is better for the human eye and doesn't wear out the eyes as badly as a 60 Hz refresh does. In my case, either is fine; I've been staring at computer screens for close to 20 years now and still don't need glasses, but I guess I'm just lucky.

Bottomline, you shouldn't worry too much about the refresh rate, unless you are really sensitive to it.
April 14, 2011 2:25:46 AM

The eye is satisfied with 25 Hz as long as motion blur is involved (in celluloid-film times it naturally was). The problem is that in the real world objects move continuously; on a monitor they don't have to. When you move the mouse cursor, in the ideal situation you should see every possible position between the start and the end of the motion. If you traverse the whole monitor in one second, that is roughly 1900 positions in one second. Then the motion would be fluent. In reality, what you see is cursor, gap, cursor, gap, cursor, gap... At 60 Hz the gap is approx. 30 pixels, much bigger than the cursor itself. If you move not just the cursor but, for instance, the view in a 3D game, things are even worse. You can get used to it, but "what you are able to see" is still far, far away.
Btw, whenever I met a CRT at 60 Hz I had to change the setting, because I clearly saw it flicker all the time and it was really uncomfortable.
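The cursor-gap arithmetic above is easy to verify; assuming a 1920-pixel-wide screen crossed in exactly one second:

```python
WIDTH_PX = 1920          # assumed horizontal resolution (Full HD)

# Pixels skipped between successive frames when the cursor
# crosses the whole screen width in one second.
for hz in (60, 75):
    gap = WIDTH_PX / hz
    print(f"{hz} Hz: cursor jumps ~{gap:.0f} px per frame")
# 60 Hz: cursor jumps ~32 px per frame
# 75 Hz: cursor jumps ~26 px per frame
```

Either way the jump is far larger than the cursor itself, which is the poster's point: 75 Hz shrinks the gap only modestly.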
October 19, 2011 6:05:39 PM

WOW... very informative posts, but why is it getting dark ....

...after reading all that I think I am going blind!!!!
October 19, 2011 6:42:09 PM

This topic has been closed by Mousemonkey