What is the meaning of LCD "Refresh Rate" and supported timings?

SyntaxSocialist

Distinguished
Jan 20, 2013
153
0
18,690
I'm looking to buy a gaming monitor. I'm going through a lot of specs and am thoroughly confused. I've read in a variety of places that LCDs don't have any such thing as a "refresh rate."

Please help me understand, then: Why am I seeing LCD monitors with refresh rates (or "vertical frequency" or whatever) of 60Hz, 75Hz, 120Hz, 55-75Hz...?

And then what is the meaning of the last page (p22 or 3.9) of the manual for this "55-75hz" ASUS VS238H-P monitor? It's talking about supported timings for various resolutions, leading me to believe that the "refresh rate" that the monitor uses will be dictated by the resolution it's set to. So then if I were to buy, say, this BenQ EW2440L 24" Monitor, how do I know if my monitor will actually be running at 120hz (I can't find a "supported timings" sheet for that one)? I would hope that when a frequency like that is advertised, it is for the monitor's native resolution, right?

So:
1. What is the meaning of an LCD monitor refresh rate if, as I've been told, "LCDs don't have refresh rates"?
2. Is it fair to assume that 55-75Hz monitors will run their native resolutions at 60Hz?
3. Is it fair to assume that an advertised frequency like 120Hz applies at the monitor's native resolution?
 
Solution
If it gives a range like that, it will most likely run at the roundest number in the range, i.e. 60Hz.
For the third question: if it really does run at 120Hz, then V-Sync should cap you at 120fps.

JoshuaHMB

Honorable
Dec 27, 2012
134
0
10,710
Refresh rate is how many frames per second the monitor can display, so if you have a 60Hz monitor and you get 120 FPS in a game, you will actually only be seeing 60 of those frames per second, because that is all your monitor is refreshing.
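
A rough way to picture this (the numbers and the min() rule below are just an illustration, not something from the thread):

```python
# Illustrative only: how many rendered frames actually reach your eyes each second
# when the display refreshes at a fixed rate (V-Sync-style behaviour assumed).

def frames_shown_per_second(gpu_fps: float, refresh_hz: float) -> float:
    # The panel can only present one new frame per refresh cycle, so anything
    # the GPU renders beyond refresh_hz is never displayed.
    return min(gpu_fps, refresh_hz)

print(frames_shown_per_second(120, 60))   # 60.0  -> the other 60 frames are dropped
print(frames_shown_per_second(45, 60))    # 45.0  -> some refreshes just repeat the last frame
print(frames_shown_per_second(100, 120))  # 100.0 -> a 120Hz panel can show them all
```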
 
Josh is close but a bit off in that description. Traditional movies and TV were recorded at roughly 24-30 frames per second (FPS), which is basically a series of still images: one frame of, say, someone holding a glass of milk, the next frame with the milk lifted a little higher, then a little more, and so on until you 'see' the person drink from the glass in a flowing motion. Around 60FPS looks smooth to most people, which is what video games aim for to feel 'realistic'.

When displays changed from old Cathode Ray Tube (CRT) screens to LCDs, the actual method used to 'draw' the image to the screen also changed. The figure that captures this is the refresh rate in Hertz, a measurement of the number of times per second the image is 'drawn' to the screen; 60Hz was the baseline. With Blu-Ray as the storage format and HDMI carrying the video and data from the device to the display, a lot more information can be 'pushed' to the LCD (or home LCD TV), and refresh rates grew to 120Hz, basically doubling the number of images 'painted' onto the screen regardless of what the content was recorded at (the panel paints 120 images per second even if the video itself is still 30FPS). The next step was 240Hz panels, and now 4K, which is a resolution (roughly 4096 pixels across), not a refresh rate.
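
To put that 'painting the same image more often' idea into numbers (this little sketch is my own illustration of the idea, not something from the post):

```python
# Illustrative only: how often a panel repaints each source frame when the content
# frame rate is lower than the refresh rate (e.g. a 30FPS Blu-Ray on a 120Hz TV).

def repaints_per_source_frame(refresh_hz: int, source_fps: int) -> float:
    return refresh_hz / source_fps

print(repaints_per_source_frame(60, 30))   # 2.0 -> each film frame is painted twice
print(repaints_per_source_frame(120, 30))  # 4.0 -> painted four times on a 120Hz panel
print(repaints_per_source_frame(120, 60))  # 2.0 -> a 60FPS game on a 120Hz panel
```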

You will notice this when you are at a big-box store (Best Buy, Costco, etc.) and they have a monster TV playing a Blu-Ray movie that looks smoother and more 'real': the extra detail and 'fluid motion' in the scenes is the giveaway. That would be a 120Hz-or-higher display.

ONLY NVidia supports 120Hz displays; ATI refuses to go above 60Hz.

Now, LCDs for computers are normally different than the ones for video (TV, satellite, etc.). A computer is 'very precise' and needs 'specific' resolutions, not just any numbers (i.e. 1920x1080, not 1895x999), and it follows standards carried over from the CRT days. The old standards were 640x480, 800x600, 1024x768, 1280x1024, and so on. When widescreen LCDs came along, a new range of modes was added, including 1440x900, 1680x1050 and so on, to accommodate the 'wide' aspect of the image and not make everything look stretched.
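
As a side note, the reason widescreen needed its own set of 'standard' modes is easiest to see by reducing each resolution to its aspect ratio; here is a small sketch of that arithmetic (my own illustration):

```python
# Illustrative helper: reduce a resolution to its aspect ratio, which shows why
# widescreen panels needed new "standard" modes instead of the old 4:3 / 5:4 ones.
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1024, 768))   # 4:3   (classic CRT-era mode)
print(aspect_ratio(1280, 1024))  # 5:4
print(aspect_ratio(1440, 900))   # 8:5   (i.e. 16:10 widescreen)
print(aspect_ratio(1920, 1080))  # 16:9  (1080p)
```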

So when hooking up an LCD, keep in mind you may have to fiddle with settings for some games (not all games or applications support 1080p / 1920x1080, much less anything above 60Hz); things may look different and you may just have to 'settle' on some display modes.
 


I didn't say it "couldn't", I said ATI / AMD REFUSES, as a matter of policy, to support anything above 60Hz. Thus the drivers (the software that makes all the silicon on your wondrous 7950 do more than just blink the power light) are not set up in any way to support above 60Hz. Everyone who has gotten a 120Hz monitor has had to switch to Nvidia if they want to use it. Oh, and you need to use a DVI-D connection; HDMI on PCs doesn't support the same standard as a Blu-Ray/HD TV/console, so it isn't as 'easy' as just plugging in an HDMI cable and you're done, like on your TV.

And NO, it isn't 60Hz for the left eye PLUS 60Hz for the right eye making 120Hz. 120Hz means the LCD actually paints the image to the screen 120 times per second (in my case on a panel rated at 2-3ms), compared to a normal LCD that paints only 60 times per second. You're trying to equate Hz with Frames Per Second (FPS); they are not the same thing.

A frame is one of the individual rectangles on a ribbon of film: http://hospitalandoutreach.files.wordpress.com/2012/03/filmreel.jpg
As I mentioned, normal TV film uses only around 30 of these frames per second of what you 'see'. Games normally do 60 per second. LCDs add another layer on top of this: no matter how many 'FPS' are being rendered per second, the panel 'paints' the image to the LCD at a set frequency, normally 60 times per second, no matter how many or how few FPS are coming in (you can have only 1FPS happening and a 60Hz panel still paints 60 times, a 120Hz panel 120 times). 120Hz does twice as much 'painting' of the image, so the detail and imagery look more realistic and your mind picks up extra details (the shadow of a button is more distinct and seems 3D, the movement of a sword seems more real, etc.).

For a point of reference: if you saw the first Hobbit movie in the theater as Peter Jackson filmed it (48FPS), many people felt 'sickened' whenever there was a lot of movement on the screen, because the detail was so intense that their minds equated the visual movement with actual movement even though they were sitting still, and so they felt nauseous. Others (myself included) felt it was much more 'realistic' and enjoyed it; the common slam is that it looks like 'TV soap opera' cinematography, because soaps film at a higher frame rate, which removes the duller, less detailed look of normal TV video.
 

SyntaxSocialist

Distinguished
Jan 20, 2013
153
0
18,690


Any chance you can show me some examples of this happening? I'm frankly flabbergasted that AMD would refuse to offer such support and give that kind of edge to NVIDIA...
 

SyntaxSocialist

Distinguished
Jan 20, 2013
153
0
18,690


No, I think you misunderstand me. Perhaps I misunderstand as well, but I am aware that there is a difference between FPS and an LCD's frequency. But if my GPU is pushing 100 FPS, it'd be nice if my monitor didn't just pick up 60 of those frames. I'm aware that a 120Hz monitor will paint each frame of a 60FPS output twice.

Really, I'm just trying to increase the framerate (FPS) that locks in when I turn on V-Sync. Supposedly with a 120Hz monitor I would have a max FPS of 120.
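
For what it's worth, here is a rough sketch of why V-Sync caps the framerate at the refresh rate: the game loop waits for the next refresh interval before starting the next frame, so it can never outrun the panel (the loop below is an illustrative toy, not any real engine's code):

```python
# Toy sketch of a V-Sync-style frame cap: wait out the remainder of each refresh
# interval before starting the next frame, so FPS can never exceed the refresh rate.
import time

REFRESH_HZ = 120               # assumed 120Hz monitor
FRAME_TIME = 1.0 / REFRESH_HZ  # ~8.3ms between refreshes

def run(frames: int) -> None:
    for _ in range(frames):
        start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            # This wait is what locks the framerate to (at most) 120FPS.
            time.sleep(FRAME_TIME - elapsed)

run(10)
```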
 
A simple Google search shows I am not the sole voice out there:
https://www.google.com/search?q=ati+120hz+problem&oq=ATI+120hz&aqs=chrome.4.69i57j0l5.7753j0j1&sourceid=chrome&espv=210&es_sm=122&ie=UTF-8
https://www.google.com/search?q=ati+120hz+support&oq=ATI+120hz&aqs=chrome.2.69i57j0l5.9714j0j1&sourceid=chrome&espv=210&es_sm=122&ie=UTF-8

Then of course there is this quote: "For best performance, we of course recommend 720p 60Hz/eye (or 120Hz effective). Please note that you will still need middleware for gaming." http://www.amd.com/us/Documents/AMD-HD3D-FAQ.pdf#search=120hz . Note "120Hz effective", not actually DOING 120Hz, because they don't support that; and "we of course recommend 720p 60Hz" (not even 1080p).
 


STOP. You're getting this all confused because you see similar numbers and believe they are the same values, and they are not. If you're cranking out 100FPS, your GPU is sending 100 individual frames (like that image I linked to) to the monitor each second. Your EYES will only see 60FPS, because they can't take in any more than that. This has nothing to do with 60Hz.

60Hz means your monitor is taking those frames (single pictures of you holding your gun, not doing anything else) and 'painting' them on the screen 60 times per second (gamer's monitors are rated at 2-3ms for each newly 'painted' image, off-the-shelf ones are normally 12-15ms before they are ready to grab the next frames being output to 'paint' on the screen).
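
To keep the units straight (my own arithmetic, not from the post): Hz counts repaints per second, so the time between repaints is 1000 / Hz milliseconds, which is a different number from the panel's quoted 2-3ms or 12-15ms response-time rating.

```python
# Illustrative arithmetic: refresh rate (Hz) converted to the time between repaints.
# This is not the same figure as the panel's quoted pixel response time in ms.

def refresh_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(round(refresh_interval_ms(60), 1))   # 16.7ms between repaints on a 60Hz panel
print(round(refresh_interval_ms(120), 1))  # 8.3ms on a 120Hz panel
print(round(refresh_interval_ms(75), 1))   # 13.3ms on a 75Hz panel
```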



Really, you can't; that is the point of V-Sync. It synchronizes your GPU's output with the monitor's vertical 'painting' (if I remember correctly) to minimize tearing and bleeding of images, so the GPU's output doesn't overwhelm the monitor's ability to take it in and display it.

As for your 100FPS: whether you 'increased it' to 1000FPS or decreased it to 66FPS wouldn't matter, you won't see any difference at all, as your eyes can only perceive (take in and process) about 60FPS. Anything over 60FPS is just 'dick measuring'.



No, totally wrong, as I mentioned above. FPS is not Hz; they don't mean the same thing. As I said above, if you have a 120Hz monitor it will take whatever x FPS you output and 'paint it' to the LCD 120 times per second. So whether you output 30000FPS or just 3FPS, it still 'paints' whatever frames it has 120 times on the screen each second. The Hz does not increase nor decrease FPS; what a MONITOR can do to 'decrease' FPS is have a slow (high) millisecond rating.

The milliseconds rating, as I mentioned, is the amount of time the monitor's electronics take to paint to the LCD and then turn around and 'ask' the GPU for the next frames to 'paint'. Older monitors were as bad as 30 milliseconds, normal ones are around 15-24ms response time, gaming monitors (as I said) are only 2-3 milliseconds. So if the GPU wants to 'render' 10000FPS, it won't matter: if the monitor is still taking forever to turn around and ask for the next frame, the GPU 'drops' frames (skipping / lag / in multiplayer people seem to teleport across the map, etc.). This is where your V-Sync comes into play (as mentioned).


More confusing math to toss in there: when we are speaking of Frames Per Second, that means that many frames are being 'rendered' in one second of time. For the millisecond rating of the monitor, a millisecond is .001 of a second, or it takes 1000 milliseconds to make one second. So a display rated at 15ms takes .015 of a second to 'paint' the frames to the display, but a gamer LCD takes only .002 to .003 of a second to 'paint' that same image and ask for the next frame.
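
The same millisecond arithmetic as a quick worked example (treating the response-time rating as a rough ceiling on how often a fully-settled image can appear, which is an approximation of my own, not from the post):

```python
# Worked example of the millisecond arithmetic above: convert a rating in ms to a
# fraction of a second, and to a rough upper bound on settled images per second.

def ms_to_seconds(ms: float) -> float:
    return ms / 1000.0

for rating_ms in (15, 3, 2):
    print(f"{rating_ms}ms -> {ms_to_seconds(rating_ms):.3f}s per paint, "
          f"roughly {1000 / rating_ms:.0f} settled images per second at most")
```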