Consequences of Higher Refresh Rates on LCDs and CRTs

ummdav1d

Distinguished
Jun 18, 2006
27
0
18,530
Does anyone know if there are any consequences of higher refresh rates on LCDs and CRTs? I can't see any reason not to always use the highest refresh rate. Is there one? Doesn't it always make the display more comfortable to view? Also, does anyone have any input on refresh rates and what can be expected from LCDs vs. CRTs?
 
I'm about to scream in a minute. There... is... no... such... thing... as... a... refresh... rate... on... an... LCD.


Now, higher refresh rates on a CRT will help because there will be less chance of eye strain or headaches due to perceptible and imperceptible flicker.

LCDs are judged by how fast an individual pixel can go from black to white and back to black again. This is known as latency and is measured in ms. Typically it is best to get one of 16ms or less for gaming, and also note that the advertised latencies are a VERY tough guide only and not the deciding factor.
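Just to put a rough number on what that ms figure means, here's a quick back-of-the-envelope sketch (it takes the advertised rating at face value, which, as I said, you really shouldn't):

# Rough sketch only: how many full black-white-black transitions an
# advertised response time would allow per second, if taken literally.
def max_transitions_per_second(response_time_ms):
    return 1000.0 / response_time_ms

for ms in (8, 16, 25):
    print(f"{ms}ms -> ~{max_transitions_per_second(ms):.0f} full transitions per second")

# 16ms works out to roughly 62 transitions per second, i.e. just about enough
# to keep up with a 60Hz signal; slower panels are where the smearing complaints start.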
 

ummdav1d

Distinguished
Jun 18, 2006
27
0
18,530
You know, I didn't know if LCDs had refresh rates, but I'm using my sister's computer right now and she has an LCD with 2 refresh rate options, so I naturally assumed that LCDs did have refresh rates. Do you or anyone else have an idea why there would be a setting for the refresh rate? Maybe it is just an error and actually won't do anything. I had noticed that higher refresh rates on CRTs made them much easier on my eyes, so I always put them up to the max. Are there any consequences to that? If not, then why wouldn't they by default be set to the higher refresh rates? It must use more power or something, huh?

Also, what do you mean by "a very tough guide"?

Thanks for the info man!
 
Lol, sorry for my response there. I answered another post today where the guy kept going on about refresh rate even when I said it didn't exist.

Oh, and I meant rough guide. A small typo.

Tbh, I don't know why Windows shows different refresh rates. I think it is because of the gfx card. An LCD works by changing individual pixels as and when required; it doesn't refresh the whole image like a CRT does. Just leave the refresh rate at 60 and it will be fine.

A higher refresh rate on a CRT might lead to more wear and tear on the electron gun as it has to scan quicker. I think that is the reason, but I ain't sure.
 

ummdav1d

Distinguished
Jun 18, 2006
27
0
18,530
Yeah, that makes sense, and maybe part of the reason Windows has options is due to compatibility issues, as in video-related errors occurring at certain rates but not others. Anyways, thanks again for the info; it is much appreciated.
 

darkstar782

Distinguished
Dec 24, 2005
1,375
0
19,280
Some of the cheaper LCDs still only have a VGA (not DVI) input. For these (and indeed for better ones when, for some reason, you use a VGA cable and not DVI), there is a refresh rate in the sense that the computer is still outputting a signal at 60Hz or whatever, and the LCD needs to know what that refresh rate is to convert it back to digital. It may accept a number of different input refresh rates.

That doesn't mean it actually displays at a certain refresh rate. :)
 

Sciberpunkt

Distinguished
Aug 31, 2006
26
0
18,530
There is a disadvantage to high refresh rates on CRT displays. While flicker is caused by the screen refresh rate being low enough to perceive the decay of phosphor brightness between each screen refresh, high refresh rates cause a kind of "scan smearing" where already bright phosphors are rescanned, which reduces sharpness.

Generally, you want to use the lowest refresh rate possible while still eliminating perceived flicker. For most people, this is between 75 and 85Hz.
 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
Nice post, sciberpunkt.

LCDs may not have pixel refreshing the way CRTs do, but they do receive full (or interlaced) frame updates from the graphics card. This is true whether you're using DVI-D, DVI-A, VGA, HDMI, or component.

So the higher the "refresh rate", the more often your LCD will get a new frame from the graphics card.

Some gamers might know more about this, but I suspect that there may be issues with trying to refresh too quickly - such as if a piece of software is trying to synchronize changes with screen refresh intervals.

Regardless, you won't see flicker on an LCD as you can with CRT, because the pixels stay illuminated until they're turned off.

FYI, electron guns have no moving parts. The electron path from the gun to the front of the tube is curved in one direction (e.g. the X axis) by an electric field (like a big capacitor), and in the other (e.g. the Y axis) by a magnetic field. When your CRT renders a screen, the intensities of these two fields are modified to sweep the path of electrons across your screen in a zig-zag pattern. One of the previous posts could be misconstrued to imply that there's a motor or something in the back of your CRT, aiming the gun at the front. There's no motor.
 

Sciberpunkt

Distinguished
Aug 31, 2006
26
0
18,530
I suppose you would need the frame rate to exceed the monitor's refresh rate to realize an advantage from upping the refresh rate beyond visible flicker on a CRT.
...and certainly no electric motors involved. That would be silly. :wink:
 

baddog1

Distinguished
Mar 5, 2006
168
0
18,680
I had noticed that higher refresh rates on CRTs made them much easier on my eyes, so I always put them up to the max. Are there any consequences to that?

The CRT is straining at its highest refresh rates, so it won't last as long. Also, the picture will be fuzzier and the contrast will be lessened. On a very good monitor you may not be able to tell. But either way, if you like the way it looks, keep it that way. It won't suddenly kill anything.
 

cu2day

Distinguished
Sep 15, 2006
6
0
18,510
I couldn't just sit here and read this... I own a Seanix 19" widescreen 8ms LCD, and yes, the refresh rate DOES make a difference in certain situations. For cruising the internet and basic stuff you can't tell the difference, but on my screen, for example, at 60Hz in a shooter or fast-paced game there is a fair bit of image smearing, to the point of it being an annoyance. At 75Hz the difference in gaming is huge.

If I'm not mistaken, I'd think the lower the response time, the more important a high refresh rate may be when gaming... I use the DVI connector; with the VGA connector it could be a different story.

Believe me, this isn't in my mind, because when I change the clock settings on my video card my refresh always resets to 60Hz, and when gaming I can always notice the difference: choppy-looking and smeared, and that is with VSync off... I always keep VSync off because turning it on makes for choppiness.

Oh, here's a good read: http://www.tweakguides.com/Graphics_8.html

Allow me to quote one part:

The main reason why the refresh rate on an LCD may matter in gaming is because of VSync - which is discussed in greater detail in the Vertical Synchronization setting. The simple fact of the matter is that LCD monitors have to work on the basis of receiving new frames of information from a graphics card's frame buffer like a CRT would: i.e., during the VBI. So when VSync is disabled the graphics card will sometimes race ahead and when the LCD monitor indicates it is ready for a new frame during the blanking interval, the graphics card may provide a partially new frame overlapping an older one, just like it would for a CRT. An LCD monitor will then display this just the same way a CRT monitor would, resulting in visible tearing. The alternative of enabling VSync can resolve this, but in turn can reduce FPS to a fraction of the refresh rate. The lower your refresh rate, the greater the performance drop, which is why a 60Hz refresh rate on an LCD may be a problem.

Therefore LCD monitors, despite not actually physically working on the same basis as a CRT, wind up being bound by the same limitations and problems - minus the flicker - because they operate in a software environment originally designed with CRTs in mind.
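To illustrate the FPS drop that quote is talking about, here's a rough sketch in Python (my own simplification: it assumes plain double-buffered VSync and ignores triple buffering):

import math

# With plain double-buffered VSync, a finished frame has to wait for the next
# refresh, so the effective frame rate is quantized to refresh / n for whole n.
def vsync_fps(raw_fps, refresh_hz):
    intervals_per_frame = math.ceil(refresh_hz / raw_fps)  # refresh intervals each frame occupies
    return refresh_hz / intervals_per_frame

for refresh in (60, 75, 100):
    drops = {raw: round(vsync_fps(raw, refresh), 1) for raw in (90, 55, 40, 25)}
    print(f"{refresh}Hz: {drops}")

# At 60Hz a game rendering 55 FPS drops to 30, while at 75Hz it only drops to
# 37.5, which is the quote's point about a 60Hz refresh rate being a problem.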
 
I am aware of that info; however, I feel you are drawing the wrong conclusions.

Regardless of what effect a higher "refresh" rate setting has, it has nothing to do with the LCD, as that info you posted says. Also, you need better hardware (I don't care what you've got) if you get choppiness with VSync on. Either new hardware, or get ATI Tray Tools or DXTweaker to enable triple buffering in DX games.

Tbh, maybe you have more sensitive eyes, but I think not. I might get some tearing with VSync off, but no smearing. It could be that "8ms" response rate (it will actually be in the 20s).

Sorry, but the reason for any improvement is most likely graphics on the computer side and not the monitor. You are entitled to your opinion, but AFAIK you are wrong, and that info you posted kinda proves it.
 

cu2day

Distinguished
Sep 15, 2006
6
0
18,510
Well, maybe I'm wrong to consider what I see as smearing of the image, but there is a definite difference to me in frame rate smoothness and image tearing.

If the refresh rate of an LCD had no effect, then 120Hz wouldn't make sense on this screen or the new JVC 120Hz screen.
http://www.smh.com.au/news/home-theatre/samsung-ups-the-ante-in-widescreen-stakes/2006/08/23/1156012587756.html

The panel will have 1080p resolution and video signal of 120Hz to enhance moving images, doubling the 60 Hz video signal used on conventional LCD TVs

Though it's true that it's not an actual LCD limitation, it's a frame rate limitation of how the information is sent from the video card to the screen on computers. TV has a frame rate as well, and DVDs and all, so I guess higher Hz makes sure that all images are always captured rather than being missed in a mis-sync. The fact that images come to us at a frame rate makes a higher refresh useful, and yes, I can for certain, without a doubt, see the difference. If I knew how to record video of my desktop and games playing, I'd prove the difference to you.

Hmmm, I'm going to try to record a video with my digital camera after work; hopefully it can capture a high enough frame rate to show you the big difference when playing Need for Speed: Most Wanted.
 
Hmm, I see what you are saying; however, again this is not really an LCD thing, as it is still just about increasing the VSync rate that is allowed.

All that would do is allow more FPS to be shown. However, again, that is not an LCD limitation, as you know.

I guess in a way you are right: it makes a difference, but only by telling the gfx card, or whatever generates the images, to increase them.

As you have said, due to a fixed FPS this will make no difference for TV, which makes it odd that they have done it. On computers, then yes, it might help those who use VSync attain a higher FPS.

All in all, I think we are looking at the same object in two different ways. That doesn't change what is, though, regardless of whether you agree with me. I cannot explain how a higher refresh rate helps without VSync, but I know it is gfx card related and not the LCD, and if it is, it shouldn't be.
 

cu2day

Distinguished
Sep 15, 2006
6
0
18,510
Without VSync, a high refresh helps because of the image tearing that happens even on an LCD in games.

Best said in that link I sent earlier: "When VSync is disabled the graphics card will sometimes race ahead, and when the LCD monitor indicates it is ready for a new frame during the blanking interval, the graphics card may provide a partially new frame overlapping an older one, just like it would for a CRT. An LCD monitor will then display this just the same way a CRT monitor would, resulting in visible tearing."

So a higher refresh rate should compensate for the higher number of frames being pushed, just like it would on a CRT monitor. Though technically it is rather silly that LCDs have a refresh rate, as they update individual dots, and a refresh is a hindrance if anything.

Another quote: "Let's look at an LCD's theoretical refresh rate, based on its response time rating. Consider the example of an LCD monitor nominally rated at an 8ms response time. Given 8 milliseconds is 8/1000ths of a second, in one full second it can refresh all the pixels on the screen (if necessary) 1000/8 = 125 times, which makes it equivalent to a 125Hz refresh rate. Yet no 8ms LCD monitor allows you to set a refresh rate even remotely close to this in Windows."

On topic: the consequences of a higher refresh are beyond me with LCDs, but with CRTs I believe it could cause more heat and shorter life spans because of the heat. Also, some CRTs get image distortion when pushed too hard, and that's a good sign that you are killing the screen very quickly... I should know: I had an ATI video card years ago that would auto-set the refresh rate to 100Hz, and the monitor could only handle 85Hz, but I didn't know it was always auto-resetting to 100Hz, and needless to say I had a dead screen within 3 days. After that I bought a ViewSonic G90fb 19" and set my desktop to 1024x768; that screen could do a refresh of 118Hz at that res, so 100Hz was no problem.

LCDs pushed too hard on refresh rate simply tend not to display any image at all, and you'll have a hell of a time setting your refresh rate back to an acceptable level if you aren't careful, because the screen will always blank in Windows... I should know, I tried it out. Booting into the boot menu and loading the VGA display mode is the way to get back into Windows and change your settings back. Safe mode didn't work to fix the refresh in normal mode.

Other than the DVI limitations on refresh rate, I guess you could have a higher VGA refresh, but I don't think the analog-to-digital decoders could handle that much stress and still properly show it on screen. Maybe LCDs should get an HDMI connector from the video card to the screen for higher bandwidth. As that quote said, an 8ms screen should be able to display up to 125 images a second.
 

baddog1

Distinguished
Mar 5, 2006
168
0
18,680
TFT LCDs have no refresh rate. They are always on. If there is a difference, it is your video card's fault. If turning up the card's refresh rate helps prevent tearing, go for it. It won't hurt anything. But an LCD doesn't care, as it has no refresh rate, only a vertical sync rate, which is tied to the frequency of the incoming AC current (uh, 60Hz?).

As for smearing or tearing, it can only be due to a slow pixel response time, or maybe due to a discrepancy between the card and the monitor. But, as I said, if you see a difference, do it.

I don't think the improvement could be the placebo effect, could it?
 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
I'm getting irritated about this.

TFT LCDs have no refresh rate. They are always on.

I don't think anyone is saying that the pixels need to be "refreshed" as they do with a CRT image. I think everyone is talking about refresh rate in terms of image or frame refresh rate - the rate at which the frame is updated with a new full screen of pixels. Even though an LCD doesn't rely on refreshing, it does get a full new frame with every interval. Because the GPU adaptor GUIs typically call this the "refresh frequency" or "refresh rate", that's what we're all calling it. This is not the same as the vertical retrace interval, mind you, but is simply the number of times each second that a full frame of pixels is transmitted from the graphics card to the display.

Other than the DVI limitations on refresh rate, I guess you could have a higher VGA refresh, but I don't think the analog-to-digital decoders could handle that much stress and still properly show it on screen. Maybe LCDs should get an HDMI connector from the video card to the screen for higher bandwidth. As that quote said, an 8ms screen should be able to display up to 125 images a second.

Regarding HDMI, DVI-D, bit rates, and VGA:
HDMI and DVI-D support EXACTLY the same data rates. They both come in either single-channel (3 diff-signal pairs clocked at up to 165 MHz with 10x data rate), or dual-channel (6 diff-signal pairs clocked at an uncapped frequency with 10x data rate). Each diff-signal pair sends 10 code bits (=8 data bits) per clock. So a full-spec single-channel DVI or HDMI cable is capable of transmitting up to 3 x 8 bits x 165MHz, which translates to 165M 24-bit pixels per second. 1080p is ~2M pixels * 60 /s = 120M 24-bit pixels per second. This is why you don't need a dual-channel cable unless you go higher-res than 1080p, such as the Dell 30" UltraSharp, or have long (horizontal and/or vertical) retrace intervals, such as you might have with a CRT running 1080p. That retrace time might reduce the available time for picture bit delivery to below what is possible at 165 MHz. Why does a CRT have a higher retrace interval? Because it takes time for the CRT to change the voltage on the beam-steering plates from positive to negative (or vice versa). LCDs don't have to steer electron beams, so they don't have this hardware limitation. They do have memory bandwidth limitations for the screen buffer, however. And if a signal is sent in analog, they also have limitations on their A/D converters. Not to mention image-scaling calculations, noise reduction, etc. All of those factors reduce the rate at which an LCD is able to accept new frames of video.
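If you want to check a mode yourself, here's a quick Python sketch of that arithmetic (rough numbers only: it ignores the blanking/retrace overhead mentioned above, which eats into the real budget):

# Rough sanity check: does a given resolution and frame rate fit within a
# single-link (165 MHz, one 24-bit pixel per clock) DVI/HDMI connection?
SINGLE_LINK_PIXELS_PER_SEC = 165_000_000

def fits_single_link(width, height, refresh_hz):
    needed = width * height * refresh_hz  # active pixels per second, no blanking
    return needed, needed <= SINGLE_LINK_PIXELS_PER_SEC

for mode in ((1920, 1080, 60), (1920, 1080, 75), (2560, 1600, 60)):
    needed, ok = fits_single_link(*mode)
    print(mode, f"{needed / 1e6:.0f}M pixels/s,", "fits single-link" if ok else "needs dual-link")

# 1080p at 60Hz needs ~124M pixels/s and fits; the 30" 2560x1600 panel at 60Hz
# needs ~246M, which is why it requires a dual-link connection.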

DVI-D and HDMI use identical 10-bit codes, voltage levels, timing and signaling. The key differences are physical (the connectors look different) and that HDMI transmits audio during the "retrace" intervals, which are just margins on the left/right and top/bottom of the pictures being transmitted. DVI doesn't do / support that.

In addition to digital, DVI-I and DVI-A support analog (standard VGA) signalling using essentially the same form factor as DVI-D, with slightly more / different pins and conductor architecture. Something to be aware of when you connect a DVI-? monitor to a DVI-? card using a DVI-? cable - it's worth confirming that your signal is staying in the digital domain so that you're not introducing D/A and A/D distortions or analog-domain signal noise.
 

Raves

Distinguished
Sep 26, 2006
1
0
18,510
Hello, I am new to the forums. A Google search led me here, as I had some questions about LCDs, refresh rates, and gaming.

So I thought the default refresh rate on my 17" LCD was 60Hz. But when I did some benchmarks in BF2 at 1280x1024 @ 60Hz, the performance was less than expected (40-60fps). I found out via a Google search that my default was actually 75Hz, so I changed to that value (in WinXP and BF2) and reran my benches, and the performance was much better (60-90fps).

I use a DVI to VGA adapter.

I think my experience confirms what some of you have been saying.
 

expertcomputerhelp

Distinguished
Dec 9, 2008
2
0
18,510
You guys sound pretty darn intelligent to me and definitely make plenty of sense, and I realize this all happened a while ago. However, to keep all the yadda yadda to a minimum: LCD monitors for the PC are still only better than CRT monitors at viewing text, movies, surfing the web, email, instant msgs; basically, LCDs are better at just about everything except gaming, period! Why? Well, just re-read everything prior to this post. I don't have the energy to re-type all that technical stuff. Unfortunately, if you are gaming in this day and age, you will either be stuck with some minor eye strain and possibly a slight headache after playing Call of Duty 4 or Counter-Strike: Source for an entire evening on your fancy 22'' LCD, or you will go out and get yourself a cheap CRT monitor that can display some nice refresh rates that will keep you grinning all evening while you crush everyone on the server without needing to take one aspirin because of your eye strain. Who gives a crap if it stresses out the monitor... lol. In a year or two you're only going to buy a new one anyway, and probably an LCD, because by then the technology "should" be where it needs to be. At least we hope. :-/ As I look at my calendar it is 12/09/2008.