I apologize if this question has been asked before, but after some brief searching I've found some info, just not enough to make a decision. I'm looking into getting a 47" Panasonic Viera IPS LED LCD display for my gaming rig; I was previously using 2x 23" Acer monitors. From some basic reading I saw that 120hz on a tv from a pc input isn't in fact true 120hz, just a 60hz signal doubled? Many people have said this can lead to interface lag, and that the hdtv should be used at a 60hz setting for pc gaming. How does a pc monitor rated at 120hz act differently from an hdtv rated at 120hz? I watch a lot of Linus Tech Tips, and he says he sees a major difference in interface speed on a 120hz pc monitor compared to a 60hz one.
A tv is often used for less-interactive activities like watching movies, where buffering a few frames ahead so that it can interpolate between frames and make motion look smoother isn't a big deal. When playing video games, however, that buffering is undesirable input lag. I think most 120hz tvs have a "game mode" which avoids most of the input lag by skipping the interpolation and simply showing each frame twice, making the tv little different from a 60hz one in this mode. Also, none of the standard tv connections (hdmi, vga, component, composite) are required to carry a full-resolution (1920x1080) 120hz progressive signal. I've seen plasma tvs listed as being 600hz, but that's only internally; the video signal is still 60hz at most.
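To put rough numbers on that buffering delay, here's a back-of-envelope sketch (illustrative figures, not measurements of any particular tv — the actual number of buffered frames varies by model):

```python
# Back-of-envelope: extra latency from holding frames before display.
def added_lag_ms(buffered_frames, input_hz=60):
    """Extra input lag in milliseconds from buffering N frames of an input signal."""
    frame_time_ms = 1000.0 / input_hz  # one frame at 60hz lasts ~16.7 ms
    return buffered_frames * frame_time_ms

# Interpolation needs at least the *next* frame before it can blend between
# frames, so buffering just 2 frames of a 60hz signal already adds:
print(round(added_lag_ms(2), 1))  # 33.3 ms, on top of the display's base latency
```

That's why game mode, which skips interpolation entirely, matters more for gaming than the refresh-rate number on the box.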
A 120hz monitor is usually designed to support 60hz per eye with shutter glasses, but carrying a 120hz signal at full resolution requires dual-link dvi or displayport (thunderbolt would probably also have enough bandwidth for 1920x1080 at 120hz, but so far very few monitors have thunderbolt inputs at all). Most 120hz monitors also support 120hz in 2D mode (i.e. non-stereoscopic) for smoother motion. I have my doubts about the magnitude of the improvement from 120hz over 60hz, but I haven't actually tried a 120hz monitor (not counting crt monitors, which could only reach that high a refresh rate at low resolutions), so I could be wrong. If you can get to a physical computer store and compare them in person, that might give you a better idea.
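A quick pixel-clock estimate shows why single-link dvi isn't enough (this uses a rough 10% blanking allowance rather than exact timing-standard numbers, so treat it as a ballpark):

```python
# Rough pixel-clock check for 1920x1080 at 120hz.
ACTIVE_PIXELS = 1920 * 1080     # visible pixels per frame
REFRESH_HZ = 120
BLANKING_OVERHEAD = 1.1         # rough allowance for horizontal/vertical blanking

pixel_clock_mhz = ACTIVE_PIXELS * REFRESH_HZ * BLANKING_OVERHEAD / 1e6
print(round(pixel_clock_mhz))   # ~274 MHz needed

SINGLE_LINK_DVI_MHZ = 165       # single-link dvi spec limit
DUAL_LINK_DVI_MHZ = 330         # dual-link doubles it with a second set of pairs

print(pixel_clock_mhz <= SINGLE_LINK_DVI_MHZ)  # False - single-link can't do it
print(pixel_clock_mhz <= DUAL_LINK_DVI_MHZ)    # True  - dual-link can
```

Even ignoring blanking entirely, the active pixels alone (~249 MHz) already exceed single-link dvi's 165 MHz limit, which is why these monitors ship with dual-link dvi or displayport.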
Technically, movies are usually recorded at 24 frames per second, even though the high-end signal cables (hdmi, vga, component) are capable of 60hz. 120hz tvs do have the advantage of being able to display these at an exact multiple of the original frame rate, as long as they're sent at 24hz instead of being converted to 60hz beforehand. However, ghosting is not caused by the difference in refresh rates. I know of two types of ghosting, explained by these wikipedia articles: http://en.wikipedia.org/wiki/Ghosting_%28television%29 http://en.wikipedia.org/wiki/Motion_blur
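The "exact multiple" point is just divisibility arithmetic, which a couple of lines make concrete:

```python
# Why 120hz divides evenly into film's 24fps but 60hz doesn't.
print(120 / 24)   # 5.0 -> each film frame shown exactly 5 times: even cadence
print(60 / 24)    # 2.5 -> no even split at 60hz

print(120 % 24)   # 0
print(60 % 24)    # 12 -> 60hz has to alternate 3 and 2 repeats per film frame
                  #       ("3:2 pulldown"), which causes slight judder
```

So a 120hz panel can show 24fps film with perfectly uniform frame timing, while a 60hz panel has to fudge it.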
I don't know which type of ghosting others are claiming to see, but movies usually have some level of motion blur inherent in the recording (unless it's stop-motion animation), and a bad lcd tv might increase its severity.
Because of the input lag caused by frame interpolation, I'd agree it's pointless to buy a 120hz tv if you'll mainly be using it for gaming. I think a 120hz monitor might be worthwhile, but only if you want active 3d or can get well over 60fps (at full resolution and at least medium settings) in your games.