I'm getting irritated about this.
TFT LCDs have no refresh rate. They are always on.
I don't think anyone is saying that the pixels need to be "refreshed" as they are with a CRT image. I think everyone is talking about refresh rate in terms of image or frame refresh rate: the rate at which the frame is updated with a new full screen of pixels. Even though an LCD doesn't rely on refreshing, it does receive a full new frame at every interval. Because graphics adapter GUIs typically call this the "refresh frequency" or "refresh rate", that's what we're all calling it. It's not the same as the vertical retrace interval, mind you; it's simply the number of times each second that a full frame of pixels is transmitted from the graphics card to the display.
I guess that, DVI's refresh-rate limits aside, you could run a higher refresh rate over VGA, but I don't think the analog-to-digital converters could handle that much stress and still display properly on screen. Maybe LCDs should get an HDMI connection from the video card to the screen for higher bandwidth. As that quote said, an 8 ms screen should be able to display up to 125 images a second, as the sketch below shows.
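For what it's worth, the quoted 125-images-per-second figure is just the reciprocal of the response time. A minimal sketch of that arithmetic (the 8 ms value is the panel response time from the quote, nothing more):

```python
# Upper bound implied by panel response time alone: a pixel that takes
# 8 ms to settle can show at most 1 / 0.008 = 125 distinct images per
# second (ignoring the link, the scaler, and every other bottleneck).
response_time_s = 8 / 1000           # 8 ms, the figure from the quote
max_images_per_second = 1 / response_time_s
print(max_images_per_second)         # 125.0
```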
Regarding HDMI, DVI-D, bit rates, and VGA:
HDMI and DVI-D support EXACTLY the same data rates. Both come in single-link (3 differential signal pairs clocked at up to 165 MHz, carrying bits at 10x the clock rate) or dual-link (6 differential signal pairs, with no clock cap in the spec, again at 10x the clock rate). Each pair sends 10 code bits (= 8 data bits) per clock. So a full-spec single-link DVI or HDMI cable can carry up to 3 pairs x 8 bits x 165 MHz = 3.96 Gbit/s, which translates to 165 million 24-bit pixels per second. 1080p is ~2.07M pixels x 60/s = ~124 million 24-bit pixels per second. This is why you don't need a dual-link cable unless you go higher-res than 1080p (such as the Dell 30" UltraSharp) or have long horizontal and/or vertical retrace intervals, such as you might have with a CRT running 1080p. That retrace time might reduce the available time for picture bit delivery below what is possible at 165 MHz.

Why does a CRT have a longer retrace interval? Because it takes time to reverse the current in the deflection coils and swing the electron beam back across the tube. LCDs don't have to steer electron beams, so they don't have this hardware limitation. They do have memory-bandwidth limits on the screen buffer, however. And if the signal arrives as analog, they also have limits in their A/D converters, not to mention image-scaling calculations, noise reduction, etc. All of those factors reduce the rate at which an LCD can accept new frames of video.
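To make that arithmetic concrete, here's a minimal sketch that checks whether a video mode fits in one link. The total-timing figures (2200 x 1125 for 1080p60, and roughly 2720 x 1646 for 2560x1600@60 with reduced blanking) are standard published timings, included purely for illustration:

```python
# A mode fits a single TMDS link if its pixel clock (total pixels per
# frame, including blanking margins, times refresh rate) stays at or
# below the 165 MHz single-link cap. Each clock delivers one 24-bit
# pixel across the 3 data pairs.
SINGLE_LINK_MAX_CLOCK_MHZ = 165

def fits_single_link(total_width, total_height, refresh_hz):
    pixel_clock_mhz = total_width * total_height * refresh_hz / 1e6
    return pixel_clock_mhz, pixel_clock_mhz <= SINGLE_LINK_MAX_CLOCK_MHZ

# 1080p60: 1920x1080 active, 2200x1125 total with blanking
clock, ok = fits_single_link(2200, 1125, 60)
print(f"1080p60: {clock:.1f} MHz pixel clock, single link OK: {ok}")      # 148.5 MHz, True

# 2560x1600@60 (Dell 30" class), ~2720x1646 total with reduced blanking
clock, ok = fits_single_link(2720, 1646, 60)
print(f"2560x1600@60: {clock:.1f} MHz pixel clock, single link OK: {ok}") # 268.6 MHz, False
```

The second case is exactly why the 30" panels ship with dual-link DVI: the pixel clock blows well past what a single link can carry.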
DVI-D and HDMI use identical 10-bit codes, voltage levels, timing, and signaling. The key differences are physical (the connectors look different) and that HDMI transmits audio during the "retrace" intervals, which are just the margins on the left/right and top/bottom of the picture being transmitted. DVI doesn't support that.
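Since each pair carries 10 code bits for every 8 data bits, the raw wire rate and the usable pixel-data rate differ by 25%. A quick sketch of the single-link numbers at the full clock:

```python
# Single-link TMDS at the full 165 MHz clock: each of the 3 data pairs
# carries 10 code bits per clock, of which 8 are payload.
clock_hz = 165e6
pairs = 3
raw_rate = clock_hz * pairs * 10     # code bits on the wire
data_rate = clock_hz * pairs * 8     # payload bits (24-bit pixels)
print(raw_rate / 1e9, "Gbit/s raw on the wire")   # 4.95
print(data_rate / 1e9, "Gbit/s of pixel data")    # 3.96
```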
In addition to digital, DVI-I and DVI-A support analog (standard VGA) signalling in essentially the same form factor as DVI-D, with slightly different pins and conductors. Something to be aware of when you connect a DVI-? monitor to a DVI-? card using a DVI-? cable: it's worth confirming that your signal is staying in the digital domain, so that you're not introducing D/A and A/D distortions or analog-domain signal noise.