Is Digital (DVI) really better than Analogue (VGA)?

littlejoelgriffo

Honorable
Jan 21, 2014
38
0
10,530
I currently have a 22" ViewSonic monitor running at 1680 x 1050, 60 Hz. I am using a VGA cable.
I use my PC for a lot of things: playing first-person shooters, watching movies, surfing the internet, etc.

My monitor also has a DVI input option.

Is it worth buying a DVI cable to go digital, or is analogue fine? If so, in what way would it be better?
 
4745454b

Titan
Moderator
Honestly, for the most part you'll never notice. You'll need a digital cable if you want to go above 1080/1200, as VGA doesn't support it. The biggest benefit is HDCP support, which requires a digital connection for the copy protection. Otherwise, most eyes will have a hard time telling the two apart at 1080.
 


Digital video signals are not significantly superior to analogue video signals at low resolutions (<= 1920x1080), but as resolution and colour depth increase, digital signalling wins out. The analogue signal is attenuated and degraded by the DAC hardware on the transmitter, the cable, and the ADC hardware on the receiver. Each introduces a small error into the analogue symbol, and these errors can compound so that the symbol received by the signal sink is slightly different from the symbol sent by the signal source.

For example, the signal source might send a colour component with a value of 127 using an analogue voltage level of 0.3472 volts (0.7 volt full scale, 8 bits per symbol, 2.73 millivolts between symbols), but the characteristics of the transmission network may result in this being interpreted as 125, which is slightly less intense than the image source intended.
If a very low value is sent following a high value (a high-contrast edge), the low value may be pulled upwards due to the cable's internal capacitance and low-pass nature. For example, an intensity of 42 is sent immediately following an intensity of 220; the receiver may interpret these as 218 and 58. The same error occurs when a high-intensity symbol follows a low-intensity symbol.
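
To make that arithmetic concrete, here is a minimal Python sketch of the idea described above. The 0.7 V full scale and 256 levels come from the example; the specific error voltages are made-up assumptions purely for illustration, not measurements of any real cable.

```python
# Illustrative sketch: how a few millivolts of analogue error shifts an
# 8-bit VGA colour value. Assumes 0.7 V full scale split into 256 levels
# (~2.73 mV per step), as in the example above; error figures are hypothetical.

FULL_SCALE_V = 0.7
LEVELS = 256
STEP_V = FULL_SCALE_V / LEVELS  # ~2.73 mV between adjacent symbols

def dac(value: int) -> float:
    """Ideal transmitter: map an 8-bit value to an analogue voltage."""
    return value * STEP_V

def adc(voltage: float) -> int:
    """Ideal receiver: map a voltage back to the nearest 8-bit value."""
    return max(0, min(255, round(voltage / STEP_V)))

# Transmit 127 and let the cable/ADC shave off ~5.5 mV (hypothetical loss):
sent = 127
received = adc(dac(sent) - 0.0055)
print(sent, "->", received)   # 127 -> 125: slightly dimmer than intended

# High-contrast edge: 42 sent right after 220. The cable's low-pass behaviour
# drags the high symbol down a little and pulls the low one up (made-up errors):
print(adc(dac(220) - 0.006))  # ~218
print(adc(dac(42) + 0.044))   # ~58
```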

These problems are less prevalent at lower pixel clock rates with shorter cables, and more prevalent at higher pixel clock rates with longer cables. Digital symbols can suffer errors in individual bits, but such bit errors are rare, whereas analogue symbols are always degraded to some extent.
 

littlejoelgriffo

Honorable
Jan 21, 2014
38
0
10,530
Wow, so many answers! Thanks!

@i7baby, not very helpful :)
@silverliquicity, all the article says is "better quality", not very specific :/
@4745454b, thank you very much for the practical answer!
@Pinhedd, didn't understand a word in the last two paragraphs :D

 

4745454b

Titan
Moderator
There should be no difference (unless, as I said above, you try to watch an HDCP-protected video, in which case VGA won't even play it). If you were using one of the new 1440p monitors I'd say to get DVI. I noticed no difference when I moved from VGA to DVI; I did notice when I went from 1024x768 to 1920x1080. I wouldn't say i7's advice wasn't helpful. It shouldn't cost a lot to get a DVI cable and see for yourself. Perhaps borrow one from a friend? But I doubt you'll notice any real difference.
 
Solution

TBB

Reputable
Jun 30, 2014
11
0
4,510
DVI is digital and analog... So does that mean that where digital fails, analog will keep up?

Personally, I noticed SVGA is much better than HDMI (digital only) on my setup for several reasons, like color, banding and especially ghosting (running at the same Hz, but blurry).

My hobby is sim racing, and when I was really enthusiastic about it I could race for 3 hours max. Now there are no limits; I can just keep on going till I die, all because the ghosting is gone. I think it's mostly down to my TV, because of the banding, but that's the reason why a decent SVGA cable is better at 1080 and lower resolutions in general, IMO.
 


Ghosting is a product of the display, not the signalling method.
 

TBB

Reputable
Jun 30, 2014
11
0
4,510
But I had motion blur when I used HDMI, and it disappeared when I used VGA. It was so bad that my eyes were red and tired after racing for an hour.

Today I know HDMI is actually crap when it comes to PCs; even VGA is better at 1080p in some cases, like mine. I hope DVI-I is better than VGA when it comes to anything other than image quality, so I know what to buy in the future. It's logical that DVI-I would be better when you look at when it was invented and for what, versus VGA, but HDMI was also invented to replace VGA in some cases, like home theatre, and it's not always better.
 


DVI-I is not a single interface; it's a combination of DVI-D and DVI-A. DVI-D is electrically compatible with HDMI and nearly logically identical (HDMI can also transfer data, such as audio, in the blanking periods). DVI-A is electrically and logically identical to VGA.
 

TBB

Reputable
Jun 30, 2014
11
0
4,510
So I assume DVI-I was created to combine the best of both DVI-A and DVI-D, eliminating the issues of DVI-A with what DVI-D has to offer and vice versa... Otherwise there would be no reason to create DVI-I, which will logically be bigger because it's two techs combined.
 


It just combines the two sets of signal specifications into one mechanical interface. A Dual-Link DVI-I port on a GPU can be passively adapted to any of the following display interfaces:

VGA
DVI-A
HDMI
Single-Link DVI-D
Dual-Link DVI-D

DVI-I yields the best overall display support, which is why most GPUs are equipped with at least one Dual-Link DVI-I port.
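
As a rough Python sketch of why that list works out the way it does (not anything taken from the DVI specification itself): a passive adapter only rewires pins, so it can only target a connector whose required signal set is already present on the source port. The signal names and groupings below are simplified assumptions for illustration.

```python
# Sketch of why a Dual-Link DVI-I port passively adapts to so many interfaces:
# a passive adapter only rewires pins, so it works only when the source port
# already carries every signal set the target connector needs.
# Names and groupings are simplified for illustration.

SIGNALS_ON_SOURCE = {
    "Dual-Link DVI-I": {"analogue RGBHV", "TMDS link 1", "TMDS link 2"},
    "Single-Link DVI-D": {"TMDS link 1"},
    "VGA": {"analogue RGBHV"},
}

SIGNALS_NEEDED_BY_TARGET = {
    "VGA": {"analogue RGBHV"},
    "DVI-A": {"analogue RGBHV"},
    "HDMI": {"TMDS link 1"},              # electrically a single TMDS link
    "Single-Link DVI-D": {"TMDS link 1"},
    "Dual-Link DVI-D": {"TMDS link 1", "TMDS link 2"},
}

def passively_adaptable(source: str, target: str) -> bool:
    """True if every signal the target needs is already on the source port."""
    return SIGNALS_NEEDED_BY_TARGET[target] <= SIGNALS_ON_SOURCE[source]

for target in SIGNALS_NEEDED_BY_TARGET:
    print(f"Dual-Link DVI-I -> {target}: "
          f"{passively_adaptable('Dual-Link DVI-I', target)}")
# All five print True, matching the list above. A plain VGA port, by contrast,
# cannot be passively adapted to any of the digital targets.
```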