Solved

Is Digital (DVI) really better than Analogue (VGA)?

I currently have a 22" ViewSonic monitor running at 1680 x 1050 at 60 Hz. I am using a VGA cable.
I use my PC for a lot of things: playing first-person shooters, watching movies, surfing the internet, etc.

My monitor also has a DVI input option.

Is it worth buying a DVI cable to use digital or is Analogue fine? If so, in what way would it be better?
  1. Digital is better, but make sure that your display has a DVI-D input rather than a DVI-A input
  2. It's DVI-D dual link.

    But how is this better than VGA? The monitor seems sharp and smooth in-game, and isn't giving me any issues, e.g. tearing.
  3. Just try it!
  4. http://www.buzzle.com/articles/dvi-vs-vga.html

    DVI is better, have a look at the link.
  5. Honestly, for the most part you'll never notice. You'll need a digital cable if you want to go above 1080/1200, as VGA doesn't support it. The other big benefit is HDCP support, which requires a digital connection for the copy protection. Otherwise, most eyes will have a hard time telling the two apart at 1080.
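As a rough illustration of why higher resolutions push past what a single digital link carries, here is a back-of-envelope pixel-clock estimate. The 25% blanking overhead is an assumption (real CVT/GTF timings differ per mode); the 165 MHz single-link DVI limit is from the DVI spec.

```python
# Back-of-envelope pixel-clock estimate. The 25% blanking overhead is an
# assumption; real CVT/GTF timings differ per mode. Single-link DVI tops
# out at a 165 MHz pixel clock; dual-link doubles the TMDS pairs.
BLANKING_OVERHEAD = 1.25  # assumed average overhead for blanking intervals

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1680, 1050, 60), (1920, 1080, 60), (2560, 1440, 60)]:
    mhz = pixel_clock_mhz(w, h, hz)
    link = "single-link OK" if mhz <= 165 else "needs dual-link"
    print(f"{w}x{h}@{hz}Hz ~ {mhz:.0f} MHz ({link})")
```

By this estimate the OP's 1680 x 1050 at 60 Hz sits comfortably inside single-link DVI, while 2560 x 1440 at 60 Hz needs dual-link.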
  6. littlejoelgriffo said:
    It's DVI-D dual link.

    But how is this better than VGA? The monitor seems sharp and smooth in-game, and isn't giving me any issues, e.g. tearing.


    Digital video signals are not significantly superior to analogue video signals at low resolutions (<= 1920x1080), but as resolution and colour depth increase, digital signalling wins out. The analogue signal is attenuated and degraded by the DAC hardware on the transmitter, the cable, and the ADC hardware on the receiver. Each introduces a small error into the analogue symbol, and these errors can compound so that the symbol received by the signal sink is slightly different from the symbol sent by the signal source.

    For example, the signal source might send a colour component with a value of 127 using an analogue voltage level of 0.3472 volts (0.7 volt full scale, 8 bits per symbol, 2.73 millivolts between symbols) but the characteristics of the transmission network may result in this being interpreted as 125 which is slightly less intense than the image source intended.
    If a very low value is sent immediately after a high value (a high-contrast edge), the low value may be pulled upwards due to the cable's internal capacitance and low-pass nature. For example, an intensity of 42 is sent immediately following an intensity of 220; the receiver may interpret these as 218 and 58. The same error occurs when a high-intensity symbol follows a low-intensity symbol.

    These problems are less prevalent at lower pixel clock rates over shorter cables, and more prevalent at higher pixel clock rates over longer cables. Digital symbols can have errors in individual bits, but such bit errors are rare, whereas analogue symbols are always degraded to some extent.
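The round-trip degradation described above can be shown with a toy model: encode an 8-bit colour value as a voltage on a 0.7 V full scale, perturb it slightly "in transit", and decode it back. The 5.5 mV error below is an assumed, illustrative figure, not a measured one.

```python
# Toy model of the analogue round trip described above: encode an 8-bit
# colour value as a voltage (0.7 V full scale, 256 levels, ~2.73 mV per
# step), perturb it slightly in "transit", then decode it back.
# The 5.5 mV error below is an assumed, illustrative figure.
FULL_SCALE_V = 0.7
STEP_V = FULL_SCALE_V / 256  # ~2.73 mV between adjacent symbols

def encode(value):
    """Map an 8-bit intensity (0-255) to an analogue voltage."""
    return value * STEP_V

def decode(voltage):
    """Map a received voltage back to the nearest 8-bit intensity."""
    return max(0, min(255, round(voltage / STEP_V)))

sent = 127
received = decode(encode(sent) - 0.0055)  # ~5.5 mV of attenuation (assumed)
print(sent, "->", received)               # 127 -> 125, as in the example above
```

A noise-free round trip decodes back to the original value; only a couple of millivolts of error is enough to shift the received intensity by a code or two.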
  7. Wow, so many answers! Thanks!

    @i7baby, not very helpful :)
    @silverliquicity, all the article says is "better quality" not very specific :/
    @4745454b abC, thank you very much for the practical answer!
    @Pinhedd didn't understand a word in the last two paragraphs :D
  8. So what you guys reckon is that at less than 1080p (I'm at 1680 x 1050) there isn't going to be any real noticeable difference?
  9. Best answer
    There should be no difference (unless, as I said above, you try watching an HDCP-protected video; then VGA won't even play it). If you were using one of the new 1440 monitors I'd say get DVI. I noticed no difference when I moved from VGA to DVI; I did notice when I went from 1024x768 to 1920x1080. And I wouldn't say i7's advice wasn't helpful: it shouldn't cost a lot to get a DVI cable and see for yourself. Perhaps borrow one from a friend? But I doubt you'll notice any real difference.
  10. DVI is digital and analog... So that means that where digital fails, analog will keep up?

    Personally, I noticed SVGA is much better than HDMI (digital only) on my setup for several reasons: colour, banding, and especially ghosting (running at the same Hz, but blurry).

    My hobby is sim racing, and when I was really enthusiastic about it I could race for three hours at most. Now there's no limit; I can keep going as long as I like, all because the ghosting is gone. I think the banding is mostly down to my TV, but that's why a decent SVGA cable is better at 1080 and lower resolutions in general, IMO.
  11. TBB said:
    DVI is digital and analog... So that means that where digital fails, analog will keep up?

    Personally, I noticed SVGA is much better than HDMI (digital only) on my setup for several reasons: colour, banding, and especially ghosting (running at the same Hz, but blurry).

    My hobby is sim racing, and when I was really enthusiastic about it I could race for three hours at most. Now there's no limit; I can keep going as long as I like, all because the ghosting is gone. I think the banding is mostly down to my TV, but that's why a decent SVGA cable is better at 1080 and lower resolutions in general, IMO.


    Ghosting is a product of the display, not the signalling method.
  12. I guess I meant blur then... Thanks for the correction, but that doesn't help me a lot...
  13. TBB said:
    I guess I meant blur then... Thanks for the correction, but that doesn't help me a lot...


    Loss of contrast (which can cause a sharp image to look blurry) is a characteristic of analogue video transmission, not digital.
  14. Digital is definitely better. Text looks fuzzy and tiresome to look at over VGA, while it looks sharp over DVI.
    At 1024x768 I can't tell; at 1440x900 it is noticeable, and at 1920x1080 VGA looks bad to me.
  15. But I had motion blur when I used HDMI, and it disappeared when I used VGA. It was so bad that my eyes were red and tired after racing for an hour.

    Today I know HDMI is actually poor when it comes to PCs; even VGA is better at 1080p in some cases, like mine. I hope DVI-I beats VGA in more than just image quality, so I know what to buy in the future. It makes sense that DVI-I would be better when you consider when it was invented, and for what, versus VGA; but HDMI was also invented to replace VGA in some cases, like home theatre, and it isn't always better.
  16. TBB said:
    But I had motion blur when I used HDMI, and it disappeared when I used VGA. It was so bad that my eyes were red and tired after racing for an hour.

    Today I know HDMI is actually poor when it comes to PCs; even VGA is better at 1080p in some cases, like mine. I hope DVI-I beats VGA in more than just image quality, so I know what to buy in the future. It makes sense that DVI-I would be better when you consider when it was invented, and for what, versus VGA; but HDMI was also invented to replace VGA in some cases, like home theatre, and it isn't always better.


    DVI-I is not a single interface; it's a combination of DVI-D and DVI-A. DVI-D is electrically compatible with HDMI and is nearly logically identical (HDMI can also transfer data in the blanking periods, such as audio). DVI-A is electrically and logically identical to VGA.
  17. So I assume DVI-I was created so that the best of DVI-A and DVI-D is combined, eliminating the issues of DVI-A with what DVI-D has to offer and vice versa... Otherwise there would be no reason to create DVI-I, which will logically be bigger because it's two technologies combined.
  18. TBB said:
    So I assume DVI-I was created so that the best of DVI-A and DVI-D is combined, eliminating the issues of DVI-A with what DVI-D has to offer and vice versa... Otherwise there would be no reason to create DVI-I, which will logically be bigger because it's two technologies combined.


    It just combines the two sets of signal specifications into one mechanical interface. A Dual-Link DVI-I port on a GPU can be passively adapted to any of the following display interfaces:

    VGA
    DVI-A
    HDMI
    Single-Link DVI-D
    Dual-Link DVI-D

    DVI-I yields the best overall display support, which is why most GPUs are equipped with at least one Dual-Link DVI-I port.
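The passive-adapter compatibility listed above can be sketched as a simple lookup table. The names and structure here are illustrative, not any real API:

```python
# Lookup-table sketch of which interfaces a given GPU port can reach with
# a passive adapter, per the list above. The port/interface names and the
# function are illustrative, not a real API.
PASSIVE_ADAPTERS = {
    "dual-link DVI-I": {"VGA", "DVI-A", "HDMI",
                        "single-link DVI-D", "dual-link DVI-D"},
    "dual-link DVI-D": {"HDMI", "single-link DVI-D", "dual-link DVI-D"},
    "DVI-A":           {"VGA", "DVI-A"},
}

def can_passively_adapt(port, target):
    """True if `port` already carries the signals `target` needs."""
    return target in PASSIVE_ADAPTERS.get(port, set())

print(can_passively_adapt("dual-link DVI-I", "VGA"))  # True: DVI-I has analogue pins
print(can_passively_adapt("dual-link DVI-D", "VGA"))  # False: digital-only port
```

The key point the table captures: a passive adapter only reroutes pins, so it works only when the source port already carries the target's signal type (analogue for VGA/DVI-A, TMDS for HDMI/DVI-D).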