Solved

DVI to VGA question!

Last response in Graphics & Displays
DivineIncarnated
August 28, 2013 5:04:09 PM

I hear DVI is better quality than HDMI for a PC. So here is my question: my PC has a DVI output, but my monitor has only VGA and HDMI ports. Is there a way for me to connect the PC to the monitor and still get DVI quality?


Gam3r01
August 28, 2013 5:09:34 PM

I don't know where you heard that; DVI and HDMI are similar, but HDMI does have the edge. The quality will also degrade slightly going DVI to VGA, so it's better to go HDMI, using a DVI-to-HDMI cable.
August 28, 2013 5:16:42 PM

DivineIncarnated said:
I hear DVI is better quality than HDMI for a PC. So here is my question: my PC has a DVI output, but my monitor has only VGA and HDMI ports. Is there a way for me to connect the PC to the monitor and still get DVI quality?



http://www.newegg.com/Product/Product.aspx?Item=N82E168...

Best solution

Pinhedd
August 28, 2013 5:50:06 PM

DivineIncarnated said:
I hear DVI is better quality than HDMI for a PC. So here is my question: my PC has a DVI output, but my monitor has only VGA and HDMI ports. Is there a way for me to connect the PC to the monitor and still get DVI quality?


You heard incorrectly.

DVI-D and HDMI are digital signals that carry the same pixel data. Since the signal is digital, it either arrives intact or it doesn't arrive at all. There's a small probability of a bit error, but bit errors are stochastic and isolated, so they aren't noticeable.
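
To make that concrete, here's a minimal Python sketch (the voltage levels and noise figures are made-up, illustrative numbers, not real TMDS electrical parameters) of why a digital receiver recovers the exact bits as long as noise stays inside the decision margin:

import random

THRESHOLD = 0.5   # decision point between logical 0 and 1 (notional volts)
NOISE = 0.1       # peak noise amplitude, well inside the 0.5 V margin

def transmit_bit(bit):
    # Send the bit as a nominal 0.0 V or 1.0 V level plus random noise.
    return (1.0 if bit else 0.0) + random.uniform(-NOISE, NOISE)

def receive_bit(voltage):
    # Recover the bit by comparing the sampled voltage to the threshold.
    return 1 if voltage > THRESHOLD else 0

bits = [random.randint(0, 1) for _ in range(10000)]
recovered = [receive_bit(transmit_bit(b)) for b in bits]
errors = sum(a != b for a, b in zip(bits, recovered))
print("bit errors:", errors, "out of", len(bits))  # 0 while noise < margin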

VGA and DVI-A are analog signals. Pixel data is carried by varying the amplitude on three wires, one each for Red, Green, and Blue. For a variety of electrical reasons, the signal amplitude will always deviate slightly from its nominal value. For digital transmissions this usually has no effect, as the variation is rarely large enough to cause a bit error. On analog signals, however, the variation can be large enough for a symbol to be misinterpreted as a neighbouring value. A shift to a neighbouring colour value is usually unnoticeable on its own, but such deviations occur more frequently as resolution and refresh rate increase. A high-resolution display receiving an analog signal will appear washed out compared to the same display receiving a digital signal.
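
Here's the analog side of the same sketch (again with illustrative numbers; the 0.7 V full-scale swing is the real VGA convention, but the 5 mV noise figure is an assumption). One 8-bit colour step is only about 2.7 mV, so noise that a digital link shrugs off lands the sampled value on a neighbouring code:

import random

FULL_SCALE = 0.7          # volts at an 8-bit channel value of 255 (VGA convention)
STEP = FULL_SCALE / 255   # ~2.7 mV per colour step
NOISE = 0.005             # 5 mV of noise: harmless digitally, ~2 codes here

def send_analog(value):
    # Encode an 8-bit colour value as a voltage plus random noise.
    return value * STEP + random.uniform(-NOISE, NOISE)

def sample_analog(voltage):
    # Quantize the received voltage back to the nearest 8-bit code.
    return max(0, min(255, round(voltage / STEP)))

print([sample_analog(send_analog(128)) for _ in range(10)])
# Values cluster around 128 but drift onto neighbouring codes.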

DVI-I (the form factor present on most video cards) combines DVI-D and DVI-A into one socket. The output signal will be either analog or digital depending on which cable is attached (usually provided with the display). DVI-A is electrically identical to VGA, differing only in socket layout. A passive DVI-A to VGA adapter (included with almost every graphics card) allows a VGA cable to be connected to a display adapter that has only a DVI-I output.

For resolutions at or below 1920x1200, HDMI, DisplayPort, and DVI-D are functionally identical. For resolutions beyond 1920x1200, only DisplayPort and DVI-D will work (Dual-Link DVI-D to be specific). VGA (and accordingly, DVI-A) can drive resolutions up to approximately 2048x1536, but colour quality can be atrocious compared to its digital counterparts.
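
As a rough sanity check on those limits, here's a back-of-the-envelope Python sketch. Single-link DVI tops out at a 165 MHz pixel clock and dual-link at 330 MHz; the 12% blanking overhead below is an assumption approximating reduced-blanking (CVT-RB) timings, not an exact timing calculation:

SINGLE_LINK_MHZ = 165.0   # single-link DVI pixel clock ceiling
DUAL_LINK_MHZ = 330.0     # dual-link DVI pixel clock ceiling
BLANKING = 1.12           # assumed ~12% overhead for blanking intervals (CVT-RB-ish)

def pixel_clock_mhz(width, height, hz):
    # Approximate pixel clock needed for a mode, including blanking overhead.
    return width * height * hz * BLANKING / 1e6

for w, h, hz in [(1920, 1080, 60), (1920, 1200, 60), (2560, 1600, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    link = ("single-link ok" if clk <= SINGLE_LINK_MHZ
            else "needs dual-link" if clk <= DUAL_LINK_MHZ
            else "beyond dual-link DVI")
    print("%dx%d@%dHz: ~%.0f MHz -> %s" % (w, h, hz, clk, link))

Under those assumptions, 1920x1200@60 comes out to roughly 155 MHz (just inside single-link), while 2560x1600@60 needs around 275 MHz and therefore dual-link, which lines up with the ceiling described above.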
DivineIncarnated
August 28, 2013 6:39:02 PM

Gam3r01 said:
I don't know where you heard that; DVI and HDMI are similar, but HDMI does have the edge. The quality will also degrade slightly going DVI to VGA, so it's better to go HDMI, using a DVI-to-HDMI cable.


From someone misinformed, apparently. Thanks for the response!
m
0
l
DivineIncarnated
August 28, 2013 6:39:44 PM

Pinhedd said:
You heard incorrectly. [...]

Thanks for the lesson! This really helped explain it to me. :) 
Pinhedd
August 28, 2013 6:52:06 PM

DivineIncarnated said:
Thanks for the lesson! This really helped explain it to me. :)

You're most welcome. I was going to include some math to explain the errors and their effects, but decided that it was too much.