DVI to VGA Usage Question

nick003

Honorable
Feb 10, 2014
30
0
10,530
Hello,

I have a Dell monitor which has two inputs: one VGA and one DVI. The VGA input is being used as a display for a file server.

I needed a second VGA input, so I purchased the appropriate DVI-to-VGA adapter and connected it to the monitor's DVI port. I then connected a security-camera DVR to that second input.

The idea was this: Use the monitor for viewing the security cameras and have the ability to switch over to the server when necessary.

It doesn't work. The VGA port which has the server connected works fine, but apparently the DVI port with the VGA adapter doesn't appreciate this configuration.

Any ideas as to why this doesn't work? I've tested both ports individually and they work fine. Also, with two machines, one VGA and one DVI, I can toggle between the two without a problem.

I'm guessing the issue is with the use of the DVI to VGA adapter. I'd like to understand why this doesn't work.

Thanks in advance.

 
Solution
Because it does not work that way. The adapter converts a DVI-I source to VGA, not a VGA source to DVI.

The adapter does not actually convert the digital DVI signal to an analog VGA signal; all it does is change the pin/wire locations.
On a DVI-I port on a PC, the four pins in the corner already carry the analog VGA signal. If you use a DVI cable, the monitor gets the digital DVI signal; if you use a DVI-to-VGA adapter, it uses those four analog pins to create a VGA source.

I would advise using a KVM switch if you need both sources to be VGA.
[Image: 6K4tr.png (DVI-I connector pinout)]
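For anyone who wants the re-routing spelled out, here is a minimal sketch in Python (my own illustration, assuming the standard published DVI-I and VGA/DE-15 pin assignments, not anything taken from the adapter itself) of what the passive adapter does and why it only works in one direction:

# Passive DVI-I-to-VGA adapter: it only re-routes the analog contacts of a
# DVI-I source to the matching VGA pins; no digital-to-analog conversion.
DVI_I_TO_VGA = {
    "C1": 1,   # analog red              -> VGA pin 1 (red)
    "C2": 2,   # analog green            -> VGA pin 2 (green)
    "C3": 3,   # analog blue             -> VGA pin 3 (blue)
    "C4": 13,  # analog horizontal sync  -> VGA pin 13 (h-sync)
    "8": 14,   # analog vertical sync    -> VGA pin 14 (v-sync)
    "C5": 10,  # analog ground           -> VGA ground pins
    "6": 15,   # DDC clock               -> VGA pin 15 (DDC clock)
    "7": 12,   # DDC data                -> VGA pin 12 (DDC data)
}

def vga_side_result(source: str) -> str:
    # The adapter has no electronics, so the outcome depends entirely on
    # whether the source actually drives those analog contacts.
    if source == "DVI-I output (video card)":
        return "works: the card drives the analog pins, the adapter re-routes them to VGA"
    if source == "DVI-D output":
        return "fails: digital-only port, the analog contacts are not populated"
    if source == "VGA output into a monitor's DVI input":
        return "fails: the monitor's DVI input expects DVI, the adapter cannot turn VGA into DVI"
    return "unknown"

for s in ("DVI-I output (video card)", "DVI-D output", "VGA output into a monitor's DVI input"):
    print(s, "->", vga_side_result(s))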
 

Unkk

Distinguished
Apr 20, 2012
224
0
18,760
Not all DVI ports carry an analog signal; there are three types of DVI connectors:
DVI-D (digital only, single link or dual link)
DVI-A (analog only)
DVI-I (integrated, combines digital and analog in the same connector; digital may be single or dual link)
Your cameras may only have DVI-D output. (The sketch below summarizes which types can feed a passive DVI-to-VGA adapter.)
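To make that concrete, here is a small Python sketch (my own, simply restating the connector list above) of which DVI variants carry an analog signal and can therefore feed a passive DVI-to-VGA adapter:

# Which DVI connector types carry analog, and therefore which can feed a
# passive DVI-to-VGA adapter (which only re-routes analog pins).
DVI_TYPES = {
    "DVI-D": {"digital": True,  "analog": False},  # digital only, single or dual link
    "DVI-A": {"digital": False, "analog": True},   # analog only
    "DVI-I": {"digital": True,  "analog": True},   # integrated; digital single or dual link
}

def passive_vga_adapter_works(dvi_type: str) -> bool:
    # No analog pins on the source means the adapter has nothing to pass through.
    return DVI_TYPES[dvi_type]["analog"]

for name in DVI_TYPES:
    verdict = "can feed a DVI-to-VGA adapter" if passive_vga_adapter_works(name) else "no analog signal, adapter will not work"
    print(name + ": " + verdict)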
 


I have not seen a CCTV DVR with a DVI port. From what the OP said, it sounds like he hooked it up like so:
DVR VGA PORT -> VGA CABLE -> DVI-VGA ADAPTER -> DVI PORT ON MONITOR

 

emmanuelxian07

Distinguished
Jan 22, 2011
150
0
18,710
I am no expert and this is just a guess, but most likely the DVI port still sends the analogue signal to the same VGA module in the monitor that processes the input from the VGA port. So basically, although you're using the DVI port, the signal is still sent to the VGA module, which is already processing input from your server. As far as I know, monitors have two separate modules: one working on analogue signals (via VGA) and the other working on digital signals (via DVI or HDMI). If you have the option, change either the server or the security camera DVR to a DVI or HDMI cable so that one of them transmits a digital signal and the other uses analogue.

I based my answer on what you said about testing both ports and that they seem to work fine individually. I am assuming that when you did the test, you used the same VGA cable plus the DVI-to-VGA adapter to confirm that the DVI port is working fine.

Again, I am not an expert, so I might be wrong. I am just basing my answer on what I recall reading some time ago.
 


No.
At the monitor, the DVI port is only looking for a DVI signal.
Don't give advice based on pure guesses and assumptions about how things are designed.

DVI-I ports are only on video cards and were designed for maximum compatibility/expandability in their limited space.
DVI-I ports just have the pins for both a digital DVI signal and an analog VGA signal (see the picture I posted earlier). You then use a DVI-to-VGA adapter to utilize the VGA signal. Note that it is called a DVI-to-VGA adapter, NOT a VGA-to-DVI adapter.
 

emmanuelxian07

Distinguished
Jan 22, 2011
150
0
18,710
Seems like what you're trying to do isn't going to work.

https://www.lorextechnology.com/support/self-serve/Guide+to+DVR+Ports+and+Connections/2100032

Under DVI Video Output it says, and I quote, "A DVR with a DVI Video Output port can be connected to a monitor that only has VGA inputs using a DVI-to-VGA connector. However, it is not possible to connect a DVR with a VGA Video Output to a monitor's DVI input."

I assumed you had simply unplugged your server and left your DVR connected through the VGA cable + DVI-to-VGA adapter to test that the DVI port was working fine, but it seems I am wrong. The best thing to do is probably to change one of the inputs from VGA to DVI, if that is an option. If it isn't, you might as well ask boosted1g since he seems to know what he's saying. Just seems like he's not that friendly. LOL. Just kidding, boosted1g. Peace! :)
Thanks for sharing your knowledge. A thumbs up for you.