Hello,
I have a Dell monitor with two inputs: one VGA and one DVI. The VGA input is being used as a display for a file server.
I needed a second VGA input, so I purchased the appropriate DVI-to-VGA adapter and connected it to the monitor's DVI port. I then connected that second input to a DVR for security cameras.
The idea was this: Use the monitor for viewing the security cameras and have the ability to switch over to the server when necessary.
It doesn't work. The VGA port with the server connected works fine, but the DVI port with the VGA adapter apparently doesn't appreciate this configuration.
Any ideas as to why this doesn't work? I've tested both ports individually and they work fine. Also, with two machines (one VGA and one DVI), I can toggle between the two without a problem.
I'm guessing the issue is with the use of the DVI-to-VGA adapter. I'd like to understand why this doesn't work.
Thanks in advance.