Hey guys,
- I have two VGA monitors (also support HDMI)
- I have a GTX 650 graphics card, which has 2 DVI-I ports and 1 mini-HDMI port.
When I connect both monitors to the DVI-I ports using either two VGA-to-DVI-I adapters or two VGA-to-DVI-A adapters, the computer detects both monitors, but only ever displays on one of them (the primary). In the NVIDIA control panel, one monitor shows up as VGA-PC (primary) and the other as DVI-PC.
Anyone have a clue as to how to fix this? Or if not, any alternatives (e.g. buy a mini-HDMI-to-HDMI cable)?
Also: after hours of research, I think this may be the source of my problem. Although the second port does have the characteristic four analog pins of a DVI-I connector, I have a sneaking suspicion that it does not actually support analog and is digital only. If you go to the official product page, you can see that it was originally listed as a DVI-D port. Image.
Thanks for your help guys!