Windows 7 - Second Monitor Not Detected

Oct 28, 2017
I just got a new video card, which meant I gave my brother the only other DVI-to-VGA adapter we have so they can use their dual VGA LCD monitors.

It's an AMD Radeon HD 6850 with dual DVI-I ports. I used this card myself recently when my previous card's heat sink broke away from its mounting bracket. I ran it with dual monitors (1920x1080 and 1680x1050), but here we're just trying to get dual 17-inch 1440x900 monitors working.

BOTH monitors are using DVI-I-to-VGA adapters, as the AMD Radeon HD 6850 has no VGA ports, only two DVI-I ports. I know both adapters work because I was using the second one just today for my own second monitor (I didn't have a spare DVI cable lying around for it). The other adapter has been in constant use on my brother's computer for their single VGA monitor.

I don't know why Windows 7 isn't detecting the second monitor, but at least the second monitor isn't showing "Cable not connected".

OS : Windows 7 Ultimate 64-bit
CPU : AMD Athlon II x2 270
RAM : 8GB DDR3
GPU : AMD Radeon HD 6850 (1GB)
 
Oct 28, 2017


I believe it's a Sapphire Radeon HD 6850 1GB model:
http://www.sapphiretech.com/productdetial.asp?pid=A76124BC-FD28-4C45-B14A-4A1BB95F16E7&lang=eng

It would seem that while it worked for dual monitors in my computer, where one monitor was connected with DVI and the other with a DVI-to-VGA adapter, on my brother's setup only the top DVI-I port works with a DVI-to-VGA adapter; the bottom DVI-I port just remains dark, with no image on the second monitor. If I swap the cables around, I get an image on the second monitor but not on the first/main monitor.

Perhaps it just can't do two DVI-to-VGA connections at once, and both monitors are VGA-only.
 
Oct 28, 2017


It's not DVI-D though, or at least the physical port isn't; it's a DVI-I port that accepts a DVI-to-VGA adapter. I'm thinking the Sapphire Radeon HD 6850 only outputs analog on one of the DVI-I ports (the top one).

I know enough about computers to know that DVI-D only outputs digital and, like HDMI, requires an active converter to VGA, which is expensive and not worth it for me. Also, if it were a DVI-D port, it wouldn't accept a DVI-to-VGA adapter at all, since a DVI-D socket is physically incompatible with the pinout of a DVI-I/DVI-A connector.

I just don't understand why they would put two DVI-I connectors on the card if only one of them outputs VGA, unless I'm overlooking something.
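The pinout logic I'm describing can be sketched as a quick lookup (illustrative only; the connector table is standard DVI, but the "connector present yet analog not driven" case is my guess about this card, not something from Sapphire's specs):

```python
# Which connector types physically carry analog (VGA) pins vs digital pins.
# A passive DVI-to-VGA adapter only rewires the analog pins; it carries no
# converter electronics, so it can't help on a digital-only output.
PORTS = {
    "DVI-I": {"digital": True, "analog": True},   # full pinout, incl. analog pins
    "DVI-D": {"digital": True, "analog": False},  # analog pin holes absent
    "DVI-A": {"digital": False, "analog": True},
    "VGA":   {"digital": False, "analog": True},
}

def passive_vga_adapter_works(port_name, drives_analog=None):
    """True if a passive DVI-to-VGA adapter can produce a picture: the port
    must both HAVE analog pins and actually be driven with an analog signal."""
    port = PORTS[port_name]
    if drives_analog is None:
        drives_analog = port["analog"]  # assume the card drives what's wired
    return port["analog"] and drives_analog

# The situation in this thread (hypothetical wiring): two physical DVI-I
# connectors, but only the top one is wired as true DVI-I; the bottom one
# behaves electrically as DVI-D, so the passive adapter shows a dark screen.
print(passive_vga_adapter_works("DVI-I"))                       # top port
print(passive_vga_adapter_works("DVI-I", drives_analog=False))  # bottom port
print(passive_vga_adapter_works("DVI-D"))
```

This is also why swapping the cables moves the picture rather than fixing it: only one port ever drives the analog pins, so only whichever monitor is on that port lights up.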

Lastly, I'd like to mention that this issue happens with both the AMD Catalyst 15.7 driver and the Crimson beta driver (16.4, I believe).
 

Wolfshadw

Titan
Moderator
As I said: despite what the image shows (two DVI-I ports), the specifications state 1x DVI-D and 1x DVI-I. It seems obvious to me that only the top one, as you put it, is the actual DVI-I port. The other has the physical DVI-I connector but only outputs a digital signal (DVI-D).

As to why: my first guess would be that the earliest digital displays shipped with DVI-I inputs so you could connect either a VGA (analog) or a DVI (digital) graphics card output.

-Wolf sends
 
Solution