
My 2nd Monitor can't be detected

Tags:
  • Graphics
  • help
  • Monitors
  • Geforce
  • Displays
Last response: in Graphics & Displays
March 12, 2014 1:10:45 AM

I need some advice on what to try, to get my 2nd monitor working again.

I've been using two monitors with my computer for years. A couple of days ago, after a reboot, my OS would suddenly no longer detect the presence of my 2nd monitor. The cabling and port usage on both my video card and the monitors is identical to the circumstances under which it has worked fine for years.

My 2nd monitor is not bad, nor is the monitor cable. I know this because I can successfully get a display on the 2nd monitor using that same cable attached to a different PC. Further, I can get my 2nd monitor to work on my primary PC as long as I detach the primary monitor. I also know it's not my vid card, because since this happened I bought a brand new one, and after installing it the same thing still happens. Literally, my computer's OS appears to be rejecting the fundamental notion of having two monitors attached. The 2nd monitor works fine during the BIOS POST; it loses signal just as the OS loads. And when the OS comes up, display properties reports only 1 monitor detected... even though the BIOS can detect and use that same monitor just fine, under identical conditions.

Literally, all I've changed is installing some driver software for my motherboard's on-board hardware. I'd do a system restore to reverse the mobo driver changes, except that I no longer have the restore point. Since I installed many drivers at once, I have no idea which might be causing the problem. And I don't want to mass-uninstall them, because then all my mobo devices (USB ports, NIC, etc.) will quit working. I'm hoping to find an easier solution than nuking my OS and reinstalling... because I'm certain there must be one small piece of software causing the problem, and I simply have no idea what it is or how to find it without the extreme headache of many hours of installing and uninstalling core hardware drivers... when I have better things to be doing with my limited free time.
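
Rather than uninstalling the suspect drivers one at a time, the usual trick is to bisect: disable half of them, reboot and check the monitor, then recurse into whichever half contains the culprit. That isolates one bad driver among N in roughly log2(N) reboots instead of N. Here is a minimal sketch of the bookkeeping, assuming exactly one driver is responsible; the driver names are made-up placeholders, and `test_without` stands in for the manual step (disable those drivers in Device Manager, reboot, check Screen Resolution):

```python
def bisect_culprit(suspects, test_without):
    """Return the one driver whose presence breaks monitor detection.

    suspects: list of driver names, exactly one of which is bad.
    test_without(disabled): True if the 2nd monitor is detected
    while the given drivers are disabled (a manual reboot-and-check
    in practice).
    """
    lo, hi = 0, len(suspects)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # Disable the first half of the remaining suspects and test.
        if test_without(suspects[lo:mid]):
            hi = mid  # monitor came back: culprit was in the disabled half
        else:
            lo = mid  # still broken: culprit is in the still-enabled half
    return suspects[lo]

# Placeholder driver names; pretend the chipset driver is the bad one.
drivers = ["amd_chipset", "asus_nic", "usb3_host", "audio_codec", "sata_ahci"]
found = bisect_culprit(drivers, lambda disabled: "amd_chipset" in disabled)
# found == "amd_chipset", reached in ~3 tests instead of 5
```

The same idea works with any way of toggling a driver (Device Manager disable, or removing the package), as long as each test is a clean reboot.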

The only lead I have, is that I've found some old forum posts from 3-4 years ago involving AMD chipset software causing dual monitor problems. But nothing from those old posts gives me enough to develop a solid plan of action.

Any advice will be appreciated.

My System:
OS: Win7 Pro.
VidCard: GeForce GTX 760
CPU: AMD FX-8350 black
Mobo: ASUS Crosshair V Formula-Z
Mobo Chipset: AMD 990FX/SB950

Both cables coming out of my vid card to my two monitors are DVI. Both cables have previously been proven to support both DVI-I and DVI-D. Both inputs on the monitors are DVI. Everything I'm using is digital; there is no VGA-DVI conversion going on anywhere.

March 12, 2014 1:16:27 AM

Hi there,

I assume you've followed the process to extend a desktop using Windows? This will also tell you whether Windows is recognising that another monitor is plugged in.

Regards
March 12, 2014 2:32:16 AM

Yes, I've tried using the built-in Windows function to extend the desktop, and tried forcing the OS to detect more monitors. None of it yields any results. The monitor list in display properties still detects only 1 monitor total.
March 13, 2014 12:30:47 AM

BennyJi said:
Hi there,

I assume you've followed the process to extend a desktop using Windows? This will also tell you whether Windows is recognising that another monitor is plugged in.

Regards


Yes, I've tried that. Do you have any other suggestions?