I have a dual-monitor setup (well, I'm trying to) using an 8800 Ultra. It worked fine for a while, but after about a month of using the monitor and new card together, the second monitor stopped working on my computer. I tried the monitor on my dad's laptop and it worked fine there. I can plug my primary monitor into either port on the video card and it works in both, but the secondary monitor never works (even if I unplug the primary and restart using only the potentially faulty monitor).
Any time I turn the computer on with the Proview plugged in, its power LED flashes on and off fairly rapidly and the screen flickers very dimly. Once I turn the computer off, the LED stops flashing and stays on; since the computer is off, the monitor shows a "No Signal" error for about 5 seconds and then goes into standby.
I am 99% sure I have the drivers installed properly, so that shouldn't be the issue. However, Device Manager lists the Proview as a generic Plug and Play Monitor rather than recognizing it by name. Additionally, three "Default Monitor" entries show up whenever I do a hardware scan. The primary is correctly recognized by its brand, Optiquest.
Card: EVGA 8800 Ultra
Primary: 21" CRT (Optiquest, not sure of the model number; it's old)
Secondary: 19" Widescreen LCD (Proview PL926SWi)
PSU: Antec 550W
I managed to borrow another monitor from my parents and hooked it up, and it works with my computer. How is it possible that the Proview wouldn't work with my computer but the Dell CRT I borrowed does? Additionally, Device Manager detects the Dell monitor correctly as soon as it's plugged in.