First of all, apologies if I somehow missed the answer to this somewhere else. I did quite a bit of poking around before I posted, and didn't see anyone else with this issue.
I'm running a GTX 580 with Win7. My NVIDIA drivers are up to date.
Depending on what I'm doing (i.e. general-purpose computing vs. watching content via MediaPortal), I output to one of two different displays: an ASUS VW266H monitor, or a Panasonic TC-P50GT30 plasma TV. Both of them work perfectly on their own, connected via either HDMI or DVI.
The problem, though, is that my video card won't recognize both displays at once. When I have the monitor connected via DVI and the TV connected via HDMI, only one of them works and shows up in the NVIDIA Control Panel. The only way to switch from one to the other is to shut down the computer, disconnect the one I want to stop using, and power the system back on.
Any ideas why this might be happening? Even if, for some reason, the card is incapable of simultaneously outputting over DVI and HDMI, it should at least be able to detect that multiple monitors are connected, right? That would let me switch between them without having to reboot every time.
Once, I was able to temporarily get an image on both monitors by using the "Rigorous Display Detection" option in the NVIDIA Control Panel, but it was rife with issues. First, both monitors showed up in the Control Panel with generic names, and since they weren't actually being identified, their native resolutions weren't auto-detected. Second, the image output to the TV had a ridiculous amount of green pixel artifacts*. Third, it was temporary: after a reboot, everything went back to the old "what two monitors?" behavior. So clearly, not a good solution.
I was really hoping someone else might have some insight, here. At this point, I'm just starting to wonder if this is an idiosyncrasy of my video card. I'd really hate to have to replace my 580 with something else, because it's still a powerhouse in spite of this issue, but being able to power both displays is pretty important for me.
*This occurred when I tried to run the TV at its native 1920x1080 as well as several other 1080 resolutions, but went away if I dropped the resolution way down to 720 or below. Strangely, though, if the TV is the only display, it handles native resolution just fine. This made me wonder whether I was actually running up against the maximum number of pixels the 580 can push at once (i.e. it can't drive a 1920x1200 LCD and a 1920x1080 plasma at the same time)...
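For what it's worth, I don't think the numbers support the throughput theory. Here's a rough back-of-the-envelope check in Python (assuming 60 Hz refresh on both displays, and using NVIDIA's published spec that the GTX 580 can drive two displays at up to 2560x1600 each):

```python
# Rough pixel-throughput sanity check.
REFRESH_HZ = 60  # assumption: both displays running at 60 Hz

def pixel_rate(width, height, hz=REFRESH_HZ):
    """Pixels per second needed to drive one display."""
    return width * height * hz

# My two displays running together:
lcd = pixel_rate(1920, 1200)     # ASUS VW266H at native res
plasma = pixel_rate(1920, 1080)  # Panasonic TC-P50GT30 at native res
combined = lcd + plasma

# NVIDIA rates the GTX 580 for two simultaneous displays
# at up to 2560x1600 each:
rated_max = 2 * pixel_rate(2560, 1600)

print(f"combined load: {combined:,} px/s")   # 262,656,000
print(f"rated maximum: {rated_max:,} px/s")  # 491,520,000
print(f"headroom: {rated_max / combined:.1f}x")
```

So both displays together need barely half of what the card is rated for, which points back at a detection/EDID problem rather than raw pixel throughput.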
Damn, I was hoping you might have some advice =\. I have tried plugging my HDMI cable into a different TV too, and still nothing. I get a duplicate image of my computer's screen on the TV, yet it still doesn't detect the TV as a secondary display.
Yeah, I am going to try reverting to Windows XP and see if that has anything to do with my problems... or I may have to replace my 580, even though she is a beast.
I am emailing back and forth with Galaxy support, and I will let you know if I find a solution to the problem.