
DVI Issues

July 22, 2010 10:21:09 AM

I received my new custom PC yesterday, delivered without a graphics card or OS. It had been tested with an ATI graphics card, and Windows 7 had been installed, tested, then wiped.

I installed my graphics card (8600GTS) and plugged in my BenQ DVI monitor, but could not get any output on the screen - no POST or anything. I plugged in an older LG DVI monitor instead, and that worked fine, so I reset the BIOS to default settings and installed Windows 7. I then ran all the updates, new graphics drivers etc.

I then plugged in my BenQ as a second monitor, tried to detect it, and Windows blue screened (nvlddmkm.sys, I think). I uninstalled/reinstalled the graphics drivers a few times, and eventually I could plug in and detect the BenQ manually using the Windows screen resolution settings window. However, Windows detected it as a "non-PnP" monitor, but allowed all resolutions (above native too). If I then loaded the Nvidia control panel, it disabled the BenQ and claimed it couldn't find a second monitor (just my LG as primary).

If I turn the PC on with the BenQ connected as the second monitor, all the video appears on the LG; I can then manually detect the BenQ (with the same "non-PnP" issue) - but this time it restricts the resolution to 1280x768 (even though the BenQ's OSD says it's receiving native 1920x1080).

I have tested the BenQ on a different machine, and DVI works fine. I have plugged a different monitor into my PC, and it is detected without any issues. Also, I was running the BenQ and LG as a dual-screen setup on my old PC, with the same 8600GTS, without any issues.

Strangely, if I connect the BenQ through a DVI-to-VGA adaptor, I get a VGA signal on the BenQ without any problems - POST screens, boot screens, and Windows. Windows then detects it as a BenQ without any problems, and I can run dual screens without any issues at all.

So all I can assume is that something about the BenQ -> 8600GTS -> new PC combination is stopping the BenQ being detected correctly... I assume it's a problem with the card or motherboard, as the issue happens before the OS boots.

Does anyone know what might cause this, or has anyone experienced it themselves? Google turns up very little information about it!

July 22, 2010 12:50:42 PM

If one or both connections are DVI-D, you need a DVI-D cable.
If one or both connections are DVI-A, you need a DVI-A cable.
If one connection is DVI and the other is VGA, and the DVI is analog-compatible, you need a DVI to VGA cable or a DVI/VGA adaptor.
If both connections are DVI-I, you may use any DVI cable, but a DVI-I cable is recommended.
If one connection is analog and the other connection is digital, there is no way to connect them with a single cable. You'll have to use an electronic converter box, available as either analog VGA to digital DVI or digital DVI to analog VGA.

http://www.datapro.net/techinfo/dvi_info.html
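Worth adding some background on the "non-PnP" symptom above: over DVI, the PC identifies the display by reading its EDID block across the DDC pins of the connector, and a marginal cable, adaptor, or port can break just those pins while the video link itself still carries a picture. That is exactly the pattern of "monitor shows an image but Windows sees a Generic Non-PnP Monitor." As a rough illustration (not tied to any particular tool - it assumes you've already dumped a raw 128-byte EDID block somehow), here's a minimal sketch of pulling the native resolution out of one:

```python
def native_mode(edid: bytes):
    """Return (width, height) from the first detailed timing descriptor
    of a 128-byte EDID base block - the descriptor where a monitor
    advertises its native mode (bytes 54-71 in EDID 1.3)."""
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID base block")
    d = edid[54:72]                         # first detailed timing descriptor
    width = d[2] | ((d[4] & 0xF0) << 4)     # horizontal active pixels
    height = d[5] | ((d[7] & 0xF0) << 4)    # vertical active lines
    return width, height
```

With a healthy DDC link, this descriptor is where the driver gets "1920x1080 native" from; when the read fails, Windows has nothing to go on and falls back to generic safe modes like 1280x768, which matches the behaviour described in the original post.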