
Difficulty adding DVI to older PC

November 14, 2009 8:33:19 PM

Hi Everyone,

I'm having great difficulty adding a DVI video card to an older workstation. Here's what I have:

- SuperMicro P4DC6 motherboard (dates to 2000), dual Xeon CPUs, 4x AGP (1.5 V) http://www.supermicro.com/products/motherboard/Xeon/860...

- Windows 2000 Professional SP4

- Samsung 205BW monitor w/ DVI and VGA inputs
http://www.samsung.com/me/products/monitor/lcdmonitors/...

I have tried two DVI video cards designed to work in an older AGP slot - the ATI Diablotek and the Matrox G550. Neither will output over DVI, but both will output over VGA. In both cases, I updated the drivers as recommended.

I have updated the system BIOS and checked the cables and monitor on another system (a Mac Pro). I doubt that both cards are faulty, but I've run out of ideas. What is it about an older system like this that would prevent these graphics cards from working in DVI mode? Any suggestions?

Thanks, Jim
November 15, 2009 4:39:19 PM

Hey jmag,

In most cases where this happens, the monitor's stored identification data has become corrupted.

copasetic said:
Turns out the device ID information in the monitor became corrupt somehow. The monitor sent that corrupt information to the video card while the display driver was loading, the video card got confused and switched to VGA output, and the screen stopped receiving signals.



The best recommendation is to unplug the monitor from the power outlet for 5 to 10 minutes, which should clear the corrupted information and allow it to receive signals again.
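
If you want to verify this rather than just power-cycle the monitor: the "device ID information" mentioned above is the monitor's EDID block, a 128-byte structure the monitor sends to the video card over the DVI or VGA cable. Its format is fixed (an 8-byte header, a manufacturer ID, and a checksum byte), so a dump of it can be sanity-checked. Below is a minimal Python sketch that validates such a dump; the dump file name is hypothetical, and you would capture the dump with a separate tool (for example softMCCS on Windows, or get-edid from the read-edid package on a Linux live CD).

```python
# Minimal sketch: validate a 128-byte EDID dump captured from the monitor.
# The file name "monitor_edid.bin" is a placeholder for your own dump.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def check_edid(edid: bytes) -> None:
    if len(edid) < 128:
        raise ValueError("EDID block too short: %d bytes" % len(edid))
    block = edid[:128]

    # A damaged header is the classic sign of the corrupt "device ID"
    # information described in the quote above.
    if block[:8] != EDID_HEADER:
        print("Bad EDID header - the monitor's ID data is corrupt")

    # Byte 127 is a checksum: all 128 bytes must sum to 0 mod 256.
    if sum(block) % 256 != 0:
        print("EDID checksum failed - the block is corrupt")
    else:
        print("EDID checksum OK")

    # Manufacturer ID: three 5-bit letters packed into bytes 8-9.
    word = (block[8] << 8) | block[9]
    mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    print("Manufacturer ID: %s" % mfg)  # should read "SAM" for a Samsung

with open("monitor_edid.bin", "rb") as f:
    check_edid(f.read())
```

If the header or checksum check fails, the EDID really is corrupt, and the long power-off (or a factory reset from the monitor's on-screen menu) is the right first step.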

Does the monitor show anything during POST, or does it just fail to display once Windows loads?
November 15, 2009 7:36:34 PM

Thank you for the help. I have tried unplugging the monitor, but it doesn't help. Also, the monitor shows nothing at all over DVI - not even during POST.