DVI to D-sub not working? (Video card not working?)

grottobill

Distinguished
Oct 30, 2007
6
0
18,510
I just set up my new system, and the video card has two DVI ports. My monitor is old, but I do have an adapter that converts the monitor's D-sub connection to DVI so it fits into the video card. The problem is I can't see anything.

So I tried the monitor, with the adapter still attached, on the computer I'm using now, which has an older card with one D-sub and one DVI connection. I plugged the monitor into the DVI port, and nothing comes up. So now I have two totally different computers (different cards) and two old monitors, and I can't get any sign of life from either card when I use DVI. One other annoyance: when I plug the adapter in, the monitor's light turns orange, so it isn't saying "no connection" or anything like that.

Does my monitor need to be newer even though I have a D-sub-to-DVI adapter? If not, could the adapter be bad?

Hate this stuff ><



Thanks in advance

Bill
 
Yeah, I know what you mean, this kind of thing is a real pain in the a**.
I don't know a lot about it myself, but the first thing that comes to mind is: which connection do you have plugged in?
They will be numbered 1 and 2, and you should be using #1.
Apart from that, I'll see if I can find out whether this is a known issue, just in case you don't get much feedback. Well, it is Friday. :)
Mactronix
 

grottobill

Distinguished
Oct 30, 2007
6
0
18,510
Yeah, I tried both #1 and #2, same thing. Also, on the system that's currently up and running (with the video card that has VGA + DVI), the DVI port does the same thing: nothing. Ahh, I want to install Windows!!!

Also, I've now tried 3 different CRT monitors, but all of them are old, 4+ years I would say.

Thanks for the help, appreciate it.
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
You can't just plug the monitor into a DVI port on a running computer and get a picture. You need to power the PC up with the monitor already connected to the DVI output, because DVI detects what's connected before it outputs a signal, and it usually only does that detection at power-up.

You also need to have the right drivers loaded for your video card. You don't mention which make it is, but go to the manufacturer's website and download the latest drivers.

Also check your video driver settings to see if there's an option to enable DVI or to select between the VGA and DVI outputs (right-click on the desktop and poke around under Properties/Settings/Advanced, or, if it's an nVidia card, under the nVidia display properties).
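If you can boot into Windows on the machine where DVI does work, one way to double-check which outputs the driver actually has enabled is a quick Python script. This is just a sketch of the idea, assuming you have Python installed; it calls the standard Win32 EnumDisplayDevices API through ctypes, so it only runs on Windows:

import ctypes
from ctypes import wintypes

# StateFlags bits from wingdi.h
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x01
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x04

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

def list_outputs():
    # EnumDisplayDevicesW walks the video outputs one index at a time
    # and returns FALSE once the index runs past the last device.
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        print(f"{dev.DeviceName}  {dev.DeviceString}  "
              f"attached={attached} primary={primary}")
        i += 1

if __name__ == "__main__":
    list_outputs()

If the DVI output shows up in the list but with attached=False, it's just a driver setting to flip; if it doesn't show up at all, the driver isn't seeing that output in the first place.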
 

grottobill

Distinguished
Oct 30, 2007
6
0
18,510
Ahh, I didn't realize I needed to have the computer off before the DVI connection would work. That pretty much invalidates most of my tests so far. But I did try the DVI on my new computer in both the #1 and #2 spots while the computer was turned off, then turned it on, and nothing happened.

The video card is an eVGA 8800GTS 320MB. I have been looking around and found that "some cards have a DVI-D (digital only) connector instead of DVI-I (digital/analog), which means that a DVI-to-VGA converter will not work". It doesn't make sense that they would give me an adapter if it's DVI-D, and I just tried the adapter on this computer the proper way (turning it off before connecting the DVI) and it worked. So the adapter works, and the monitor works over DVI on this computer; why won't my new computer show anything? I can't even get to a BIOS screen, and the monitor just sits there with its light orange. Why won't this work ><


Bill
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
Check your BIOS settings to see if there's a preferred graphics adapter option. If your motherboard has onboard graphics, you may need to turn that off.