Dual Monitor Issue.. Please help!

Status
Not open for further replies.

clagger

Honorable
Apr 19, 2012
Hey all, I really hope someone here can help me. I have been running dual monitors for years on my Windows PC. Last night I upgraded to an AMD FX X3 3.6GHz quad core, 8GB (4x2) PC 1600 DDR3 RAM, an ASUS motherboard, and an Nvidia GTX 560 1GB video card. I first booted into the BIOS using the onboard graphics in order to disable it. Once it was disabled I installed my graphics card, hooked up my monitors (both have VGA connections, so I had to use converters to interface with the graphics card, which is the same setup as before), and installed Windows 7 Ultimate 64-bit over top of my existing file system. After installing my motherboard drivers I went to nvidia.com to get the latest drivers for my video card. Everything works fine except I only get a picture on whichever monitor is plugged into the 1st DVI port on the graphics card. What I have tried so far:

I can get either monitor to work by plugging it into the primary DVI port, but not into any of the other three.

I have tried every combination of resolution/color depth/refresh rate.

I have uninstalled and reinstalled the Nvidia drivers.

Installed the actual drivers for the monitors, not just the generic PnP drivers.

*Note: When the 2nd monitor is plugged into any other DVI port, Windows recognizes it and installs the drivers, and the display configuration screen in Windows and in the Nvidia Control Panel shows two monitors. Everything seems correct... the mouse pointer will even scroll off the screen as if it is going onto the second monitor, but I have no image.

At this point I'm thinking the graphics card may have dead ports? But it would be odd for 3 out of the 4 connections to be dead on arrival.
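
In case it helps with diagnosing, here is a rough Python sketch (just a generic illustration using the standard Win32 EnumDisplayDevices call via ctypes, nothing specific to my setup) that dumps what Windows reports for each adapter output:

import ctypes
from ctypes import wintypes

# Flags documented for the Win32 DISPLAY_DEVICE structure.
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32

# Walk every adapter output Windows knows about and print whether it is
# attached to the desktop and whether it is the primary display.
i = 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    print("{}: {} (attached={}, primary={})".format(
        dev.DeviceName,
        dev.DeviceString,
        bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP),
        bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)))
    i += 1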
 
Solution

ricardois

Distinguished
Dec 21, 2011
When you are using DVI to VGA, you must use a DVI port that supports an analog signal. Probably only the first DVI port allows the analog + digital signal and the others support only digital, so yes, to use VGA you need the DVI connector that supports the analog signal.

Which version of the 560 do you have? Can you link it so I can check and confirm whether that is really the problem? Maybe only the first DVI supports the analog signal. Try using a monitor that has a DVI input so you can use the other ports with a DVI cable.
 

clagger

Honorable
Apr 19, 2012
Thanks a lot for the reply! Unfortunately both of my monitors are VGA natively. One is a 24" AOC LCD and the other is an old Dell 19" CRT. Here is a link to the exact card I purchased yesterday. You've gotta love having a Microcenter 5 mins from your house!

http://www.microcenter.com/single_product_results.phtml?product_id=0381792

I still have my old card, an Nvidia GeForce 6800GTS, that I could swap in to see if that fixes the problem, to confirm that it is some sort of issue with the new card. That'll have to wait since I am at work currently... but I could go home for a lunch break if you think trying it would help anything.
 

clagger

Honorable
Apr 19, 2012
I looked at the specifications of the video card and it says 1 DVI-I and 3 DVI-D ports. Do the DVI-D ports only supply digital output, and since I'm using VGA cables the monitors cannot function? My old CRT is obviously only VGA, but here is my LCD:

http://www.newegg.com/Product/Product.aspx?Item=N82E16824160049

It says it has a DVI-D input, so if this is indeed the problem, then I just need to buy a DVI-D cable and hook up the LCD as the second monitor, and that should fix it?
 

ricardois

Distinguished
Dec 21, 2011


Without the four analog pins present on the DVI-D connector, a VGA source cannot be attached. Make sure your adapter is using them: on the DVI cable end there is a flat blade, and there are four analog pins surrounding it; your adapter must have them for VGA to work into DVI-D connectors. Buying another adapter will probably do the job. Be very specific when buying that you need an adapter to connect from a DVI-D output and that it requires all four of the analog pins...
 

clagger

Honorable
Apr 19, 2012
I really appreciate the help. I can try getting a new adapter, but would it make sense that either monitor, with their respective adapters (different ones, and previously working on the DVI connections of my 8800 GTS), both work on DVI port 1 and not on DVI ports 2, 3, or 4?

I guess it would be easier to answer this question so I can fully understand what is going on: Is the first DVI port on my graphics card different from the other 3?

If the answer is yes to that question, then all of this makes sense. If the answer is no, then I'm not sure getting new adapters will help, since both of the current adapters work on the first DVI port. That would lead me to believe there is something wrong with the other 3 DVI ports or that some setting is screwed up... But like I said, it is a fresh install of Windows 7 Ultimate.
 

ricardois

Distinguished
Dec 21, 2011


VGA = analog signal
DVI = digital signal

If you know that, then you know the answer is yes. You said it yourself: the first one is DVI-I and the other ones are DVI-D. DVI-I will work with VGA even without the four extra analog pins, while DVI-D requires them to work.

DVI-D (digital only, both single-link and dual-link)
DVI-A (analog only)
DVI-I (integrated – digital and analog)

Got it? Since DVI-I has both signals integrated, it does not need the extra analog pins, but if you are using DVI-D you need the analog signal via the extra pins.

Read more about DVI connectors here:
http://en.wikipedia.org/wiki/Digital_Visual_Interface
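
If it helps, the rule can be written out as a tiny illustrative Python sketch (just a restatement of the connector list above, nothing specific to your card):

# Which signals each DVI connector variant carries (per the list above).
DVI_VARIANTS = {
    "DVI-D": {"digital": True, "analog": False},   # digital only
    "DVI-A": {"digital": False, "analog": True},   # analog only
    "DVI-I": {"digital": True, "analog": True},    # integrated: digital and analog
}

def carries_analog(connector):
    # A VGA monitor needs the analog signal; per the list above,
    # only the DVI-A and DVI-I variants carry it.
    return DVI_VARIANTS[connector]["analog"]

for name in ("DVI-D", "DVI-A", "DVI-I"):
    print("{}: carries analog = {}".format(name, carries_analog(name)))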
 
Solution

clagger

Honorable
Apr 19, 2012
Awesome, thanks so much man. I'll try this out and let you know if it fixes the problem! I really appreciate your time! I spent about 3 hours last night messing with settings, reading forums, and trying to fix it; it never occurred to me that the ports are different... They should color them differently or something!! Lol

Help from strangers.... It gives you that hometown warm and fuzzy feelin'
 

ricardois

Distinguished
Dec 21, 2011


Hope that was really the problem you were having. When you test it, please leave your feedback here so people know whether that was the real problem; it can always be something else ^^.

thanks.
 

clagger

Honorable
Apr 19, 2012
I went home on lunch and checked out my VGA-to-DVI adapters; they do have the 4 pins for the analog signal, but still no dice. Apparently, even though the ports on the back of the GTX 560 have open pins to connect these types of analog DVI adapters to, they do not output an analog signal.

I found a guy on Craigslist selling a DVI-D cable for $5, which beats Best Buy's price of $25 and Microcenter's price of $45 (wow), so I'm going to meet up with him after work. I'm about 99% sure that I can use that DVI-D cable to hook up my LCD to one of the digital connections and put the CRT on the one port that outputs an analog signal. Then I'll just have to configure Windows or Nvidia's control panel to switch the two in order to make the LCD my primary display (since for some reason the analog DVI connection is primary by default). I'll let you guys know if this fixes it!

 

ricardois

Distinguished
Dec 21, 2011


Make sure the cable really supports the DVI-D analog signal and is sending it to the VGA connector, otherwise it will not work.
 

clagger

Honorable
Apr 19, 2012
My LCD monitor has multiple input options, one being DVI-D and another being VGA. I just never hooked up the DVI-D because it didn't come with a cable, and I didn't feel like buying one when the VGA cable always worked just fine.
 

clagger

Honorable
Apr 19, 2012
It worked! I was able to get the DVI-D cable and use the digital connection from my LCD to the video card, and use the analog connection for the CRT.

I was able to watch my Hulu Plus while losing about 120 rating in solo queue ranked in League of Legends.. haha. Good times.

Thanks for the help!
 