Dual Monitor issues

Status
Not open for further replies.

Tattletailz5

Distinguished
Oct 8, 2011
OK, to set the scene with the facts: I have two monitors, both VGA. On my old XP PC with an ATI Radeon card (non PCI-E) I can run one through the VGA socket on the graphics card and the other through a VGA-to-DVI adapter on the DVI port. Both work and show the splash screen and BIOS on boot-up... so I can assume that the monitors, adapters and cables are all OK.

My new PC with Windows 7 has an Asus F1A75-M board and a 1024 MB AMD Radeon HD 6450 card.
The monitor works through VGA, but the PC will not pick up or detect the second monitor using DVI or HDMI adapters and leads; basically no signal to the second monitor at all. I've looked in all the control panels and the BIOS and found no clues on how to "switch on" a second monitor.

I actually bought another Sapphire Radeon card and have the same issue. Even if I unplug the VGA monitor, the second one will not pick up through HDMI or DVI. That's on both cards, and there's no sign of the second monitor in the ATI control centre or Windows.

Is this a Windows 7 issue? Is it compatibility? Can it be done at all (the sockets are there)? Is it the motherboard?

I would bang the old PC's card in, but it won't fit as it's not PCI-E. Do I need to buy something other than a Radeon card? I have to use mixed sockets, as one monitor has a fixed VGA lead, and (by now) I have adapters and cables to fit the other to DVI-I, DVI-D or HDMI.

All help and advice most welcome!!!!!

 
Are you plugging both monitors into the same card, or is one going into the card and one into the integrated port on the motherboard?

Also, to add a 2nd monitor in Windows 7, you right-click the desktop, choose Screen Resolution, then click Detect. If it finds the monitor, you can then choose to extend the desktop or clone it.
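If Detect doesn't find anything, another thing worth trying is the display switcher built into Windows 7 (the same tool the Win+P shortcut uses); it won't fix a hardware fault, but it rules out a stuck display mode. From Start > Run or a command prompt:

DisplaySwitch.exe /extend   <- extend the desktop across both monitors
DisplaySwitch.exe /clone    <- duplicate the primary display on both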
 

Tattletailz5

Distinguished
Oct 8, 2011
Thanks for the prompt reply. I've plugged both into the same card. As far as I know from general knowledge and looking around, the onboard graphics and VGA output are disabled in the BIOS. I've also tried the second card in the other PCI-E slot and still no joy. With every combination of card and input, I only get the monitor on the VGA port to work. On each permutation I have also tried to detect the second monitor from Windows and the AMD Radeon control centre; it does not find it. As I said before, what remains spooky is that the old card in the old XP PC picks up the second monitor on boot, and even if I hot-connect it with Windows running. No need to even attempt detection.
 

Tattletailz5

Distinguished
Oct 8, 2011
The VGA-to-DVI adapter works on the old XP system and card.
Whether I boot with one monitor or two, and trying both cards, the only output that works is the VGA. The DVI and HDMI give no output even with a single monitor. The ports can't be defective on two new video cards!!!
That's what makes me think it's a setting/enabling issue. Or could the motherboard be defective in some way, even though everything else seems fine?
 
If it doesn't show both monitors at boot-up during POST, it's not going to work at all. It has nothing to do with Windows 7; it's either an adapter or the port/video card.

Make sure both DVI ports on the card work by themselves and test both adapters. Just because something works on an old card doesn't mean it'll work on the new card.
 
If you have an add-on graphics card with two outputs and one of them is working, then it has NOTHING to do with the rest of the computer.

Your problem exists with:
1) your graphics card,
2) cable, or
3) monitor

Testing:
1) Test BOTH monitors on the working graphics card VGA output (should work)
2) Test a working monitor on BOTH VGA outputs (one at a time)

If things work up to this point, it must be an issue with the setup in the Catalyst Control Centre (CCC).
 

Tattletailz5

Distinguished
Oct 8, 2011
Both monitors work through the VGA port, whichever card I put in.
Neither monitor works when I go into the DVI or HDMI (I have cables and adapter plugs for both), whichever card I put in.
The adapters must be OK, as they worked on DVI on the old PC.
 


Don't assume the adapters work with the new computer just because they work with the old one. There are some slightly different technologies that may not be compatible, even though they look almost identical. There are DVI-D and DVI-I adapters. Your old one may be DVI-I, which doesn't work with a lot of newer cards: those passive adapters just pass the analogue signal from the card through to the monitor, and a DVI-D port is digital-only, so there's nothing for the adapter to pass on.
 

Tattletailz5

Distinguished
Oct 8, 2011
When I refer to adapters I mean plugs:
I have VGA to HDMI, VGA to DVI-I and VGA to DVI-D (one of my cards is DVI-I and the other, for some reason, DVI-D dual link, i.e. there are 4 pins around the blade on one and not on the other).
 

Tattletailz5

Distinguished
Oct 8, 2011
Yeah, I thought that, but one card is DVI-D and the other DVI-I. I have both adapter plugs and they don't work on two separate cards, and the HDMI cable I have doesn't work either.

Silly question: where is the CCC located? I only saw it when I ran the installation CD. The only places I can see to check settings are the Windows "display settings" or the AMD ATI control centre advanced settings, and I can't see anything amiss on either (except that they only ever accept or detect one monitor).
 
You can find the CCC after it's installed by right clicking the desktop, and it should be at the top of the menu.

Your problem is hardware, because if both monitors are attached to the same card, it will show both monitors at POST regardless of what you do in Windows (Windows is not running at POST).

The only exception is with HDMI. Many systems cannot run dual-monitor setups when one of the monitors is using HDMI, and yet sometimes it does work. I don't know why.
 

Tattletailz5

Distinguished
Oct 8, 2011
Am I asking the card to do something it won't? i.e. can you run one monitor out of the VGA port and a second monitor through either DVI or HDMI?
I can't believe that two brand-new cards have defective DVI and HDMI sockets! That only leaves the motherboard?
 

Tattletailz5

Distinguished
Oct 8, 2011
If I right-click the desktop, the top item is AMD Vision Control Centre. I can't see any options there. Is it something in HydraVision I select, or anything to do with Hz or resolutions? Detection doesn't pick up the other monitor.
 
I still think it's your adapters.

You said it works in the old XP machine, but that's likely down to the older video card inside it working with your DVI-I adapter. I've never been able to make an HDMI-connected monitor work alongside another (not sure what's up with that), so your test with the HDMI adapter was a no-go. That leaves using the two DVI adapters.

The Test:
Since you obviously don't want to entertain the incompatible-adapter idea, try putting your new video card in your old XP machine and see whether it fails or succeeds.

2nd Test:
Try the DVI-I and DVI-D adapters with each monitor individually, and also try a single monitor, with each adapter, in each of the card's outputs. This will find out whether there is an adapter or output problem, and the exact combination that fails (if it fails).
 


I already let you know that if it's not visible at start-up, it's not software. However, the place I find and detect monitors is like this:
right-click the desktop
click Screen Resolution
click Detect

Edit: one bit of software that is running at POST is your BIOS. Perhaps you should check the BIOS or look for an update if the tests in the previous posts all check out.
 

Tattletailz5

Distinguished
Oct 8, 2011
Yep, I get that: no detection in any software.

I can't do test 1, as the old PC doesn't have PCI-E (hence the upgrade); the cards won't fit in it!
Test 2: any combination of any card with any monitor shows no output through any socket other than VGA.

But hey, I'll entertain any test. I've chucked a few quid at this already and, believe me, if there is a card out there with two VGA ports, or an Nvidia that has been proven to solve such issues, I'd gladly buy it!! I agree, I think it's hardware. As far as I know, dual monitors should both show boot-up, as you rightly say, before Windows even kicks in.
 


Don't go spending money yet.

By your response, it doesn't sound like you did my test 2 the way I tried to describe it.

Do not test each adapter in a combination of 2 monitors.
Take 1 monitor, use 1 adapter, and test it in each of the two plugs on the video card. Does it work? If so, good.
Take the same monitor and use the other adapter, test it with both plugs on the video card. Does it work? If so, good.
Do the same for the other monitor.

I'm betting one of the adapter plugs doesn't work in at least one of the card's outputs.
 
Oh shoot, I didn't realize the card you have has a VGA output and one DVI. Forgive me if I didn't quite understand correctly; is the following right?

So basically you are saying that none of your adapters will work for the DVI connection, even with a single monitor plugged into it?
 

Tattletailz5

Distinguished
Oct 8, 2011
That's correct: even if I want to use a single monitor, I am not getting an output from DVI or HDMI (whichever card is in).

Monitor 1 (hard-wired internally to VGA):
works on the VGA port of card 1 and card 2
through an adapter plug to DVI-D or DVI-I (as my cards are different), no joy
through the same DVI-I adapter on the old PC, all OK

Monitor 2 (VGA female socket on the monitor):
VGA to VGA, all OK
VGA to DVI-I via a complete cable, no joy on either card
VGA to DVI-D via an adapter plug, no joy on either card
VGA to HDMI via a complete cable, no joy on either card

Basically, I have not had any output yet on two separate graphics cards through their HDMI or DVI ports, whether I try to run two monitors or a single monitor, but both cards' VGA ports work fine.
 
Well, that takes the dual-monitor problem out of it. The problem is getting your monitors to work with an adapter on either HDMI or DVI, or maybe a BIOS issue.

I don't know the answer, but you first need to be able to get a single monitor to work off of that DVI connection. The cheapest thing you could try is a new DVI adapter (they usually come with video cards).
 

Tattletailz5

Distinguished
Oct 8, 2011
Thanks so much for the input, chaps. Let me update you: today, in order to at least prove something to myself, I bought an Nvidia GeForce, slotted it in, ran the drivers, and plugged in one monitor via VGA and the other through a VGA-to-DVI-I cable into the DVI-I port. BINGO! All works fab: two monitors on an extended desktop.
Conclusion after using the same monitors and leads on two separate AMD ATI Radeons: there is clearly a conflict, well, there is on my system anyway!!! (Anyone wanna buy an ATI graphics card???) Thanks for the responses; most welcome, and they helped me make my last-resort decision!
 

shuvadeep

Distinguished
Oct 11, 2011
OK, I am new to dual monitor setups. I want to run a dual monitor setup on my old PC. At the moment I have a Core 2 Duo E4500, a Gigabyte G31M-ES2L mobo and an old ATI Radeon 4350 (there is one port for VGA, one for S-Video and one for DVI). I have a Samsung SyncMaster and an AOC monitor (both VGA monitors, not HD) which I want to run in dual mode (not by cloning but by extending). Now, can anyone tell me the exact steps I need to follow to connect the two monitors? My present monitor runs from the VGA port of the graphics card. Do I need to purchase a connector, because the DVI port on my card has more pins than the VGA one?
 