HD 5450 triple monitor problem

luthierwnc

Distinguished
Apr 19, 2013
150
0
18,680
Hello all,

I've scoured previous posts but haven't found anything that quite matches my dilemma. I hope someone here will recognize the issue. Thanks in advance.

Computer:
i5 chip running Windows 7 64-bit
SSD and HDD
6 GB RAM
Sapphire Radeon HD 5450 graphics card
3 monitors -- one HDMI, one DVI, one VGA

I am running three monitors on this for basic office use (no gaming or heavy graphics apps). The DVI and HDMI monitors are connected to the card. The VGA monitor is still connected to the mobo.

On Friday I replaced the VGA cable with a longer one for more desktop flexibility. All three monitors blinked but came back to the way they were after a couple of seconds. I shut the machine down (or thought I did) on Friday. When I got in today, there was a black screen saying I needed to hit Enter to complete the changes. Having no real choice, I did, and Windows started as usual, but only the HDMI and DVI monitors came on.

The graphics card is a light-duty item and I knew it couldn't drive all three monitors on its own. When I plug the VGA monitor into the card, that works, but the DVI is disabled; the Screen Resolution tool shows them as either/or.

I could be way off base, but I think the changes applied at that black screen disabled the Intel graphics adapter that comes with the chip, leaving only the AMD/Catalyst one. The AMD adapter is the only one showing in Device Manager now, so that fits, but I can't swear that's what happened. I ran into what could be a related problem on my home computer, trying to get rid of the Intel graphics after installing an AMD card, but that experience has only gotten me so far with this one.
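In case anyone wants to check the same thing, a quick way to see which display adapters Windows has actually loaded, without clicking through Device Manager, is the stock `wmic` tool. A minimal sketch, assuming a standard Windows 7 install and an ordinary Command Prompt:

```shell
rem List every display adapter Windows currently has loaded, with its
rem driver version and status. If the Intel HD entry shows up here,
rem the onboard iGPU is still active alongside the Radeon.
wmic path Win32_VideoController get Name,DriverVersion,Status
```

On this box it would presumably list only the AMD entry while the Intel adapter is knocked out.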

Long story short, if anyone recognizes the issue or has a better idea for running three monitors on this box, I'd love to hear it. Thanks, sh

 

luthierwnc
Apr 19, 2013
I should have mentioned this is a Dell using the American Megatrends Aptio BIOS, which I've found doesn't always match the layout of other machines.

The most I could do was check the advanced settings and confirm that the Intel Multi-Graphics option was, and still is, enabled.

Device Manager only shows the AMD under Display Adapters. IIRC, the Intel onboard adapter was in there as well at one point, but it has been a while. The 5450 is only supposed to drive two monitors at a time; it has three outputs just to improve the odds that two of them will match whatever monitors you have. One new thing I noticed was that the DVI monitor showed VGA in block letters in the upper right of the screen while it was booting up. It might be nothing.

I did check this out: http://support.amd.com/en-us/search/faq/154

but it only applies to the AMD side of the card, not the Sapphire end. I'm pretty sure this has nothing to do with the card anyway; I need to get the two adapters to play nice.
 

luthierwnc
Apr 19, 2013
Update: I spent a lot of time with a pleasant but not particularly experienced Dell tech. I don't think I ever convinced him that all three monitors had worked perfectly for the last year using two graphics adapters.

Here's what I've done since my last post:
-- Pulled the AMD card, rebooted with the VGA monitor on the mobo, and it worked fine. That also reinstalled the Intel HD 2500 driver under Display Adapters and deleted the AMD adapter.
-- Put the AMD card back in, left the VGA as is, but didn't plug any monitors into the card: blackness.
-- Pulled the card, booted into the Intel graphics, and uninstalled all of the AMD software.
-- Put the card back in with its monitors plugged in. The card's monitors came up fine; Windows put the Catalyst software back as if it had never been uninstalled, but deleted the Intel drivers.

I went back to my original post a year ago and k1114 gave me the right answer then.

http://www.tomshardware.com/answers/id-1837163/monitor-card-ideas-gaming-computer.html

I think if I can just figure out how to keep both adapters in place, I'll be OK. Thanks, sh

PS. My BIOS doesn't have an iGPU option. It does have a multi-monitor option, which is enabled.
 

luthierwnc
Apr 19, 2013
Even in my old age I still don't have quite enough patience to put up with someone who knows less than I do but is trying to be helpful -- especially when he insists on downloading several four-minute drivers after I've repeated that they're already in the downloads folder.

I'd never seen that screen before, but it was a DOS-type message saying I needed to hit Enter to open Windows. I had a client coming in about 15 minutes later, so I got in and prepared for the meeting on two screens. Three are a luxury anyway and I can make do with two. If I can't replicate the old setup pretty soon, I'll just get a USB-to-VGA adapter and soldier on. Fiddling with this is a luxury too.

Somewhere, someone is using a three-monitor setup and whatever they're doing would be fine. sh

 

luthierwnc
Apr 19, 2013
Nope. The computer was on and I unplugged the VGA cable and put in the new one. The monitor came back on immediately and I used it for a few more hours before closing up for the weekend -- at least, I hit the Shut Down command. Not being a gamer, I've never had a computer with multiple GPUs. In that case, how do you have two display adapters that don't cancel each other out? The Intel will drive two monitors and the Radeon card will drive two.

The weird thing is that when I was installing a Radeon card in my home machine for two monitors, I couldn't get rid of the Intel graphics. It took something like seven tries before it stopped reinstalling itself on boot-up.
 

luthierwnc
Apr 19, 2013
Same thing. I pulled the CMOS battery, installed the card, plugged everything in, and on boot-up got a message that the BIOS couldn't reset because the onboard graphics weren't recognized. I pulled the VGA cable, rebooted, and am back to where I was. Still just the one adapter.
 

luthierwnc
Apr 19, 2013
Back at home, my computer shows both the Intel and Radeon adapters in Device Manager. I can't get the office computer to do that. The machines are almost identical, except this one has a better GPU. Strangely, nothing is plugged into the Intel graphics here. sh
 

luthierwnc
Apr 19, 2013
Thanks all. When the card is out, the Intel graphics works fine for two monitors and shows by itself under Device Manager > Display Adapters. When I put the card in, it drives two monitors and shows as the only entry under Display Adapters.

My confusion with the Dell tech was that no matter which driver set we downloaded, we got a message that the hardware was insufficient. I'm going to try reloading those without the card to see if that was the rub. Long story short: when the computer boots, it finds and loads whichever graphics driver matches the hardware and removes the other.

What's strange is that I have an almost identical computer at home running two monitors on a Radeon HD 7750. Both GPUs show under Display Adapters even though the monitors are plugged into the card. I had the mirror-opposite problem with the home computer: the two were always fighting with each other while I tried to get rid of the Intel. Somehow I figured out a way to have them both in the same space.

It is kind of like dropping a coin and having it land on its edge. It can be done, but you can lose a lot of time trying to do it again. I have nothing against doing this the way everybody else does, but that hasn't been easy to figure out either. Maybe the USB-to-VGA active adapter. I'll stop by Best Buy at lunch to see what they have. The reason for the longer VGA cable is that I'll be using it with an old monitor I've built into a DIY teleprompter for podcasts. I need to get that project going.

Cheers, sh
 

luthierwnc
Apr 19, 2013
Update: I got into work, pulled the card, plugged the HDMI and VGA monitors into the mobo, and booted up. Then I downloaded the drivers that wouldn't download with the card in. Worked fine. I had uninstalled the AMD drivers earlier yesterday, but when the card is in it seems to run fine either way.

I guess the big question now is how I can reinstall the card and use three of the four available ports. Plugging in the card automatically removes the Intel graphics from the Display Adapters list.
 

luthierwnc
Apr 19, 2013
I disabled the multi-monitor option in the BIOS, but it didn't make any difference. Turning it back on didn't matter either. In both cases, I got a black-screen warning that I was trying to use ports that didn't match the active GPU.

Let's let this one die. I ordered a USB 3.0 to DVI/HDMI/VGA adapter online this afternoon. That should do the trick. Even if I got the twin GPUs working, it doesn't seem very stable, and I wouldn't want to risk a headache when I really needed it to work.

Thanks to all for your effort and generosity, sh
 

luthierwnc
Apr 19, 2013
Thanks again for helping me with this head-scratcher. The USB 3.0 to VGA adapter works flawlessly. It came with a driver CD, but it worked fine plug-and-play. I got a Plugable unit, if you're in the market. Cheers, sh