HD 5450 triple monitor problem
Solved forum question | Started by luthierwnc | 16 answers
Hello all,

I've scoured previous posts but haven't found anything that quite matches my dilemma. I hope someone here will recognize the issue. Thanks in advance.

Computer:
i5 chip running Windows 7 64-bit
SSD and HDD
6 GB RAM
Sapphire Radeon HD 5450 graphics card
3 monitors -- one HDMI, one DVI, one VGA

I am running three monitors on this for basic office use (no gaming or heavy graphics apps). The DVI and HDMI monitors are connected to the card. The VGA monitor is still connected to the mobo.

On Friday I replaced the VGA cable with a longer one for more desktop flexibility. All three monitors blinked but went back to the way they were after a couple of seconds. I shut the machine down (I thought) on Friday. When I got in today, there was a black screen saying I needed to hit Enter to complete the changes. Having no real choice, I did, and Windows started as usual. But only the HDMI and DVI monitors came on.

The graphics card is a light-duty item and I knew it couldn't drive all three monitors in legacy mode. When I plug the VGA monitor into the card, that works but the DVI is disabled. The screen resolution tool shows either/or.

I could be way off base, but I think the black-screen changes disabled the Intel graphics adapter that comes with the chip, leaving only the Catalyst. The AMD is the only adapter showing in Device Manager now, so I can't swear that was it. I had what could be a related problem getting rid of the Intel graphics during an AMD install on my home computer, but that has only gotten me so far with this one.

Long story short, if anyone recognizes the issue or has a better idea for running three monitors on this box, I'd love to hear them. Thanks, sh


October 15, 2014 12:20:57 PM

Thanks again for helping me with this head-scratcher. The USB 3.0 to VGA adapter works flawlessly. It came with a CD but it worked fine plug-and-play. I got a Pluggable unit if you are in the market. Cheers, sh
October 8, 2014 1:34:57 PM

I disabled the multi-monitor option in the BIOS, but it didn't make any difference. Turning it back on didn't matter either. In both cases, I got a black-screen warning that I was attempting to use ports that didn't match the GPU.

Let's let this one die. I ordered a USB 3.0 to DVI/HDMI/VGA adapter on the net this afternoon. That should do the trick. Even if I got the twin GPUs working, the setup isn't very stable, and I wouldn't want to risk a headache when I really needed it to work.

Thanks to all for your effort and generosity, sh
October 8, 2014 10:37:34 AM

The only reason it's not showing up is that it's being disabled. The iGPU multi-monitor setting is the only BIOS option affecting that. Check the option again; turn it off and back on.
October 8, 2014 6:19:05 AM

Update: I got into work, pulled the card, plugged the HDMI and VGA monitors into the mobo and booted up. Then I downloaded the drivers that wouldn't download with the card in. Worked fine. The AMD drivers were uninstalled earlier yesterday, but when the card is in, it seems to work fine with or without them.

I guess the big question now is how can I reinstall the card and use three of the four available ports. Plugging in the card automatically removes the Intel graphics from the Display Adapter list.
October 8, 2014 5:05:27 AM

Thanks all. When the card is out, the Intel graphics work fine for two monitors. The Intel graphics shows in the Device Manager > Display Adapters by itself. When I put the card in, it drives two monitors and it shows as the only thing in the Display Adapters.
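If anyone wants to double-check the same thing without opening Device Manager, Windows can list its video controllers from the command line. This is just a rough sketch of mine, not anything from Dell tech: it assumes Python is installed on the box, while `wmic` itself ships with Windows 7.

```python
# Minimal sketch (an assumption, not from the thread) of checking which
# display adapters Windows reports -- the same list Device Manager shows
# under "Display adapters".
import subprocess

def parse_adapters(wmic_output):
    """Turn `wmic path win32_videocontroller get name` output into a
    list of adapter names, dropping the 'Name' header and blank lines."""
    lines = [ln.strip() for ln in wmic_output.splitlines()]
    return [ln for ln in lines[1:] if ln]

def list_adapters():
    # Windows-only: query WMI's Win32_VideoController class via wmic.
    out = subprocess.check_output(
        ["wmic", "path", "win32_videocontroller", "get", "name"],
        universal_newlines=True)
    return parse_adapters(out)

# With the card in, the problem described above looks like one entry:
sample = "Name\nAMD Radeon HD 5450\n\n"
print(parse_adapters(sample))  # -> ['AMD Radeon HD 5450']
```

If both GPUs were active, you'd expect two names in that list (the Intel and the AMD), matching what my home machine shows.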

My confusion with Dell tech was that no matter which driver set we downloaded, we got a message that the hardware was insufficient. I'm going to try to reload those without the card to see if that was the rub. Long story short, when the computer boots up, it finds and loads whichever graphics driver matches the hardware and removes the other.

What's strange is that I have an almost identical computer at home running two monitors on a Radeon HD 7750 card. Both GPUs show in the Display Adapters even though the monitors are plugged into the card. I had the mirror-opposite problem with the home computer: the two were always fighting each other as I tried to get rid of the Intel. Somehow I figured out a way to have them both in the same space.

It is kind of like dropping a coin and having it land on its edge. It can be done, but you can lose a lot of time trying to do it again. I have nothing against doing this the way everybody else does, but that hasn't been easy to figure out either. Maybe a USB-to-VGA active adapter is the answer; I'll stop by Best Buy at lunch to see what they have. The reason for the longer VGA cable is that I'll be using it with an old monitor that I've built a DIY teleprompter around for podcasts. I need to get that project going.

Cheers, sh
October 7, 2014 8:53:02 PM

Both GPUs need to show in Device Manager. So you can't get the Intel to work at all? Try reinstalling the Intel drivers.
October 7, 2014 1:46:02 PM

Back at home, my computer shows both the Intel and Radeon adapters in Device Manager. I can't get the office computer to do that. The machines are almost identical, except this one has a better GPU. Strangely, nothing is plugged into the Intel graphics here. sh
October 7, 2014 12:38:38 PM

Same thing. I pulled the CMOS battery, installed the card, plugged everything in and on boot-up got a message that the BIOS couldn't reset because the onboard graphics card wasn't recognized. I pulled the VGA, rebooted and am back to where I was. Still just one adapter.
October 7, 2014 12:22:10 PM

Clear your CMOS and try running three screens. This will just reset your BIOS, the same as after that unknown message on startup.
October 7, 2014 12:15:29 PM

Nope. The computer was on when I unplugged the VGA cable and put in the new one. The monitor came back on immediately, and I used it for a few more hours before closing up for the weekend -- at least, I hit the Shut Down command. Not being a gamer, I've never had a computer with multiple GPUs. In that case, how would you have two display adapters that don't cancel each other out? The Intel will drive two monitors and the Radeon card will drive two.

The weird thing is that when I was installing a Radeon card on my home machine for two monitors, I couldn't get rid of the Intel graphics. It took about seven tries before it stopped reinstalling itself on boot-up.
