Monitor loses signal after Standard VGA Graphics Adapter drivers are updated.

SubtleMeatball

Hello, and thank you for taking the time to read this. I have been having issues with my computer ever since I upgraded a few parts of it on Saturday. What I believe to be the main issue at the moment is that the Standard VGA Graphics Adapter cannot start and shows code 10. I have tried numerous times to update its drivers, but when the update finishes, my monitor loses connection to my computer and I have to reboot into safe mode and do a system restore. Any help would be greatly appreciated!
If it helps, I currently have an Intel Core i5-4690K processor, an MSI Z97 Gaming 5 motherboard, and an Nvidia GeForce GTX 660 Ti graphics card.
 

dotaloc

Sorry to ask the tough questions...

Is the monitor plugged into the graphics card or the motherboard? If a graphics card is detected, the on-board graphics may be automatically disabled.

If the standard VGA adapter IS the Nvidia card, you'll probably want to install the drivers from the Nvidia website.

We'll also need to know whether you are trying to use both the on-board graphics AND the Nvidia card at the same time... that's probably a bad idea, if it's even possible.
 

SubtleMeatball



I have already installed the latest drivers for my Nvidia graphics card.
The monitor was plugged into the graphics card. I tried to install the drivers for the standard VGA adapter again, and this time, when the screen went blank, I plugged the monitor into the motherboard. That made it work, but using what I'm guessing is the integrated Intel(R) HD Graphics 4600, as that is what DirectX is telling me. I only want to use the Nvidia graphics card, but when I do the standard VGA update it doesn't seem to let me use it.
Another thing to note: before I did the update, when my monitor was plugged into the graphics card, Desktop Window Manager failed to start whenever I booted my computer, and any game I tried launching would stop working before it even opened. Now that the monitor is plugged into the motherboard and using the Intel HD graphics, they open up fine, but I obviously don't want to use that as my graphics card.
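In case it's useful, this is roughly what I've been running to check which adapters Windows sees and whether one is reporting an error. It's just a rough sketch on my end; it shells out to the built-in wmic tool, so it only works on Windows, and the names will be whatever the system reports:

```python
# Rough sketch: list the video controllers Windows sees and their status.
# Relies on the built-in wmic tool, so this only works on Windows.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,Status,ConfigManagerErrorCode"],
    capture_output=True, text=True,
)
print(result.stdout)
# This should list both the Intel HD 4600 and the GTX 660 Ti;
# a ConfigManagerErrorCode of 10 matches the "this device cannot start" error.
```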
Thanks for your help!

 

dotaloc



Sorry for the delay. I got busy after lunch. With work.

It almost sounds like both of your display adapters (the Intel one built into your processor and your dedicated Nvidia add-in card) work and are vying for control. While it may technically be possible to use both (with the same monitor, assuming your monitor has multiple inputs), there isn't much reason you'd want to use the Intel one with the Nvidia on hand.

I'm guessing you'll need to try one of a few things that may help, in no particular order...

1 - Install the proper drivers for your processor's integrated graphics. Maybe getting the 'Standard VGA Adapter' to be recognized as the Intel HD 4600 will make it behave correctly.
2 - Disable the Intel graphics in Device Manager (right-click the 'Standard VGA Graphics Adapter' and choose 'Disable'); there's a command-line sketch below that does the same thing.
3 - Disable the Intel graphics in your BIOS. You may have to reference your motherboard's user manual, as the terminology in there can get a little subtle. For instance, my BIOS gives options of Auto, PGA*, PCI-E, or both for graphics... presumably PGA is my on-board graphics, in my situation. If I choose 'both', I can see two display adapters in Device Manager.

*That label may be incorrect; I'm not restarting to see exactly what it says, but I do know it's something less than obvious even for someone who is pretty comfortable with computer hardware/software and terminology.
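Also, if clicking through Device Manager for option 2 gets tedious (say, because the screen keeps blanking), something roughly like this should list the display adapters and their instance IDs from the command line. Just a sketch on my part; it assumes a Windows version that has the PnpDevice PowerShell cmdlets (8.1 or 10), and the instance ID in the comment is only a placeholder, not your actual device:

```python
# Sketch: list display adapters and their instance IDs via PowerShell.
# Assumes Windows 8.1/10, where the Get-PnpDevice cmdlet is available.
import subprocess

list_cmd = ("Get-PnpDevice -Class Display | "
            "Format-List FriendlyName, InstanceId, Status")
result = subprocess.run(["powershell", "-Command", list_cmd],
                        capture_output=True, text=True)
print(result.stdout)

# Once you know the Intel adapter's InstanceId, disabling it from an
# elevated PowerShell prompt would look like this (placeholder ID):
#   Disable-PnpDevice -InstanceId "PCI\VEN_8086&..." -Confirm:$false
```

That's only doing what the right-click 'Disable' in Device Manager does, so use whichever is easier.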

Good luck!