Was using my motherboard's integrated graphics thinking it was my dedicated GPU, and when I plug into my GPU it black screens
I got a PC two months ago. I'm a PC noob and didn't realise I was running off my motherboard's integrated graphics instead of the GPU I have installed. I only realised today, while cleaning my PC, that I'd never plugged the HDMI cable into my GPU. So I did, started my PC, and it loaded up. While I was trying to adjust the resolution to fit the screen, it went black. I tried restarting and it didn't work; I tried plugging back into the motherboard output and that didn't work either. The PC is running, and when it starts I can see the MSI logo load, but then it goes to a black screen again. Does anyone know what I did wrong?
Make sure to disable integrated graphics in the BIOS!
Here's how (these are the steps I got from the TechWalla website):
Restart your computer. While the computer is running the POST (Power-On Self-Test), look for a message indicating which key to press to access "Setup" or the "BIOS." On most computers, the key to access the BIOS will be "ESC," "F1," "F2," "F10" or "DEL."
Use the arrow keys on your keyboard to navigate the BIOS, following any on-screen instructions for changing values. Find the menu item that most closely matches "Integrated Peripherals" or "On-Board Devices." The setting to disable the on-board graphics may also be under the "Advanced" option. Highlight the appropriate menu item and press "Enter."
Highlight the option that controls the on-board graphics and press "Enter," then press the appropriate key to select "Disable." If there is an option to enable an external video card, set it to "Yes" or "Enable." If the options for the video card are "AGP," "PCI-Express" or "Integrated," choose the type of slot your new video card is installed in. Press "Enter."
Press "Esc" until you have backed out of all menus and are at the main BIOS screen. Select the option or press the appropriate function key to "Save and Exit." Press "Enter" to confirm. The computer will reboot and you may switch the monitor cable to the new video card output port.
You say you're new at this, you provide no details on what you did, and you didn't mention installing new video card drivers after connecting the card, so I conclude this is what happened:
You attempted to 'fix' the resolution and somehow set a resolution or refresh rate your monitor can't handle, so now you see nothing. Here's the first thing I'd do: see if you can reset the monitor's settings to default. You might need to look up your monitor model online for this. If that works, great; if not...
Pop the CMOS battery out of the motherboard for a minute, then put it back. It's the round, silvery coin-cell on the motherboard. Alternatively, if you can get to your motherboard's documentation online, using another computer or something, find out how to use a 'jumper' to reset the motherboard's BIOS. If you can get it back to default settings, you may be able to use the integrated graphics again. Of course, this means disconnecting the video card from the monitor for now.
In case you can't tell, this is all about getting your computer and monitor back to default settings. Then you can start fiddling with your video card again.
mjslakeridge said: When you remove the CMOS battery as suggested above, also unplug the computer from the power outlet (wall socket).
You know, I sometimes just assume people will do certain things without being told, but you have the right approach. Especially when someone says they're new to this, NOTHING should be assumed. I'm going to try to remember that.