Hi,
I've had a heck of a time with my graphics card causing my PC to crash and I'm hoping there's something I'm missing.
I built my PC in March 2017 with the following specs:
Windows 10 Pro 64-bit
Intel Core i5-7500 @ 3.40 GHz
16 GB RAM
ASUS PRO GAMING Motherboard
AMD Radeon Sapphire Nitro+ RX 480 Graphics, 8 GB
250 GB Samsung 850 EVO SSD
1 TB WD Blue HDD
While playing A Hat in Time several weeks ago, the game suddenly crashed and took my PC down with it, filling the screen with a random solid color each time. Before long, the same random crashing started happening as soon as I booted into Windows. I troubleshot this extensively and eventually ended up sending my card in for an RMA.
I just received a replacement card this week. After installing it, everything worked: no crashes, and A Hat in Time ran without issue. A few days later, I was playing the game when it crashed to a black screen and the PC became unresponsive. Once I rebooted, both of my monitors kept searching for an input as if they weren't connected to anything. I had to switch to my integrated graphics output, and when I went into the UEFI settings, it showed that my graphics card wasn't detected.
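In case it's relevant: once I'm back in Windows on the integrated graphics, I've been checking whether the OS still enumerates the card at all. Device Manager shows the same thing; this is just a rough sketch of a quicker way to rerun the check, assuming the third-party Python "wmi" package (pip install wmi):

    # Rough check of which display adapters Windows currently enumerates.
    # Assumes the third-party "wmi" package (pip install wmi) on Windows.
    import wmi

    c = wmi.WMI()
    for gpu in c.Win32_VideoController():
        # Adapter name, driver version, and current status
        print(gpu.Name, gpu.DriverVersion, gpu.Status)

If only the Intel integrated adapter shows up in that list, the card has dropped off the bus as far as Windows is concerned.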
I then moved the card to the first PCIe slot (it had been in the second slot before this, since I'd had trouble removing the screws covering the top slot) and made sure the card was seated correctly and the power cables were snugly plugged in. After setting everything back up, I booted and my PC detected the card just fine.
After a few hours of general use, I launched A Hat in Time and the game crashed my PC to a black screen before I even saw the title screen. Since rebooting, I'm back to both monitors acting as if there's no input, and I have to use the integrated graphics outputs again.
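Something I plan to check after the next crash: the System event log for Kernel-Power event ID 41 entries, which is what Windows logs after a machine goes down without a clean shutdown. A quick way to pull the last few, assuming nothing more than the built-in wevtutil:

    # Pull the five most recent Kernel-Power (event ID 41) entries from the
    # System log; these mark reboots that weren't preceded by a clean shutdown.
    # wevtutil ships with Windows; run from an elevated prompt if it errors out.
    import subprocess

    query = "*[System[(EventID=41)]]"
    result = subprocess.run(
        ["wevtutil", "qe", "System", f"/q:{query}", "/c:5", "/rd:true", "/f:text"],
        capture_output=True, text=True,
    )
    print(result.stdout)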
I know the graphics card is receiving power because its lights come on, and it obviously worked right up until I launched the game. I don't understand how launching a non-demanding game is A) crashing my PC and B) killing all display output from the graphics card. I had Open Hardware Monitor running during the last launch of the game, and it showed the GPU at about 49 degrees Celsius right before the crash.
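If it helps, Open Hardware Monitor publishes its sensor readings over WMI while it's running (under root\OpenHardwareMonitor, as far as I understand), so I can log the GPU temperature to a console right up until the moment of a crash instead of just glancing at the window. Rough sketch, again assuming the third-party Python "wmi" package:

    # Log GPU temperature once a second from Open Hardware Monitor's WMI data.
    # Assumes OHM is running (it publishes sensors under root\OpenHardwareMonitor)
    # and the third-party "wmi" package (pip install wmi) is installed.
    import time
    import wmi

    ohm = wmi.WMI(namespace="root\\OpenHardwareMonitor")
    while True:
        for sensor in ohm.Sensor():
            if sensor.SensorType == "Temperature" and "GPU" in sensor.Name:
                print(sensor.Name, sensor.Value, "C")
        time.sleep(1)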
Sorry for the wall of text, but this truly has me baffled and I wanted to explain it as fully as possible. I can't imagine the replacement card being faulty right out of the box.
TL;DR: Sent my graphics card in for an RMA. With the replacement card, a game crashes my PC to a black screen and the card then stops outputting any video, so I have to fall back to integrated graphics.