Main monitor flashes black for a second and GPU driver fails

kpagcha

Honorable
Jun 3, 2014
58
0
10,640
I occasionally get a black flash lasting a second, perhaps two, and then Windows displays a message saying the GPU has encountered an unexpected error. This happens mostly when watching videos but never when playing games.

I am on Windows 7 64-bit and have a dual-monitor setup. Both monitors are connected to the GPU (an NVIDIA GeForce GTX 1060): the main one (4 months old) through an HDMI-to-DVI adapter, and the old monitor through a VGA-to-HDMI adapter.

I read that this happened to some people with two monitors and that they solved it by clean-uninstalling the GPU driver and reinstalling it. I did this using Display Driver Uninstaller and then installed the most up-to-date driver for my GPU.

Not only did this not solve the problem, but now my secondary monitor cannot be configured with its proper resolution (1440x900).

How can I solve the failing driver issue? How can I set my secondary monitor to its former, proper resolution?
 
Usually monitors "inform" the GPU of their supported resolutions, and the GPU then lists those as available. However, if you're connecting that older monitor through VGA (HDMI to VGA), you're not getting that type of communication. So you have a couple of options available:

1) Connect the older monitor with something different than VGA, then check if the monitor has 1440x900 as an available resolution.
2) In the resolution part of the display driver, select/de-select the option (if available) to show all "unsupported" resolutions. Then select 1440x900.
3) Install a driver for your monitor, if available from the monitor mfg.
4) Buy a new monitor with full 1080p resolution. You can buy 22-24" LCDs for around $100 USD.

Of all the possibilities, I'd go with buying a new monitor. :)
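To make the "monitors inform the GPU" part concrete: that information travels as an EDID data block over the monitor cable's DDC lines, and part of it is a list of 2-byte "standard timing" descriptors. A bare VGA path with no DDC wiring is exactly what keeps this data from arriving. Here is a minimal decoding sketch (the sample bytes are hypothetical; byte layout per EDID 1.3):

```python
# Decode one EDID "standard timing" descriptor (2 bytes). This is part of
# how a monitor advertises its supported resolutions to the GPU.
# Aspect-ratio codes follow EDID 1.3; sample bytes below are illustrative.

ASPECT = {0b00: (16, 10), 0b01: (4, 3), 0b10: (5, 4), 0b11: (16, 9)}

def decode_standard_timing(b1: int, b2: int):
    """Return (width, height, refresh_hz), or None for an unused slot."""
    if b1 == 0x01 and b2 == 0x01:
        return None                          # 0x0101 marks an unused slot
    width = (b1 + 31) * 8                    # horizontal active pixels
    num, den = ASPECT[(b2 >> 6) & 0b11]      # aspect ratio from top 2 bits
    height = width * den // num
    refresh = (b2 & 0x3F) + 60               # bottom 6 bits store (Hz - 60)
    return width, height, refresh

# 0x95 -> (0x95 + 31) * 8 = 1440; aspect code 00 (16:10) -> 900; 60 Hz
print(decode_standard_timing(0x95, 0x00))  # -> (1440, 900, 60)
```

So a monitor whose EDID carries the bytes 0x95 0x00 is advertising exactly the 1440x900@60 mode in question; without DDC, Windows never sees it.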
 

kpagcha
I said that up until this issue happened, my second monitor was working just fine at its native 1440x900. I can't really connect anything differently, as I already spent quite some time getting everything to work. It's just that now it's not working anymore because of the damn black flashing...

What do you mean, install a driver for my monitor? It's connected to the GPU, so isn't that its driver? Or am I missing something?

In any case, what about the screen flashing issue? How'd I go about that?
 


Right, but at some point it lost the original settings which told Windows what monitor type it was & what resolution it supports. You might try switching the output adapters, if the older monitor also supports DVI. If not, test the HDMI-VGA converter with your main monitor to confirm it's okay (assuming it also has a VGA input in addition to DVI).



Older monitors sometimes were shipped with a driver that identified the monitor & its capabilities to Windows. For example, Dell used to include CDs for their 17" & 19" LCDs. So Windows would show specific model info for it rather than just "Generic PnP monitor". If you have something like that, that could help to get the right resolution. If not, then check to see if you can find the "Unsupported" resolutions.



That does indicate a driver crash. Since games aren't crashing & it's mainly video, check to see if you need to update your video player (like VLC, Flash, or whatever) as well as related video codecs. You may even want to switch to VLC for video playback if that is stable for you.
 

kpagcha

Thank you for answering!



I can't switch anything. The odd combination of adapters I have is there for a reason: it is the only combination that works given the ports my monitors have and the ports my GPU has. (I could connect the second monitor to the motherboard, but then its native resolution simply would not work; I found that out when I configured it all some time ago.)



Both monitors are listed as generic ones in the Device Manager. In the resolution configuration, accessed by right-clicking the desktop, the main monitor is listed as BenQ GW2470, and my second monitor as Optoma XGA (I don't even know what this is; my monitor is an HP). They were both listed that way before and it was working. So what are you suggesting I should do, given that the monitor is detected as generic PnP?



Oh, I didn't think about that. It doesn't happen with VLC, although I don't use it intensively on this PC. It does happen when watching videos in Firefox, and I would be confident enough to say it only happens with Firefox. I switched from Chrome to Firefox several months ago and I am absolutely positive this flashing issue did not happen with Chrome. So how would I go about it?
 
OK, so first try to remove the "Optoma XGA" monitor. Optoma are usually known for their projectors, so likely Windows thinks you have that projector connected. That's why the resolution isn't working. So remove the "Optoma" as a Monitor (go into Device Manager and look under Monitors; then scan for new devices again), and then you should get the right one (even "Generic PnP" should work).
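As a side note on why the misdetection matters: XGA is the old 1024x768 standard, which is 4:3, while the HP panel's native 1440x900 is 16:10. A quick (purely illustrative) aspect-ratio check shows the mismatch that makes a forced mode look stretched or distorted:

```python
from math import gcd

def aspect(w: int, h: int):
    """Reduce a resolution to its simplest aspect ratio."""
    g = gcd(w, h)
    return (w // g, h // g)

# The HP panel's native mode vs. the 4:3 XGA mode Windows assumes
# when it misidentifies the monitor as an "Optoma XGA" projector:
print(aspect(1440, 900))   # -> (8, 5), i.e. 16:10
print(aspect(1024, 768))   # -> (4, 3)
```

Scaling a 16:10 desktop onto a display Windows believes is 4:3 is exactly the kind of thing that produces a distorted image.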

In regards to the video, try removing the Flash Player plugin in Firefox if you're having problems with Flash video. Then the website might serve the video through HTML5 instead. Also, update Firefox if it's not up to date; Firefox Quantum 64-bit is up to version 59 as of this writing.
 

kpagcha



How can I remove the monitor? Remove it from where? Also, as I mentioned, both monitors are listed as "Generic PnP Monitor" in the Device Manager (well, two entries show up), so I don't know which one is which.



Yes, that is my current version of Firefox. As for extensions, I have OpenH264 and Shockwave Flash (version 28.0 r0). Is this last one the one I should remove? But it is set to "always ask to enable".
 

In the Display Adapter properties (or Advanced properties), you should see a Monitor tab. From there, you can change the monitor to a different one (like Generic PnP), or you can adjust the settings for the existing one. You should also see the checkbox that hides unsupported modes; that could let you select 1440x900 resolution again.



If you're having problems with Flash video, then remove the Flash player. It's a big ongoing security hole as well, so you'll be doing yourself a favor. If you need the Flash player, then you could use Chrome if that worked for you previously.
 

kpagcha



I am in Advanced properties > Monitor for my secondary monitor, the one weirdly listed as Optoma XGA. But I can't change the monitor; there is no dropdown to choose from anywhere. In any case, in this tab the monitor now shows as "Generic PnP" instead of "Optoma XGA". Also, the checkbox you are talking about is locked, so I can't enable showing unsupported resolutions. I have also tried creating a custom resolution in the NVIDIA Control Panel and applying it to that monitor, but the image came out distorted.
 


If it's Flash video, remove the Flash player in FF, and see if the website shows you the video in a different format. If it's not Flash, then you need to identify what type of video it is, and then update the corresponding video codec.

Alternatively, use Chrome for that video (maybe it uses the VP9 codec) if that worked for you before.
 
Solution