So this is a strange problem I've been having since today, and I can't find anything like it anywhere on the web. I think I'm going to have to tell a story to make it understandable.
My monitor (Hannspree M19W4 in Device Manager) blacks out and displays 'no signal input' after Windows boots up. I know the PC is still running because if I type in my password from memory it makes the login sound, and various other sounds, and if I press the power button it makes the Windows shutdown sound. The Hannspree displays fine up until that point, and it works in Safe Mode. A normal Windows startup also displays fine on another monitor.
Now this is the weird part. If I start up with another monitor and then unplug it and plug in my Hannspree, or if I uninstall the display drivers, then the Hannspree works. If I go to display settings, the computer identifies my Hannspree as either the replacement monitor or as a generic monitor. If I then press 'Detect', or scan for hardware changes, the screen goes black and Windows makes the noise as if I had unplugged a device. If I then plug another monitor back in without restarting the PC, it won't display on that monitor either, which I imagine is because it still thinks it's my Hannspree. I assumed this was a driver problem, so I uninstalled the drivers, but that didn't work. I've also tried changing the resolution to several different settings (1440x900 is the default) and turning off Windows 7 Aero and all the other graphics effects.
Now, I would assume the monitor itself isn't broken, as it works fine if I boot with another monitor first and then switch. The problem seems to be that as soon as my PC identifies my monitor as the Hannspree, the information (I have no idea what exactly) isn't sent properly. I really don't know much about how monitors work, so I'm desperate for help with this - it seems like it should be easily solvable.
Specs: Windows 7 64-bit, Intel Core 2 Quad Q6600 2.4 GHz, Nvidia GeForce 8800 GT, Hannspree XM 'New York' M19W4 monitor. Ask if you need anything else.
Oh, I also tried running Windows Memory Diagnostic, and I set the multiple displays option to 'duplicate on both screens', thinking it might be detecting the monitor as a secondary screen for some reason; that didn't help either.
That sounds like a really hard problem to troubleshoot! My last graphics issue turned out to be a bent DVI pin, but I don't think that's your issue. If you've been using the latest Nvidia drivers, I would uninstall your graphics drivers, and then let Windows 7 auto-install new drivers. If you've been using the auto-installed drivers, upgrade to the latest Nvidia drivers. If you've already tried both drivers, let me know.
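By the way, if you want to see exactly what name Windows is reading from the monitor's EDID when it "identifies" it, something like the following should print it. This is just a rough, untested Python sketch using the third-party wmi package (the WmiMonitorID class lives under the root\wmi namespace):

    # Rough sketch: print the monitor name Windows has read from each display's EDID.
    # Needs the third-party "wmi" package (pip install wmi).
    import wmi

    conn = wmi.WMI(namespace="root\\wmi")
    for mon in conn.WmiMonitorID():
        # The name fields are arrays of character codes, zero-padded at the end.
        name = "".join(chr(c) for c in (mon.UserFriendlyName or []) if c)
        maker = "".join(chr(c) for c in (mon.ManufacturerName or []) if c)
        print(maker, name)

If that shows the Hannspree name while the screen is working and something generic (or nothing) once it blacks out, that would back up the idea that the identification step is what's going wrong.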
Thanks for the tip; unfortunately I'd already tried that and it didn't seem to work. Something that did work was booting Windows in low-resolution mode... so I tried bumping the resolution up in small steps, and it kept working up to something I thought was acceptable.
I just don't dare go any higher, and it's such a pain to change that I think I might just leave it at this lower resolution. Does anyone have any ideas why my screen won't display its native resolution any more?
I was afraid of that... You know, not being able to push the monitor past a certain resolution was the same problem I had with my bent DVI pin. Have you tried using the VGA connector instead of DVI? At your monitor's resolution, it won't affect the quality much if at all.
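Also, while you're stuck like this, it might be worth seeing which modes the driver currently claims your monitor supports. Here's a rough, untested Python sketch using the third-party pywin32 package (win32api.EnumDisplaySettings walks the mode list for the current display; I'm assuming it raises an error once it runs out of modes):

    # Rough sketch: list the display modes the driver reports for the primary display.
    # Needs the third-party pywin32 package (pip install pywin32).
    import win32api

    modes = set()
    i = 0
    while True:
        try:
            dm = win32api.EnumDisplaySettings(None, i)  # None = current display device
        except win32api.error:
            break  # assumed: raises once the mode list is exhausted
        modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
        i += 1

    for width, height, refresh in sorted(modes):
        print("%dx%d @ %dHz" % (width, height, refresh))

If 1440x900 doesn't show up in that list on the problem connection, that would point at the connection rather than the monitor or the drivers.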
Wow, apparently I'm a bit of a noob. It turns out I had both the VGA connector and the DVI connector attached at the same time, thinking I needed both of them. Come to think of it, the alternate monitor that worked fine was VGA (I looked it up).
I guess the DVI cable is broken then; I'm impressed that you managed to work it out given that you didn't know what the problem really was. I'm guessing it defaults to using DVI if both are connected? Also, though it doesn't seem like it, is it worth replacing the DVI cable? I play quite a lot of games, but if it's not going to make a noticeable difference I won't bother.
So VGA worked? Yay! It's possible that the problem was caused by both VGA and DVI being hooked up at the same time. Most monitors will default to DVI, but of course they vary. Try one alone and then the other and see which works.
If they both work fine, you might as well go with DVI.
If only VGA works, it might be the DVI connector on either the computer or the monitor or the cable itself. In this case, it would be easier to just stick with VGA, but if you really notice lower quality (extremely unlikely), you could troubleshoot further to figure out which component is broken.
As far as gaming speed goes, there is no difference between DVI and VGA (or HDMI or DisplayPort, for that matter).
Yeah, I already tested which cable was broken. It seems like it was defaulting to the broken DVI cable at native res, because with just the DVI connected it wouldn't work at all at any resolution. I can't notice a difference, and anyway I think the VGA cable is attached to the PC through a DVI adapter? There's an extra attachment between the cable and the graphics card, which only has DVI-sized ports anyway.
So this question is pretty much resolved, I think. Thanks very much for your help; there's no way I would have figured that out on my own, and I was getting pretty fed up with a blurry screen.