My friend's monitor started just blinking at her (no picture; the power button light flashes slowly), so I brought over a working LCD I had lying around to try. It worked, so I left it with her since I no longer needed it.
A couple of days later the LCD I left with her went blank on her too. I thought it was just like the last one, but this one actually flashes a picture for half a second before entering standby mode, whereas her first one just blinked and never displayed anything at all. Both, however, would work if she turned them on and off or rebooted; not every time, but with enough tries they would eventually come on. Both finally stopped responding even to that and remain in their current state as described above: one flashes a picture for a second before going to standby, the other's power button just flashes.
So I figured her PC's video card was pooched and needed to be replaced. Her computer is 8 years old, though, so we just went and bought a baseline prepackaged Compaq from Future Shop; the price is right and it fits her needs. I plugged both monitors into this new PC and they are doing the same thing. Now my jaw is slack and I'm a little stunned. I have never heard of a monitor being fried by a video card, let alone two of them, but seeing the exact same problem on a brand-new PC has me totally confused.
To top it off, I grabbed a third monitor that had never been connected to her old PC, connected it to the new one, and things seem to be fine.
I had no idea this was possible. Any thoughts? Did her video card destroy two monitors? Is that even possible? Is there an issue I'm not thinking of yet?
I know I'm a new forum user, but I have lurked this site for years and I know the right people hang out here. Thanks in advance, guys!
No thoughts on this? From what I've been reading, I wonder if it's the power source, though she does have a not-completely-crappy surge protector. I could really use some ideas, guys; I don't want to scramble the third monitor too.
Back in the day with CRTs, programs could adjust the frequencies and create custom resolutions. However, they all gave a warning that you could damage your monitor. I recall writing programs and learning how to do this in code, and the documentation gave me the same warnings.
So I guess if you could mess up a monitor with the wrong frequencies, it's possible a video card could damage a monitor.
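To illustrate what those warnings were about: a CRT's deflection circuits are driven at rates that fall directly out of the mode timings the video card programs, so out-of-range numbers meant out-of-range scan frequencies. Here's a minimal sketch of that arithmetic; the `scan_rates` helper is made up for illustration, and the timing values are the standard VESA 1024x768 @ 60 Hz mode written X11-modeline style:

```python
# How video mode timings determine a monitor's scan rates.
# scan_rates is a hypothetical helper; the numbers fed to it come from
# the standard VESA 1024x768 @ 60 Hz mode:
#   Modeline "1024x768"  65.0   1024 1048 1184 1344   768 771 777 806

def scan_rates(pixel_clock_mhz, htotal, vtotal):
    """Return (horizontal kHz, vertical Hz) for a video mode."""
    hfreq_khz = pixel_clock_mhz * 1000.0 / htotal  # scanlines drawn per second
    vfreq_hz = hfreq_khz * 1000.0 / vtotal         # full frames per second
    return hfreq_khz, vfreq_hz

hkhz, vhz = scan_rates(65.0, 1344, 806)
print(f"{hkhz:.2f} kHz horizontal, {vhz:.2f} Hz vertical")
# -> 48.36 kHz horizontal, 60.00 Hz vertical
```

A CRT rated only up to, say, 48 kHz horizontal could be pushed well past that by a bad pixel clock or totals, which is why the old custom-resolution tools carried those damage warnings; fixed-frequency monitors were the most vulnerable.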
This shouldn't really happen with modern LCDs. I suppose if a video card managed to force a refresh rate the monitor couldn't handle, it could cause damage. It's also possible, though unlikely, that the video card seems to operate properly but is sending damaging current to the monitor. It's much more likely that the wall socket you plug the monitors into has issues and is shorting them out.