LCD Monitor, vertical flickering and horizontal lines?

Funkmaster Rick

Honorable
Jul 9, 2013
Last night, I left my computer on while I slept; the sleep function has not worked for some time (it tends to kick itself awake randomly). This has not presented a problem so far, but this morning I awoke to find that my monitor had not only failed to automatically blank into power-saving mode, but also displayed some issues that made it unusable.

Specifically, there was massive vertical flickering - the top of the screen stayed more or less where it's supposed to be, but the bottom of the screen kept flickering between different points, varying from where it's meant to be to an inch or more below the bottom of the monitor, dragging the middle of the screen with it. Horizontal lines would appear as it did so. There was also a little bit of discolouration; the colours seemed dimmer.

So I jumped online and started researching. Before I found an appropriate answer, the issue ceased. The monitor is currently functioning fine, and has been for the last hour. Nonetheless, this issue concerns me. If this monitor breaks again, I cannot currently afford to replace it.

I've run a program from the manufacturer to install drivers for all their monitors. I've also updated my GPU drivers. I ran the Windows 7 Troubleshooter before either of these steps, and it found no issues. I also switched the monitor's cable to the other DVI port on my card.

Specifications:
Windows 7 64-bit
4 GB RAM
Nvidia GeForce GTX 560
i5-3570 CPU

Just as I finished posting this, the issue returned. The vertical flicker and horizontal lines are less intense, but the colour issues now seem more pronounced. This episode lasted less than thirty seconds.

So what's causing this? Is it my monitor, its ribbon cable, the GPU, or some other part of my system that needs replacing? Or maybe it just needs a good cleaning?


Additional issues, maybe related, maybe not:
Windows Update has been non-functional for maybe half a year, unknown reason.
Computer likes to kick itself out of sleep mode without input.
I get an error 1935 message whenever I try to install the Visual C++ 2005 Redistributable, which ships with several games, and I'm unable to play those games because of it.

That's all I can think of. Here's hoping!
 
Solution
It doesn't matter if you're using an adapter. The VGA cable sends an analog signal that must then be converted at a loss of quality. Check this article: http://www.diffen.com/difference/DVI_vs_VGA

I was actually doing exactly what you're doing--using a DVI converter--and I was astonished at the difference that using the correct cable makes.

drewhoo

Honorable
Apr 5, 2012


It sounds like you are using an analog (VGA) connection that is getting jostled or degraded somehow, and that's causing those weird malfunctions. If you're using a VGA cord (often with a blue connector), then this is the problem. Get an HDMI or DVI cable (both are digital) and your problem should go away.
 

Funkmaster Rick

Honorable
Jul 9, 2013
Update: after a thorough cleaning of the computer, I plugged everything back in and the monitor started having what seemed like power issues. When I turned it off and back on, I'd get about half a second of perfect functionality, followed by the screen dimming to the point of uselessness.

A friend had a spare CRT monitor, so I borrowed it and am currently testing. Seems to work at first glance.

drewhoo: It was an LCD with a VGA port. I'd added an adapter to fit it to the DVI output on my card. The cord was firmly seated until I tried a few adjustments to see if that was the issue; no result. The newer power issues don't seem to jibe with that, unless there are two separate problems here.
 

drewhoo

Honorable
Apr 5, 2012
No, that's not what I'm saying. I'm saying you need to ditch the VGA cord and use a DVI-D cord. They're totally different things, regardless of what adapter you're using or what the I/O is. The best DVI cord you can get costs less than $10 delivered to your door.

You could also use HDMI, but HDMI limits your resolution to something like 1920x1200, whereas dual-link DVI-D supports up to 2560x1600, should you ever decide to upgrade your monitor. Plus, the DVI-D connection is more robust/durable than HDMI.
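Rough back-of-the-envelope math (my own ballpark, not exact video timings - treat it as an illustration) shows why 1920x1200 fits on a single link while 2560x1600 needs dual-link DVI:

# Approximate pixel clock needed for a given mode.
# Pixel clock ~= active pixels * refresh rate * blanking overhead (~9% is a
# ballpark for reduced-blanking timings). Single-link DVI and early HDMI top
# out around 165 MHz; dual-link DVI roughly doubles that with a second set of
# data pairs.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.09):
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = approx_pixel_clock_mhz(w, h, 60)
    verdict = "single-link OK" if clk <= 165 else "needs dual-link"
    print(f"{w}x{h}@60Hz -> ~{clk:.0f} MHz pixel clock ({verdict})")

1920x1200 lands around 150 MHz, under the single-link ceiling; 2560x1600 lands around 268 MHz, which is why it needs dual-link.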

Edit: I think perhaps I confused you when I said that the signal must be converted. Even without the converter, when a VGA output is connected to a VGA cable and then to a VGA input, signal conversion is still going on. So VGA is the wrong interface to use, because you're using an analog cable to communicate between two digital devices.