Need help setting up my second display - the color keeps flickering

Hi everyone, this is my first post here so I hope I make everything as clear as possible and make no mistakes.

I have a rather strange issue, imo. I just hooked up my second display and the image seems to flicker. The best way I can describe it is that it looks like the monitor is being filmed by an old camera (not as intense, but something like that), and it's especially visible on darker colors.

The strange thing is that as soon as I unplug the main monitor, it becomes normal again. Also, if I lower the resolution to 1280x1024 it stops flickering (the native resolution of this monitor is 1680x1050). I also noticed that this is the point where the frequency automatically changes from 60 Hz to 75 Hz. If I manually turn it back to 60 Hz, it flickers again.

It works fine even at 60 Hz if the main monitor is not connected, but as soon as I plug the main monitor in, it starts again.

I tried forcing it to 1680x1050 at 75 Hz in the NVIDIA Control Panel. When I run the test it works fine: the resolution is sharp and there's no flicker. But that's only during the test. When I hit "OK" and apply it, it messes up again, and now it shows the resolution as 1680x1050 but the image isn't sharp anymore; it's almost distorted.

I can't seem to find anything by googling this; it's just a strange issue, I think.

My main monitor is a Samsung SyncMaster T260 HD - this one causes no problems.
My second monitor is an Acer P223w.
My graphics card is a GeForce GTX 670.

Hope you guys can help me, thank you in advance!
  1. Just a couple of questions: when you said the image is "distorted", is it blurry? Or is it warped in some way? Does it still flicker at 1680x1050 and 75 Hz?
  2. When I said it's distorted: it almost looks like a lower resolution, but all the icons stay the same size, as if the resolution didn't change; the image just isn't as clear. So I guess you could say it's blurry, although that wouldn't be the word I would use.

    You have to create a custom resolution in the NVIDIA Control Panel in order to force the 75 Hz, and before you create it you have to test it first. When I do the test it works fine: the resolution is sharp and there's no flicker. However, when you get out of test mode and hit apply on the new custom resolution, 1680x1050 @ 75 Hz, that's when it does what I described above.
  3. Hello... are you using all DVI inputs/outputs and cables to your monitors? The specs say 60 Hz.
  4. I am using a classic VGA connector (the monitor only has the analog input, no DVI) and I hook it up to a DVI output on the video card using an adapter.
  5. Hello... there's your problem... since your only option is an analog input, try different frequencies from 50-75 Hz to find a sweet spot... try V-sync... try different (older) video card drivers... try degaussing your analog monitor... try a different VGA cable... Until you get all-DVI monitors, you will always notice a visual difference.
  6. The thing is, if I have only the one monitor connected there is no issue; at the same resolution and the same hertz it works correctly when it's by itself. When I plug in the main monitor, the Acer starts doing this really visible flicker like I explained. At 75 Hz the image becomes unclear.
  7. Hello... that's why I mentioned different driver versions; from my experience it can be a time-consuming, trial-and-error compatibility issue with different older/newer drivers. Did you try each DVI output port with the same results? I really don't miss my analog monitors because of these same issues... With DVI now, it's SET IT and FORGET IT.
  8. I can only use one DVI port; it's the way the adapter is built, it only fits in one of the ports.
    I don't know what drivers to use; this model of Acer has no support for Windows 7, although Win 7 does recognize it. I updated the video card drivers when I noticed the problem, but it still does it. I'm guessing it's a video card problem more than a monitor problem, because the monitor works fine by itself.
  9. Hello... there are NO drivers for a MONITOR... a monitor is a passive device. The drivers that will change performance are the video card drivers ONLY. And you just proved my point: when you updated your video card driver, THE PROBLEM STARTED... Revert back to your previous video card driver version for the best analog monitor performance... and until you replace your analog monitor, keep a backup of that video card driver version and don't update any more.

    Engineering teams are updating software because of new hardware/software compatibility; concerns for analog monitors are falling by the wayside in their time, efforts, and testing.
  10. I solved it.

    Went to NVIDIA Control Panel > Change resolution. Selected my Acer monitor and created a new custom resolution of 1680x1050 @ 75 Hz, but under Timing, instead of Automatic I set it to CVT. It works fine now, clarity and everything.
  11. The problem existed both before and after updating, so it's not a driver issue; it works now with the resolution setting.
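
For anyone curious why the Timing setting in answer 10 matters: "CVT" makes the driver compute the blanking intervals from the VESA Coordinated Video Timings formula instead of using whatever "Automatic" picks, and the refresh rate you actually get is just the pixel clock divided by the total (active + blanking) pixels per frame. A minimal sketch of that arithmetic, using the standard CVT modeline for 1680x1050 @ 60 Hz as printed by the common `cvt` utility (these numbers are from the CVT standard, not from this thread):

```python
# Refresh rate from a modeline: pixel_clock / (h_total * v_total).
# CVT modeline for 1680x1050 @ 60 Hz: 146.25 MHz pixel clock,
# 2240 total horizontal pixels, 1089 total vertical lines.
pixel_clock_hz = 146_250_000
h_total = 2240   # 1680 active + horizontal blanking
v_total = 1089   # 1050 active + vertical blanking

refresh_hz = pixel_clock_hz / (h_total * v_total)
print(f"{refresh_hz:.2f} Hz")  # ~59.95 Hz, which the OSD rounds to 60
```

Different timing standards produce different blanking (and therefore a different pixel clock for the same visible resolution), which is plausibly why the Acer synced cleanly on the CVT numbers but not on the automatic ones.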