DVI - VGA gives a pink hue to my monitor

Status
Not open for further replies.

Jasper Hopkins

Honorable
Jun 20, 2013
24
0
10,510
So recently I decided to go for a dual-monitor setup for my run-of-the-mill PC. The ports I have on the graphics card are VGA and DVI. I was already using the VGA port, and my second monitor doesn't have a DVI port, so I got a DVI-to-VGA cable for it. I plugged the DVI-to-VGA cable into my first monitor as a test and the picture came up with a pink hue, which is annoying as hell. Looking for the cause, I plugged the cable into my second monitor and it still had the pink hue. So the problem might be with my cable or my graphics card. Really annoyed, and it would be great if you guys could help me out :)

-Edit-
My graphics card is the Nvidia GeForce 9500 GT DDR2
 

Jasper Hopkins

Honorable
Jun 20, 2013
24
0
10,510

It's not a problem with the graphics card, because when I use the DVI-to-VGA cable with just one monitor it still has the pink hue. And the graphics card does support dual monitors. Stumped.

 

Jasper Hopkins

Honorable
Jun 20, 2013
24
0
10,510


I'll look for a refund and buy another cable to see if the problem persists; until then I'm keeping this thread open though :)
 

Jasper Hopkins

Honorable
Jun 20, 2013
24
0
10,510


I wasn't planning on running games across 2 monitors; I was planning on running the games on one monitor and another window on the other. Thanks for the reply though :)
 

Jasper Hopkins

Honorable
Jun 20, 2013
24
0
10,510


No, this cable has never worked, and when I tried to get it to work it was on my desk with everything unplugged :l
 

prankstare

Distinguished
Mar 29, 2010
50
0
18,630
I'm having exactly the same problem, and it makes reading text absolutely ridiculous.

I just bought a factory-overvolted Nvidia-based graphics card (EVGA GTX 650 Ti Boost), and when I connect it to my secondary monitor via a DVI-to-VGA adapter (or to pretty much any monitor that doesn't have a DVI port), I notice a weird pinkish-reddish tinge across the entire screen. It causes a lot of eyestrain, especially when reading (font rendering), and there's absolutely nothing I can do to fix it: I've tried rolling back drivers, custom resolutions/refresh rates, and two different DVI-to-VGA adapters, and nothing works. I've also messed around with the hue settings and with what Nvidia likes to call 'Digital Vibrance', which kind of helped, but I then tried to disable the latter since it messes with color accuracy, and there's no way to do that (they say 50% is neutral, but that's not true).

The funny thing is that this monitor has always worked great with ATI/AMD graphics cards, so I figure this is probably an Nvidia driver problem that the company deliberately refuses to ship a simple fix for (e.g. the monitor reporting incorrect EDID values, full-RGB-range HDMI, etc.), or it could be an issue affecting factory-overvolted cards like the EVGA SC and MSI Lightning cards. I'd also like to stress that in Linux, without installing any Nvidia drivers of course, the image quality looks perfectly normal (not as good as an ATI/AMD graphics card, but acceptable).
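For anyone who wants to check whether the monitor really is reporting bad EDID values, here's a rough sketch that reads the raw EDID blob and prints a few basic fields. It assumes Linux, where the kernel exposes EDID under /sys/class/drm/; the exact connector paths vary by system, and this only sanity-checks the header, manufacturer ID, and the analog/digital input bit.

```python
import glob

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def decode_manufacturer(edid: bytes) -> str:
    # Bytes 8-9 pack three 5-bit letters (1 = 'A' ... 26 = 'Z').
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) < 128:
        print(f"{path}: no EDID available (connector probably unused)")
        continue
    header_ok = edid[:8] == EDID_HEADER
    # Byte 20, bit 7: 0 = analog (VGA-style) input, 1 = digital input.
    analog = not (edid[20] & 0x80)
    print(f"{path}: header {'OK' if header_ok else 'CORRUPT'}, "
          f"manufacturer {decode_manufacturer(edid)}, "
          f"declares {'analog (VGA-style)' if analog else 'digital'} input")
```

If the header shows as corrupt, or the connector reports no EDID at all, that points more towards the cable/adapter than towards the driver.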

So as a result, I've settled on a simple and fast solution: never buy Nvidia cards ever again. They've caused me way too much trouble already.






 
The simple answer is that you're using VGA.
VGA is an analogue signal and is subject to degradation, unlike the digital signals that DVI/HDMI/DisplayPort use. A low-quality VGA cable can affect your image quality, whereas with DVI/HDMI you can get the cheapest cable you can find and it will either work 100% or not at all.
 