Can't get my monitor to work with a DVI-D to VGA adapter

2moons33

Honorable
Nov 10, 2012
So here's the situation: I think the HDMI output from my 7950 graphics card looks like crap on my monitor (fuzzy text on screen, etc.), so I went and bought a male DVI to female VGA adapter and am running a male-to-male VGA cable from it to the monitor, since the monitor only has HDMI and VGA inputs. After all that, the monitor does not detect the computer when I hook everything up...

What's the problem?
 

rage_311

Distinguished
May 23, 2003
You mention DVI-D in your title, which is the digital-only version of DVI. Most video cards have DVI-I, which includes the analog pins, but it's possible that yours is DVI-D only... in which case your only option to get from DVI-D to VGA is a rather expensive active converter. Getting a DVI to HDMI adapter/cable would be cheaper, though it might not help if the HDMI input on the monitor really is your problem.

Try going HDMI to HDMI again and adjusting the ClearType settings (if you're using Windows). That should make the text look less blurry. There are also options in Linux to change the sub-pixel order, which can help as well... there's probably something similar in Windows, but I don't know it off the top of my head.
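If you'd rather check what Windows is actually configured to use than guess, the font-smoothing settings are stored in the registry. Here's a minimal Python 3 sketch (Windows only) that reads them; the value names under HKCU\Control Panel\Desktop are the standard ones, but treat the interpretation comments as assumptions and verify on your own machine. To actually change the subpixel tuning, the ClearType Text Tuner (cttune.exe) is the usual route.

```python
# Sketch: read the font-smoothing values Windows stores in the registry.
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    smoothing, _ = winreg.QueryValueEx(key, "FontSmoothing")          # "2" = smoothing on
    smoothing_type, _ = winreg.QueryValueEx(key, "FontSmoothingType") # 2 = ClearType, 1 = standard
    try:
        # assumption: 1 = RGB subpixel order, 0 = BGR
        orientation, _ = winreg.QueryValueEx(key, "FontSmoothingOrientation")
    except FileNotFoundError:
        orientation = "not set (defaults to RGB)"

print("FontSmoothing:", smoothing)
print("FontSmoothingType:", smoothing_type)
print("FontSmoothingOrientation:", orientation)
```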

Hope any of this helps...

EDIT: Also, check your resolution to ensure that you're using the native resolution for your monitor. If your monitor is 22"+, then it's probably 1920x1080, but you should be able to find out by going into your display settings in your OS.
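For reference, here's a quick way to read what the desktop is currently set to, so you can compare it against the panel's spec sheet. A minimal Python sketch for Windows (ctypes); SM_CXSCREEN/SM_CYSCREEN are standard constants, and the NATIVE value is just an assumed example you'd replace with your monitor's actual native resolution.

```python
# Sketch: print the current desktop resolution and compare it to an
# assumed native resolution (change NATIVE to match your monitor).
import ctypes

SM_CXSCREEN, SM_CYSCREEN = 0, 1
user32 = ctypes.windll.user32

width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)

NATIVE = (1920, 1080)  # assumption: a typical 22"+ 1080p panel
print(f"Desktop is {width}x{height}")
if (width, height) != NATIVE:
    print(f"Mismatch with assumed native {NATIVE[0]}x{NATIVE[1]} -> scaling/blur likely")
```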
 
HDMI and DVI are digital video. Your computer tells the monitor exactly what color and brightness to make each individual pixel. You should not be getting fuzzy text. Or rather, there is no mechanism by which these cables/adapters can be responsible for text being fuzzy. There is no way for the instructions for one pixel to "bleed" into adjacent pixels with digital video.

Converting to VGA with an adapter will not improve your situation; all it does is insert a digital -> analog -> digital step in the middle. The only way it will help is if the problem is caused by one of the reasons below and the VGA input suffers from it less or somehow manages to bypass it.

- Check to make sure the ClearType settings are to your liking. There's one setting within ClearType that depends on the subpixel arrangement (RGB vs. BGR) and will make text appear fuzzy if you select the wrong one, though I'm not sure how you access it in Windows 7.

- Check that your computer's desktop resolution matches the monitor's native resolution. If they don't match, the monitor has to stretch the picture to cover the entire screen. Most monitors do a decent job of this, but some do a very poor job, resulting in blurry or blocky images. A lot of older people set their desktop resolution low to make the icons and text bigger, which causes exactly this problem. If you're one of them, set the resolution to match the monitor, then change the DPI setting instead (right-click desktop -> Personalize -> Display). You may have to log out and back in before the change kicks in (see the DPI sketch after this list).

- Check your monitor's setting to make sure overscan is turned off. HDMI is also used for TVs, and many TV signals are designed with garbage at the edges of the screen (in particular the closed captioning signal is encoded there as black/white dots). To prevent you from seeing this, most TVs will overscan (stretch the picture slightly) to hide the garbage. This results in picture degradation like above, but you don't really notice it with images and video. For computer output though, the fuzzy or blocky images and text are pretty obvious. You can run into this problem if your TV/monitor mistakes your computer output for a TV signal, so you have to force it to turn overscan off. This setting may also be called 1:1 or direct.
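Regarding the DPI change mentioned in the resolution bullet above: it only takes effect after you log back in, so it can be useful to confirm what the system is actually reporting. A minimal Python sketch for Windows (ctypes) using the standard GDI LOGPIXELSX/LOGPIXELSY constants; 96 DPI corresponds to the default 100% setting.

```python
# Sketch: read the DPI Windows is currently using for the primary display.
# 96 DPI = 100% scaling, 120 DPI = 125%, 144 DPI = 150%.
# Note: on newer Windows versions a DPI-unaware process may just see 96.
import ctypes

LOGPIXELSX, LOGPIXELSY = 88, 90
user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

hdc = user32.GetDC(0)  # device context for the whole screen
dpi_x = gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
dpi_y = gdi32.GetDeviceCaps(hdc, LOGPIXELSY)
user32.ReleaseDC(0, hdc)

print(f"Reported DPI: {dpi_x}x{dpi_y} ({dpi_x / 96:.0%} scaling)")
```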

Pretty much all modern video cards have HDMI out in addition to DVI out, and I have to figure a 7950 has an HDMI output on it. Is there some reason you aren't just using a straight HDMI-to-HDMI cable? It's no big deal, except that a DVI-to-HDMI adapter may not preserve HDCP (high-definition content protection), meaning that if you ever try to watch a Blu-ray movie with that setup, it may get downscaled to DVD resolution.