Closed

VGA gives better image than HDMI

Last response: in Graphics & Displays
December 28, 2009 7:02:51 PM

Hi there! At home I've got this 21.5" LCD monitor, a Samsung Syncmaster 2770HD, and I wanted to hook it up to my PC. The graphics card I'm using (ATI) has two DVI outputs. I have a DVI-to-HDMI converter, my monitor has an HDMI input, so I bought an HDMI cable. I put the converter on the card and ran the HDMI cable into the monitor. It works, but I'm getting extremely blurry text, as if the resolution were set way too high, which it isn't. I figured this was not how it was supposed to be, so I put a DVI-to-VGA converter on the graphics card instead and ran a VGA cable to the screen, and it looks better; the text is much sharper. Am I doing something wrong here? Why does HDMI produce unclear, blurry text?
Don't worry if you explain in geek terms, I'll understand most of it.


December 28, 2009 7:31:15 PM

You can adjust ClearType in Windows to optimize the text. Also try adjusting the sharpness setting on the monitor when using HDMI.
December 28, 2009 7:58:26 PM

Thanks for the quick answer, I appreciate it. But neither of those really works; there is no change whether I put the sharpness bar on 0 or 100, and ClearType is already activated.
December 28, 2009 8:05:31 PM

There's an "Adjust ClearType text" feature that shows a bunch of text samples; you choose the ones that look best, it gives you more options, and it goes on until you have the final result.
December 28, 2009 9:03:56 PM

Yes, that's the ClearType thing, and I've already tried it several times. It's not just the text; everything else looks bad too.
December 28, 2009 9:28:17 PM

For some reason, I do not think that HDMI as a display connection for computers is working as intended. My real-world experience has been that everything works fine as long as your adapter is running at the native resolution of your display (i.e., 1920x1080 for a 1080p display, 1366x768 or 1280x720 for a 720p display), but outside of that, most of the displays I have connected to via HDMI want to upsample/downsample to one of those resolutions rather than display at the requested resolution. Check the display and make sure that your adapter settings are in sync with your display settings (not on the computer, but on the display itself).

Good luck.
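The rescaling described above is easy to quantify: if the display's scaler maps a non-native mode onto the panel's native grid, the scale factors tell you whether it must interpolate pixels (which softens text). A minimal sketch, with illustrative names only:

```python
# Illustrative sketch: many HDMI displays rescale any non-native mode to
# the panel's native resolution; non-integer scale factors force the
# scaler to interpolate, which blurs fine detail like text.
NATIVE_1080P = (1920, 1080)

def scale_factors(requested, native):
    """Horizontal and vertical factors the display's scaler would apply."""
    rw, rh = requested
    nw, nh = native
    return nw / rw, nh / rh

sx, sy = scale_factors((1680, 1050), NATIVE_1080P)
# Both factors are non-integer (and unequal), so every output pixel is a
# blend of several input pixels.
print(sx, sy)
```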
December 28, 2009 10:42:10 PM

Try a new cable; Newegg sells them with DVI on one end and HDMI on the other. I got a 30 ft one there for $20.00. One of the pins in the adapter or HDMI ends may be damaged. I use mine in clone mode to my 40" HDTV; the other cable, to my monitor, is a 5 ft one in the same configuration. I am using an 8800GT at 1920x1200 (my monitor's spec) and it works great.
December 28, 2009 10:42:30 PM

Hmm, now it gets even weirder. I restarted the computer with the DVI-to-HDMI adapter in place, and now the display shows black bars at the edges, as if the resolution were too big for the screen, but that's the native resolution. I tried some other resolutions and found that 1680x1050 works perfectly, but that's a 16:10 resolution and this is a 16:9 monitor; it even says the aspect ratio is 16:9 in the display options for the monitor. I don't get this at all =/
December 28, 2009 10:57:51 PM

16:9 and 16:10 are not resolution settings; they are aspect ratios. When you watch a widescreen or letterboxed movie on a 4:3 TV you get horizontal bars on the top and bottom; conversely, if you watch 4:3 content on a widescreen you get vertical bars running down the sides.
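The aspect ratio of any mode can be recovered by dividing width and height by their greatest common divisor, which shows why 1680x1050 is 16:10 even on a 16:9 panel. A quick sketch:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    g = gcd(width, height)
    return width // g, height // g

print(aspect_ratio(1920, 1080))  # (16, 9)
print(aspect_ratio(1680, 1050))  # (8, 5), i.e. 16:10
```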
December 28, 2009 11:02:49 PM

Go to the display settings and set the resolution to the highest one allowed. That is the native resolution; everything should look much better then.
December 28, 2009 11:05:57 PM

Find the maximum resolution of your monitor in its manual, then adjust to those settings in whatever program you use to set up your graphics card parameters. I use the Nvidia Control Panel.
December 28, 2009 11:34:21 PM

Yep, the max resolution is 1920x1080, but it just doesn't want to display properly at that resolution if I choose HDMI as the input signal.
December 28, 2009 11:50:23 PM

Knubbis91 said:
Yep, the max resolution is 1920x1080, but it just doesn't want to display properly at that resolution if I choose HDMI as the input signal.


By "doesn't want to display properly," do you mean the blurring problem?
December 28, 2009 11:51:50 PM

Try using another DVI or HDMI cable. My friend had a defective DVI cable that caused blur on his monitor and some color distortion. Once he changed the cable, the problem was resolved.
December 28, 2009 11:53:10 PM

Yep, like skolpo said, try another HDMI cable. Try another DVI-to-HDMI converter as well.
December 29, 2009 3:37:23 AM

Yeah, cables can cause a lot of weird stuff. Always try to fix the easiest and cheapest things first: batteries, cables, etc.

If you try another cable and it still behaves like that, try the display on another PC. If it works on PC #2, it's an issue with your PC. If it is bad on PC #2 as well, you know it's not your cables or your PC (unless it's a settings error on both), so it's something in the display. If the display settings turn out to be correct, the actual HDMI connection inside the display could be damaged.

Calling manufacturers for support can be a pain. For instance, the AC/DC adapter on my old internet equipment was out of spec and kept frying my LAN cards. After I figured it out, I basically had to force the company to take my word for it that this was the problem. Keep in mind, some of the tech support people you talk to will simply be reading from a manual. Try everything you can before you call tech support: new cables, another PC, etc. If you do all that before you call, you will be able to get a specialist, and an RMA if need be.

December 29, 2009 3:56:28 AM

I googled your Syncmaster 2770HD and it shows it to be a 27" LCD TV?
December 29, 2009 12:30:43 PM

Thanks for the answers. I'll try another converter, or would it be better to use no converter at all and just go from DVI on the graphics card to DVI on the monitor? I have read there's no difference between them except that HDMI carries sound, which I can't get anyway since I still have to convert, and I guess the converter can't carry audio.

Answer to HundredIslandsboy: No, it's a 21.5" LCD. I googled it too and got 27"; I don't know where that comes from.
December 29, 2009 1:54:37 PM

Try just using a DVI cable. Adapters might not be reliable, especially random ones you find on eBay or somewhere.
December 29, 2009 4:53:14 PM

Say I get a DVI cable and run with it; will I really get better image quality than with VGA? Because right now the VGA gives a crystal-clear picture, and I can't see how it could improve any more. Maybe performance will?
December 29, 2009 10:27:39 PM

VGA is an analog connection and DVI is digital. DVI is "supposed" to look better and consistently give a clearer image, though some people can barely tell the difference, or not at all. If you are satisfied with the VGA connection, you should stick with it. But if you can fix the DVI connection, you can compare the two and choose from there.
February 5, 2010 12:54:50 PM

I've been having this same problem, but with a 23" Syncmaster. I had been using a VGA connection with a brilliantly sharp picture for a week, and I just bought an HDMI cable so that I could plug into my TV and use it as my main monitor, with the VGA port driving a second, smaller monitor.

For some reason, the picture quality on the TV over HDMI is much, much worse than over VGA, with the same graphics card, TV, and resolution settings...

I have yet to try the DVI port on my graphics card.

~xatm092
February 5, 2010 7:32:27 PM

Maybe you have to reconfigure the settings after changing to HDMI. Also, try using DVI. Although there are no extra benefits of HDMI over DVI other than sound (and networking in the newer HDMI spec), my Syncmaster T260HD is a lot more configurable over DVI than over HDMI. It's quite odd. I think the monitor limits the settings when using the HDMI port; I'm not sure why.
February 7, 2010 3:42:20 AM

I actually just ordered a DVI cable from Amazon yesterday, so I will post back after the weekend when it arrives and I have tried it out.
February 7, 2010 9:55:43 AM

This topic has been closed by Mousemonkey