HDMI Worse Than VGA

I have this TV (http://www.vizio.com/lcd-hdtvs/e321vl.html) and had been using VGA. I decided to get an HDMI cable for it, but now that I have it hooked up, HDMI actually looks worse than VGA did. Can someone tell me why, and explain the fix in detail?

P.S. The graphics card is a GeForce GTS 450.
  1. How does it look worse? Generally my problem with VGA is that it's fuzzy.
    But really, DVI and HDMI are the best connections out right now...
  2. It's not HDMI's fault if anything looks worse; you likely just have an incorrect setting.
  3. Check your overscan and underscan options in your graphics card's control panel.
  4. Your problem is likely the media you are watching. Standard-definition (SD) content performs very badly on a high-definition (HD) setup; SD often looks better over VGA than it does over HD. Firstly, SD technology was designed over many years to run at the VGA standard. Secondly, when an HD setup delivers SD content, it attempts to upscale the SD material to the higher resolutions HD supports. The result is artifacts and a blocky picture.

    Just about all content downloaded for a PC is SD format, as it is smaller in file size. DVDs are SD, though some upscale better than others depending on when the DVD was released. Blu-ray pretty much always plays HD well, especially if the original content was shot in HD.

    As for the Windows desktop, a TV does not play well in this area and will not perform as well as a proper PC monitor. It may well cope better over VGA, as that standard is less demanding.

    In short, HDMI is not always better. The technology must be supported all the way from the source material through to the output device; the chain is only as strong as its weakest link.
  5. My experience is that some TVs do not interpret the HDMI signal correctly, resulting in weird artifacting such as faintly coloured lines in the image. Text can be hard to read. No amount of adjusting the underscan, overscan, and other available options removed this problem completely on my TV. RGB (VGA), however, gives a clean, perfect image, so I am still using that today.
  6. I actually ran into similar problems (the desktop looked like it was running somewhere between 16-bit and 32-bit colour, minor artifacts, etc.), and I think it has to do with HDMI's audio interface. I could be wrong, but when I used a DVI-to-HDMI adapter instead of the HDMI port on my HD 6870, it warned me that audio would be cut out, yet the colour became much more defined.

    Which is awesome, because I was getting ready to wig out: whenever I attempted to use anything higher than 1152 x 648 over my DVI-to-VGA adapter, my Vizio reported an invalid format.

    Hope this helps someone out; I was seconds away from reinstating my old 550 Ti back to active duty. :bounce:
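The upscaling point in answer 4 can be sketched numerically. This is a minimal illustration (the resolutions are generic examples, not specific to this TV): nearest-neighbour scaling of a 640-pixel-wide SD line to a 1920-pixel HD line repeats every source pixel, which is exactly the "blocky" effect described.

```python
# Sketch: why SD content looks blocky when stretched to an HD panel.
# Nearest-neighbour upscaling repeats each SD pixel across several
# HD pixels. Resolutions are illustrative, not tied to any one TV.
sd_w, hd_w = 640, 1920   # SD line width vs. HD panel width

# For each HD pixel, pick the nearest SD source pixel.
source_index = [x * sd_w // hd_w for x in range(hd_w)]

# How many HD pixels does the first SD pixel cover?
repeats = source_index.count(0)
print(repeats)  # 3 -> every SD pixel becomes a 3-pixel-wide block
```

Smarter scalers interpolate instead of repeating pixels, but they can only smooth the blocks into blur; no scaler can recover detail the SD source never had.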
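One common cause of the washed-out or poorly defined colour described in answers 5 and 6 (not confirmed in this thread, but worth checking) is an RGB quantization-range mismatch: many TVs expect "limited" video levels on HDMI inputs (black = 16, white = 235), while a PC normally renders "full" levels (0-255). The arithmetic below follows the standard studio-range convention and shows what a mismatch does to black and white:

```python
# Sketch: limited-range (16-235) vs. full-range (0-255) RGB over HDMI.
# If the GPU compresses to limited range but the TV displays the values
# as-is, blacks turn grey and whites dull -- a washed-out picture.
def full_to_limited(v):
    """Compress a full-range 0-255 value into limited-range 16-235."""
    return 16 + round(v * 219 / 255)

def limited_to_full(v):
    """Expand a limited-range 16-235 value back to full-range 0-255."""
    return round((v - 16) * 255 / 219)

# Mismatched ends: compressed values shown without re-expansion.
print(full_to_limited(0))    # 16  -> "black" displays as dark grey
print(full_to_limited(255))  # 235 -> "white" displays as light grey

# Matched ends: expansion at the TV restores the original levels.
print(limited_to_full(full_to_limited(0)))    # 0
print(limited_to_full(full_to_limited(255)))  # 255
```

The practical fix is usually a driver setting (e.g. forcing full-range RGB output) or a TV input mode that treats the HDMI port as a PC source, so both ends agree on the range.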