
RGB vs DVI/HDMI

Last response: in Graphics & Displays
October 17, 2010 4:04:09 PM

Recently I bought a 32" LG 1080p LCD TV, and I have a desktop computer with an ATI 46xx card that has two DVI outputs and an S-Video output. The first thing I did was hook the TV's RGB input to the computer using an RGB-to-DVI adapter. But naturally I thought I might get a better picture if I went from DVI to the TV's HDMI input, so I bought a DVI-to-HDMI cable and gave it a try. To my astonishment, at the same 1080 resolution I got a TERRIBLE picture (the fonts were almost unreadable and the general picture quality just totally sucked), and I had to revert back to the RGB-to-DVI setup.

Can anyone tell me what happened there?

Like I said, I was at the same resolution with both hookups.


October 17, 2010 10:04:56 PM

RGB only supports 480i signals while DVI can go all the way up to 1080p. Did you set the TV resolution correctly?
October 18, 2010 4:34:11 AM

It's probably not a 1080p television. Most cheaper HDTVs aren't really HD at all: they have a max native resolution of around 1366x768 and downscale (and interlace) higher-res images, so a 1080p image ends up looking horrible.

Use a 1366x768 resolution and you should get a much better image. (You may need to force an unsupported resolution, since the EDID won't report it properly, but you can do that in the display driver.)
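On a Linux/X11 desktop, that "force an unsupported resolution" step is usually done with `xrandr`. The sketch below just builds the commands you would run, rather than running them: the modeline numbers are what `cvt 1368 768 60` prints (CVT rounds the width up to a multiple of 8), and the output name "HDMI-1" is an assumption — check what `xrandr` reports for your TV. On Windows with Catalyst, the equivalent is the driver's custom-resolution panel.

```python
# Sketch: build the xrandr commands that register and select a forced
# 1366x768-class mode on X11. The modeline is the `cvt 1368 768 60`
# output (an assumption, not from this thread); substitute your own
# output name for "HDMI-1".

MODELINE = '"1368x768_60.00" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync'

def force_mode_commands(output="HDMI-1"):
    """Return the three shell commands to create, attach, and apply the mode."""
    return [
        f"xrandr --newmode {MODELINE}",
        f'xrandr --addmode {output} "1368x768_60.00"',
        f'xrandr --output {output} --mode "1368x768_60.00"',
    ]

for cmd in force_mode_commands():
    print(cmd)
```

Run the printed commands in order; the mode only persists for the current X session unless you add it to your display configuration.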
October 18, 2010 7:00:08 AM

Exactly the same problem my old man had when using a VGA/DVI GPU adapter to the TV's VGA port, which was perfect at 13xx*7xx res. All games were smooth and worked perfectly.

Connected via an HDMI cable, the picture was blurred and horrible, and games were just unplayable. He had to set up the picture in the TV's own menu system to support the 1920x1080 (1080p) res. It was a nightmare to do, and he had to ring up LG themselves to get it sorted in the end.

But it works and looks brilliant now. Mind, this is with an HDMI cable itself, not DVI-to-HDMI (though that shouldn't make too much difference).
October 18, 2010 8:56:02 AM

The main problem is in the scaling.

When using a digital connection, the TV is left to do the scaling, something it's not very good at. When using analogue, your video card handles the scaling and does a better job, hence why the higher-res image looks better on an analogue connection. But the best image by far comes from using a digital connection at the native resolution.

Though when I hook up to my HDTV, I use VGA and set it to 1366x768. Easier, and it looks plenty good enough for light gaming and watching movies.
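The scaling point can be illustrated with a toy nearest-neighbour resample (a sketch, not how any real TV scaler works): one-pixel-wide detail, like the strokes of small text, cannot survive a non-integer downscale such as 1920 → 1366, because some source columns must be dropped and others duplicated.

```python
# Toy nearest-neighbour resampler: each output pixel takes the closest
# source pixel. Real scalers filter rather than pick, but the sampling
# problem (1366 output pixels can't cleanly represent 1920 source
# pixels) is the same.

def nearest_resample(src, out_len):
    n = len(src)
    return [src[i * n // out_len] for i in range(out_len)]

# 1920 pixels of alternating black/white: a stand-in for 1-px text strokes.
stripes = [i % 2 for i in range(1920)]
scaled = nearest_resample(stripes, 1366)

# After the non-integer downscale the pattern is no longer strictly
# alternating: adjacent equal pixels mark where a column was skipped.
breaks = sum(1 for a, b in zip(scaled, scaled[1:]) if a == b)
print(f"{breaks} broken stripe pairs out of {len(scaled) - 1}")
```

Every "break" is a spot where fine detail smears or vanishes, which is why 1080p text on a 768-line panel looks so bad, and why rendering at the panel's native resolution avoids the problem entirely.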
October 18, 2010 3:00:26 PM

I know on my TV (a Samsung UNC7000), if the input mode is not "DVI PC", the text is damn near unreadable. So there is a scaling issue involved that needs to be resolved on your particular set.
October 19, 2010 5:30:58 PM

mister g said:
RGB only supports 480i signals while DVI can go all the way up to 1080p. Did you set the TV resolution correctly?

VGA (RGB) has no such limit; it can go up to 2048x1536. The RGB input can handle 1080 easily.

DVI is TECHNICALLY supposed to have a sharper, crisper image, but both are capable of the same resolutions.

I don't know where you got 480 from, lol.
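A quick pixel-clock sanity check backs this up. Single-link DVI tops out at a 165 MHz pixel clock, and the standard CEA-861 timing for 1080p60 is 2200x1125 total pixels (including blanking) at 60 Hz — those totals are standard published figures, not something stated in this thread — which needs only 148.5 MHz, so DVI carries 1080p comfortably.

```python
# Back-of-the-envelope check: a mode fits single-link DVI if
# h_total * v_total * refresh <= 165 MHz. Totals include blanking;
# 2200 x 1125 is the standard CEA-861 timing for 1080p60.

DVI_SINGLE_LINK_HZ = 165_000_000

def pixel_clock(h_total, v_total, refresh_hz):
    """Pixel clock in Hz for a mode with the given total timings."""
    return h_total * v_total * refresh_hz

clk = pixel_clock(2200, 1125, 60)
print(clk, clk <= DVI_SINGLE_LINK_HZ)  # 148.5 MHz, fits single-link DVI
```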
October 19, 2010 5:41:10 PM

welshmousepk said:
It's probably not a 1080p television. Most cheaper HDTVs aren't really HD at all: they have a max native resolution of around 1366x768 and downscale (and interlace) higher-res images, so a 1080p image ends up looking horrible.

Use a 1366x768 resolution and you should get a much better image. (You may need to force an unsupported resolution, since the EDID won't report it properly, but you can do that in the display driver.)

It is a 1080 TV. I always do my homework before I buy something, though maybe the cables escaped me, lol. I had no reason to believe it was such an issue.

LG - 32" Class / 1080p / 60Hz / LCD HDTV Model 32LD450

The picture I get with RGB is a fabulous HD picture, and the real reason I wanted to change cables was that I needed the RGB cable somewhere else. I thought that as long as I needed to buy a new cable, I might as well get a DVI-to-HDMI cable. Hehe, that logic didn't work out so well.

Your second reply, about scaling, sounds like the issue. This TV has a lot of picture options for the RGB input but only about four for HDMI (aspect ratios), and none of them were any good at all.
October 19, 2010 8:43:38 PM

Does the Catalyst Control Center recognize the TV and set it to the default resolution?
October 20, 2010 1:55:09 AM

Yes, but there's a MASSIVE difference between 1080i and 1080p.

Is the TV progressive scan at 1920x1080?
October 20, 2010 2:46:39 AM

Don't all LCD TVs scan lines progressively? I thought only CRTs (other than computer monitors) had to do interlaced scanning.
October 20, 2010 3:04:08 AM

Nope, some HDTVs are progressive scan, while others are interlaced (with a native res usually around 1366x768 or 1280x720).

That's how they fool people: buyers think they're getting a nice HDTV for playing their HD content, when in fact it will look horrible at 1920x1080.