Does HD ready mean less pixel density?


angelo88

Distinguished
Feb 13, 2010
24
0
18,510
Sorry for the repost, I'm not sure exactly which forum to put this question in.

Hey guys, I know this discussion is pretty much all over the web, but I haven't been able to find a straight-up answer to what I'm looking for.

Does HD Ready mean that the TV has a lower pixel density than a Full HD TV? I've been reading around, and some say this is the case, and that hooking it up to a 1080p source (e.g. a PS3 or PC) will downscale the image. Others say the difference is just the lack of a TV tuner, and that display-wise the TV itself is capable of showing full 1080p resolution.

I'm planning to upgrade my monitor to a 32-inch screen, and I'm primarily going to use it for PC gaming and watching HD movies from my PC, with TV on the side (although non-HD channels). My concern is, will my PC gaming and movie watching suffer? Will the resolution really be less than what I'm using now (1680x1050), and thus show less detail? Or is it true that it's just the lack of a TV tuner that makes the difference, and pixel-density-wise the HD Ready TV will display full 1080p when hooked up to my PC?

Obviously I'm such a noob at this stuff, I'm not even sure I'm using the terms in the right context.

Thanks in advance for the help!
 

MagicPants

Distinguished
Jun 16, 2006
1,315
0
19,660
Don't put much stock in the terms "HD Ready" or "Full HD"; they don't have consistent meanings. The terms that matter are 720p and 1080p.

720p is 1280x720
1080p is 1920x1080

The "p" stands for progressive scan, as opposed to "i", which stands for interlaced, and interlaced is not something you want.

A monitor/TV is said to support 720p if its resolution is at least 1280x720, and 1080p if its resolution is at least 1920x1080. You can just compare the resolutions directly to find out which one can show more detail.
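
If you want to put actual numbers on "pixel density", here's a quick Python sketch (the panel sizes and the 1366x768 figure for a typical "HD Ready" set are just illustrative examples, not any specific model):

import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative panels -- not specific models.
panels = [
    ("22in 1680x1050 monitor",      1680, 1050, 22),
    ("32in 1366x768 'HD Ready' TV", 1366,  768, 32),
    ("32in 1920x1080 'Full HD' TV", 1920, 1080, 32),
]

for name, w, h, d in panels:
    print(f"{name}: ~{ppi(w, h, d):.0f} PPI")

That comes out to roughly 90 PPI for a 22" 1680x1050 monitor, about 49 PPI for a 32" 1366x768 panel, and about 69 PPI for a 32" 1080p panel. So if the 32" set really is a 1366x768 "HD Ready" panel, its pixel density is noticeably lower than what you have now, and a 1080p source will get scaled down to fit it.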

The basic differences between a monitor and a television are:

1) A television indeed has a TV tuner.
2) Televisions generally have image processing to make bad sources look better.
3) Televisions have a bit more input lag, which you might notice when moving the mouse around.
4) Televisions are designed to look good from 6 to 15 feet away, while monitors are designed for around 20 inches. Cheap televisions in particular might not look right up close.
 
Solution
Agree... the terms they use on new televisions and monitors are downright frustrating; for anyone who isn't technically inclined it's a nightmare. For a monitor you want 1080p, nothing less.

Agree... you want progressive scan, not interlaced. That's pretty standard by now.

1) Agree.
2) Agree.
3) On 60 Hz TVs, not likely; I use one. On 120 Hz TVs I've heard there can be some input lag issues, but I can't tell you how bad they are.
4) I use a 40" screen from 3.5-4 feet away and it looks fine. I wouldn't go any bigger than that unless you wanted to sit further away. At computer-monitor distances I wouldn't go larger than 26-30" unless you wanted to push the monitor back a little bit (rough numbers on this below).

Avoid Vizio, Olevia, and other cheap televisions if you can. The price may look right, but it's usually not worth it in the end unless you just need a cheap solution for now and aren't worried about longevity.
 