Is HD 720 (Ready) better than my monitor? And some other questions

l_r_c_t

Distinguished
Apr 8, 2009
275
0
18,790
Hey Everybody:

I'd like to discuss some subjects regarding display resolution, and basic differences between the formats.

The best resolution my monitor (a Xerox) can show, according to its specifications, is 1680x1050 (ratio 16:10). In terms of display resolution, my monitor shows a better image than HD 720 (also known as "HD Ready"), which is 1280x720 (ratio 16:9). So how come HD is considered better? Or is it only the marketing that made such a big deal of HD?

Why isn't QSXGA, which has the best display resolution, being sold in stores instead of HD technology? Its resolution is better even than HD 1080 (also known as "Full HD"), which is 1920x1080 (ratio 16:9).

What is the difference between 1080p and 1080i? There's no need for a very deep explanation on this last question, I just want to know in general.

These questions might sound (or read) dumb, or out of date, but I would still like to get answers to them.

The whole field of resolutions and displays is very big and complicated, and not commonly well understood. So please, only reply if you really know about it; if not, you're welcome to read the replies and pick up some knowledge.

Thank you.

 
The best resolution my monitor (a Xerox) can show, according to its specifications, is 1680x1050 (ratio 16:10). In terms of display resolution, my monitor shows a better image than HD 720 (also known as "HD Ready"), which is 1280x720 (ratio 16:9). So how come HD is considered better? Or is it only the marketing that made such a big deal of HD?

1280x720 is considered "better" by people who don't know better. Computer monitors have long supported higher resolutions than commercial TVs (640x480 = 480p), so a lot of the HD hype is just marketing.
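
If you want to see it in numbers, here's a quick back-of-the-envelope Python sketch comparing total pixel counts for the resolutions mentioned in this thread (just an illustration, nothing more):

# Quick comparison of total pixel counts for the resolutions discussed above.
resolutions = {
    "HD 720 (HD Ready)": (1280, 720),
    "Your monitor (16:10)": (1680, 1050),
    "HD 1080 (Full HD)": (1920, 1080),
    "QSXGA": (2560, 2048),
}

for name, (w, h) in resolutions.items():
    print(f"{name:22s} {w}x{h} = {w * h:,} pixels")

# Output (roughly):
#   HD 720 (HD Ready)      1280x720  =   921,600 pixels
#   Your monitor (16:10)   1680x1050 = 1,764,000 pixels
#   HD 1080 (Full HD)      1920x1080 = 2,073,600 pixels
#   QSXGA                  2560x2048 = 5,242,880 pixels

So yes, your 1680x1050 monitor pushes almost twice as many pixels as a 720p TV.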

Why isn't QSXGA, which has the best display resolution, being sold in stores instead of HD technology? Its resolution is better even than HD 1080 (also known as "Full HD"), which is 1920x1080 (ratio 16:9).

Lack of demand (people JUST got 1080p TVs), cost, size, and frankly, the available bandwidth to transmit the image all play a part.
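
To give a rough idea of the bandwidth side, here's a small sketch of the raw, uncompressed data rate for each format, assuming 24 bits per pixel and 60 frames per second (real broadcasts are heavily compressed, so these are only ballpark figures):

# Rough uncompressed data rate: width * height * bits_per_pixel * frames_per_second
def raw_bandwidth_gbps(width, height, bpp=24, fps=60):
    return width * height * bpp * fps / 1e9  # gigabits per second

for name, (w, h) in [("HD 1080 (Full HD)", (1920, 1080)),
                     ("QSXGA", (2560, 2048))]:
    print(f"{name}: ~{raw_bandwidth_gbps(w, h):.1f} Gbit/s uncompressed")

# HD 1080 (Full HD): ~3.0 Gbit/s uncompressed
# QSXGA: ~7.5 Gbit/s uncompressed

Pushing two and a half times the raw data of 1080p over broadcast or cable infrastructure simply isn't worth it for the market that exists.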

What is the difference between 1080p and 1080i? There's no need for a very deep explanation on this last question, I just want to know in general.

With progressive scan, an image is captured, transmitted and displayed in a path similar to text on a page: line by line, from top to bottom. The interlaced scan pattern in a CRT (cathode ray tube) display completes such a scan too, but only for every second line. This is carried out from the top left corner to the bottom right corner of the display. The process is then repeated, this time starting at the second row, in order to fill in the gaps left behind by the first pass over alternate rows.

Such a scan of every second line is called interlacing. A field is an image that contains only half of the lines needed to make a complete picture. The afterglow of the phosphor in CRTs, in combination with the persistence of vision, results in two fields being perceived as a continuous image. This allows full horizontal detail to be viewed with half the bandwidth that a full progressive scan would require, while maintaining the CRT refresh rate needed to prevent flicker.

CRTs and ALiS plasma panels can display interlaced video directly; other display technologies may require some form of deinterlacing. Modern CRT-based monitors used as computer displays utilize progressive scanning and thus also require deinterlacing.

http://en.wikipedia.org/wiki/Interlace

Basically, only half the lines are drawn in each pass, in order to save bandwidth.
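
If it helps, here's a tiny sketch of the idea in code: an interlaced signal sends the odd-numbered lines in one field and the even-numbered lines in the next, so each field carries only half of the frame (just an illustration, not how any real encoder works):

# Illustration: split one progressive frame (a list of scan lines)
# into the two interlaced fields that would be transmitted in turn.
frame = [f"line {i}" for i in range(8)]   # a tiny 8-line "frame"

field_1 = frame[0::2]   # lines 0, 2, 4, 6  (first field)
field_2 = frame[1::2]   # lines 1, 3, 5, 7  (second field)

print(field_1)  # half the lines -> half the bandwidth per field
print(field_2)

# A progressive (p) signal sends all 8 lines in every frame;
# an interlaced (i) signal alternates between field_1 and field_2.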
 
Solution

l_r_c_t

Distinguished
Apr 8, 2009
275
0
18,790
So that means my monitor today (details in the first post) can display a better image than HD 720?

About the difference between "p" and "i": either you didn't explain it very well, or I didn't get it (sorry if so). The "i" stands for "interlaced"?

Thank you.
 

raiden-kun

Distinguished
Feb 18, 2010
39
0
18,530
Excuse me,
I agree with everything said here, but if HD is just marketing and monitors supported higher resolutions before HD, then why do people need HDMI cables? If that image quality was already available with standard monitor/TV cables, why use these new HDMI ones?
 

l_r_c_t

Distinguished
Apr 8, 2009
275
0
18,790
First of all, HD broadcasting on TV wasn't available until fairly recently (the last 1-2.5 years).

HDMI carries the same digital video signal as DVI; the difference is that over the HDMI interface you can transmit audio in addition to video. That's useful when you want to connect a DVD player/home theater/computer to a TV using only one cable.