Full HD resolution? HD resolution? HD Ready resolution?

Deus Gladiorum

Here are the main resolutions:

High Definition (HD) / HD Ready - Also known as 720p. True 720p is a resolution of 1280 pixels horizontally and 720 pixels vertically, put on screen progressively, hence "720p". However, most monitors and televisions today that are marketed as 720p actually have a slightly higher resolution of 1366x768. It'd be more accurate to call it 768p, but no one does.

High Definition Plus (HD+) - Also known as 900p. It's a resolution of 1600x900 being put on screen progressively. A nice middle ground between 720p and 1080p.

Full High Definition (FHD) - Also known as 1080p. It's a resolution of 1920x1080 being put on screen progressively, and it's the most common resolution for PC gamers, somewhat common for businesses, and used by a lot of televisions. It's exactly 2.25 times the number of pixels of true 720p (1280x720). Be careful when it comes to televisions, though, because a huge number of television manufacturers will list a 1366x768 television as "1080p". Yeah, terrible people.

Full HD Interlaced (1080i) - The resolution is the same as 1080p -- 1920x1080. The difference is that pixels aren't put on screen progressively, but are instead interlaced. I'll explain that later, but everything that has a "p" at the end of it (e.g. 720p, 1080p, 900p) is being drawn progressively, and everything that has an "i" at the end of it (e.g. 1080i) is being interlaced.

Quad High Definition (QHD) - A resolution of 2560x1440, or 1440p. It's exactly 4 times the number of pixels of true 720p, hence the name. I don't think any televisions are made at this resolution; it's really just a monitor thing for gamers, I believe.

Ultra High Definition (UHD/4K) - A resolution of 3840x2160, or 2160p. It's called "4K" because it has almost 4,000 horizontal pixels (3840, in actuality). It has exactly 4 times the total number of pixels of 1080p/1080i, and exactly 9 times the number of pixels of true 720p.
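
If you want to double-check those pixel-count ratios (the 2.25x, 4x, and 9x figures above), here's a quick Python sketch. It's just arithmetic on the resolutions listed above; the labels are only there for readability.

```python
# Pixel counts for the common resolutions above, compared against true 720p.
resolutions = {
    "720p / HD":           (1280, 720),
    "768p (typical 'HD')": (1366, 768),
    "900p / HD+":          (1600, 900),
    "1080p / FHD":         (1920, 1080),
    "1440p / QHD":         (2560, 1440),
    "2160p / UHD 4K":      (3840, 2160),
}

base = 1280 * 720  # true 720p pixel count

for name, (w, h) in resolutions.items():
    pixels = w * h
    ratio = pixels / base
    print(f"{name:<20} {w}x{h} = {pixels:>9,} pixels ({ratio:.2f}x true 720p)")
```

Running it shows 1080p at 2.25x, 1440p at 4.00x, and 2160p at 9.00x the pixels of true 720p, matching the figures above.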

Progressive and Interlaced:

In short, progressive is always better. Always. But to explain it more...

Progressive ("XXXXp") - Progressive video is video that is drawn all at once. When an image comes on screen, every pixel on screen updates to display that image at the same time. Let's say you're watching a movie and we take a look at three sequential frames. Before the first frame comes on, there's only a black screen. When the first frame does display, every pixel will be drawn on screen at once. When the second frame displays, every pixel on screen updates at once to match the image in the second frame. And of course, in the third frame, again every pixel updates to match the image.

Interlaced ("XXXXi") - Interlaced video is video that is not drawn all at once. Instead, every other line of pixels is drawn one after the other. Let's take the same movie and look at the same three sequential frames, but now say the frames are drawn in an interlaced format. We start out with the same black screen. When the first frame finally displays, only the odd lines of your screen are updated, so the 1st, 3rd, 5th, 7th, etc. rows of pixels are updated to match the first frame. However, for this first frame the even lines of pixels (2nd, 4th, 6th, 8th, etc.) are not updated. Instead, since the screen was black before the first frame, the even rows of pixels are still just black and are missing information. When the 2nd frame is displayed, now the even rows of pixels will be updated to match the 2nd frame. However, the odd rows of pixels aren't updated and instead remain the same as they were in the first frame. The odd rows won't be updated again until the 3rd frame, but as you may have guessed, when the 3rd frame comes around, the even rows of pixels will not be updated and will remain the same as they were in the 2nd frame. So essentially, half of your rows of pixels are always a frame behind.

This brings about several effects: first, there is a motion blur effect, since half the rows are always behind in time. Second, there are "artifacts", basically stray lines that plague your image, since you're getting one half from each of two different frames overlapping one another. Third, motion is much less smooth, since each half of your image is always skipping forward a whole frame before you get to see it. Here's an example of progressive vs interlaced:

[Animated GIF: interlaced vs. progressive comparison]
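
To make that "half the rows are a frame behind" idea concrete, here's a toy Python simulation. It's purely illustrative (a 6-row "screen" holding frame numbers instead of real pixels), not how any actual display or deinterlacer works:

```python
# Toy model: a progressive screen refreshes every row each frame,
# while an interlaced screen alternates between the odd rows
# (1st, 3rd, 5th, ...) and the even rows (2nd, 4th, 6th, ...),
# so half the rows are always one frame behind.

ROWS = 6  # tiny "screen" so the output stays readable

def progressive_update(frame):
    # Every row is redrawn from the new frame at once.
    return list(frame)

def interlaced_update(screen, frame, frame_number):
    # Frame 1 draws the odd rows, frame 2 the even rows, and so on;
    # rows not in this field keep whatever they showed before.
    updated = list(screen)
    start = 0 if frame_number % 2 == 1 else 1  # index 0 is the "1st" (odd) row
    for r in range(start, ROWS, 2):
        updated[r] = frame[r]
    return updated

# Three frames; every row of frame n just holds the value n.
frames = [[n] * ROWS for n in range(1, 4)]

prog = [0] * ROWS   # both screens start out black (0)
inter = [0] * ROWS

for n, frame in enumerate(frames, start=1):
    prog = progressive_update(frame)
    inter = interlaced_update(inter, frame, n)
    print(f"after frame {n}: progressive rows = {prog}  interlaced rows = {inter}")
```

The progressive rows always show a single frame, while the interlaced rows always mix two neighbouring frames, which is exactly where the stray-line artifacts and the choppier motion described above come from.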




Anyway, hope that helps. I took too long to write this, and am now feeling quite sad that I took so long, so now I'm off to bed.
 