What makes a card and monitor HD? Resolutions?

drthrd

Distinguished
May 4, 2010
When I was looking around for monitors, I saw several that said they would do 1920x1080. Some of them said they would do 1080p and others did not say. If both will do the same resolution, what is the difference? I noticed that my GeForce4 MX 4000 will do 1920x1080, but it does not say it will do 1080p or 720p; it just says it will do HD.
 
The resolutions would be the same.

Some units will paint half the lines every cycle, while others will do all of them every cycle. The suffix "i" stands for interlaced, meaning only alternate lines are drawn each cycle, while "p" stands for progressive, meaning every line is drawn each cycle.
I might be suspicious of a unit that doesn't say "p", but it might just be a simple omission. There are also monitors that do 1920x1200. The 1080p units are cheaper because they use TV panels, which are made in larger numbers.
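To make the difference concrete, here is a minimal Python sketch (not from either post, just an illustration; the function name lines_per_refresh is made up) showing how many lines a 1080p versus a 1080i display paints on each refresh cycle:

```python
# Illustrative sketch: lines painted per refresh for progressive vs. interlaced.

def lines_per_refresh(vertical_resolution: int, progressive: bool) -> int:
    """Return how many horizontal lines are drawn in one refresh cycle."""
    if progressive:
        # Progressive ("p"): every line of the frame is drawn each cycle.
        return vertical_resolution
    # Interlaced ("i"): only half the lines (one field) are drawn per cycle,
    # alternating between the odd lines and the even lines.
    return vertical_resolution // 2

print(lines_per_refresh(1080, progressive=True))   # 1080p -> 1080 lines per cycle
print(lines_per_refresh(1080, progressive=False))  # 1080i -> 540 lines per field
```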
 

jb6684

Distinguished
Computer displays are all "p", progressive scan. This system draws all the lines of information in the frame in one pass. This applies to both video cards and computer monitors. In the case of 1920x1080, that would be 1080 horizontal lines per frame.

Only TV and video displays support any kind of "i", interlaced scanning. This system draws all the odd lines of information and displays them; this is called a field. Next, all the even lines are drawn. These two fields together make up a frame.
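Here is a small Python sketch (purely illustrative, using a made-up 10-line frame) of how interlaced scanning splits a frame into an odd field and an even field and then recombines the two fields into the full frame:

```python
# Hypothetical 10-line "frame" used to illustrate interlaced fields.
frame = [f"line {n}" for n in range(1, 11)]

odd_field = frame[0::2]    # lines 1, 3, 5, ... drawn and displayed first
even_field = frame[1::2]   # lines 2, 4, 6, ... drawn in the next pass

# Interleaving the two fields reconstructs the complete frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = odd_field
rebuilt[1::2] = even_field
assert rebuilt == frame
```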