LCD resolutions / aspect ratios

Archived from groups: comp.sys.ibm.pc.hardware.video

Over the next few months I'll be in the market for a high-end widescreen
display, around 23/24" in size. Why are all widescreen LCD monitors 16:10
aspect ratio instead of 16:9, which is also the HDTV spec? I think driver
support has been around for ages for the likes of 1920 x 1080
resolution. In theory, when you're watching a DVD on a 16:10 monitor
you're either scaling to an incorrect ratio or getting black bars on
your monitor... is that right?
 

"gimp" <anonymous@smeg.com> wrote in message
news:d51c49$7jc$1@lust.ihug.co.nz...
> Over the next few months I'll be in the market for a high-end widescreen
> display, around 23/24" in size. Why are all widescreen LCD monitors 16:10
> aspect ratio instead of 16:9, which is also the HDTV spec? I think driver
> support has been around for ages for the likes of 1920 x 1080
> resolution. In theory, when you're watching a DVD on a 16:10 monitor
> you're either scaling to an incorrect ratio or getting black bars on
> your monitor... is that right?

Yes, that's right.

16:10 was established fairly early on in the first "widescreen"
CRT monitors (such as the Sony GDM-W900 line), as it
permitted the display of two pages of text side-by-side at the
normal aspect ratio of the printed page. It was also a fair match
to the 16:9 aspect ratio which was starting to become the
widescreen TV standard. When wide LCDs were first being
developed, they also adopted the 16:10 (or the similar 15:9)
aspect ratio.

If you're displaying widescreen video on a 1920 x 1200 panel,
the best way (IMHO) to do this is as a 1920 x 1080 image
with 60 unused pixels top and bottom. Why people feel that
"black bars" are necessarily a bad thing is beyond me - and
this way, you see the original image at precisely the correct
format.
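The arithmetic behind those 60-pixel bars can be sketched as follows; this is just an illustrative helper (not anything from the original post) showing how a width-fit scale of a 16:9 frame onto a 16:10 panel leaves equal black bars top and bottom:

```python
def letterbox_bars(panel_w, panel_h, video_w, video_h):
    """Return (scaled_h, bar_top, bar_bottom) when the video is
    scaled to fill the panel's full width."""
    # Height of the video after scaling it to the panel width
    scaled_h = panel_w * video_h // video_w
    # Rows left over on the panel become the letterbox bars
    unused = panel_h - scaled_h
    return scaled_h, unused // 2, unused - unused // 2

# 1920 x 1080 (16:9) video on a 1920 x 1200 (16:10) panel:
print(letterbox_bars(1920, 1200, 1920, 1080))  # (1080, 60, 60)
```

The same math gives 40-pixel bars for 1280 x 720 video on a 1280 x 800 panel, and so on: any 16:9 image on a 16:10 panel of the same width leaves 1/12 of the panel height unused.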

Pro graphics/video applications also like the 16:10 AR, as this
permits a full-screen 16:9 image (again at its native 1920 x 1080
format) while leaving sufficient screen space for editing controls
and the like.

Bob M.
 

"Bob Myers" <nospamplease@address.invalid> wrote in message
news:S9dde.4780$2_5.2815@news.cpqcorp.net...
>
> "gimp" <anonymous@smeg.com> wrote in message
> news:d51c49$7jc$1@lust.ihug.co.nz...
>> Over the next few months I'll be in the market for a high-end widescreen
>> display, around 23/24" in size. Why are all widescreen LCD monitors 16:10
>> aspect ratio instead of 16:9, which is also the HDTV spec? I think driver
>> support has been around for ages for the likes of 1920 x 1080
>> resolution. In theory, when you're watching a DVD on a 16:10 monitor
>> you're either scaling to an incorrect ratio or getting black bars on
>> your monitor... is that right?
>
> Yes, that's right.
>
> 16:10 was established fairly early on in the first "widescreen"
> CRT monitors (such as the Sony GDM-W900 line), as it
> permitted the display of two pages of text side-by-side at the
> normal aspect ratio of the printed page. It was also a fair match
> to the 16:9 aspect ratio which was starting to become the
> widescreen TV standard. When wide LCDs were first being
> developed, they also adopted the 16:10 (or the similar 15:9)
> aspect ratio.
>
> If you're displaying widescreen video on a 1920 x 1200 panel,
> the best way (IMHO) to do this is as a 1920 x 1080 image
> with 60 unused pixels top and bottom. Why people feel that
> "black bars" are necessarily a bad thing is beyond me - and
> this way, you see the original image at precisely the correct
> format.
>
> Pro graphics/video applications also like the 16:10 AR, as this
> permits a full-screen 16:9 image (again at its native 1920 x 1080
> format) while leaving sufficient screen space for editing controls
> and the like.
>
> Bob M.
>
I think it's surprising how many people even "in the industry" are
not aware of this - at last year's VESA DI conference, a Microsoft
presenter asked the question, and I was the only one willing to
volunteer the answer (Bob, you were "home," so you probably didn't
hear the question).
What was more surprising was that when someone in the audience
asked for the answer to be repeated by the presenter, he gave an
entirely different one, sputtering something about memory usage...

My 2 cents

NGA (Not to be confused with the original poster, gimp)
 

"Not Gimpy Anymore" <nogimpREMOV@msn.com> wrote in message
news:faqde.171960$cg1.16728@bgtnsc04-news.ops.worldnet.att.net...
> I think it's surprising how many people even "in the industry" are
> not aware of this - at last years VESA DI conference, a Microsoft
> presenter asked the question, and I was the only one willing to
> volunteer the answer (Bob, you were "home" so probably did not
> hear the question).

No, I don't recall that one. Serves me right for not attending more
of these things in person...:)

I think a lot of people have unfortunately become used to the idea
that displayed imagery "should" fill the screen, based entirely on the
fact that CRT TVs traditionally overscanned (i.e., the displayed image
actually extends BEYOND the limits of the CRT screen, and the
program creators understand and allow for this, basically by making
sure that nothing of importance occupies the outer 5-10% of the
image area). What's less commonly understood is that overscan was
done in the first place to hide problems with the CRT technology of
the day (or at least with what could be implemented at reasonable
cost), namely ragged raster edges and the inability of the TV's CRT
display to hold the image size perfectly constant (due to HV
regulation issues and so forth). Ideally, you shouldn't run the
active image out to the edges of a CRT display, since image quality
deteriorates quite rapidly in the corners and at the extreme edges.
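The "safe area" practice described above can be sketched numerically. This is an illustrative helper only, with the overscan fraction as an assumed parameter - actual safe-area percentages varied by broadcast standard and era:

```python
def safe_area(width, height, overscan_fraction):
    """Inner rectangle left after trimming overscan_fraction
    from each edge of the image."""
    # Pixels assumed lost beyond each edge of the visible screen
    trim_w = int(width * overscan_fraction)
    trim_h = int(height * overscan_fraction)
    return width - 2 * trim_w, height - 2 * trim_h

# With 5% overscan per edge, a 720 x 480 frame keeps a
# 648 x 432 region where important content is reliably visible:
print(safe_area(720, 480, 0.05))  # (648, 432)
```

This is why, as described above, program creators kept titles and other essential material well inside the frame rather than trusting the outer border to be displayed.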

Bob M.