
1920 pixel native horizontal resolution

Last response: in Home Theatre
Anonymous
August 22, 2004 1:10:17 AM

Archived from groups: alt.tv.tech.hdtv

Is it true that your set has to have a display of 1920 pixel native
horizontal resolution to take full advantage of HDTV? If so, how much better
will the picture be?
Anonymous
August 22, 2004 1:10:18 AM


In article <ZSOVc.292576$a24.66717@attbi_s03>, rppb@comcast.net
says...
> Is it true that your set has to have a display of 1920 pixel native
> horizontal resolution to take full advantage of HDTV? If so, how much better
> will the picture be?

Yes, but in practice hardly anyone is transmitting HD DTV with 1920
resolution. Generally, most broadcasters are BW limiting to
something like 1400. Most current CRTs won't resolve 1920, since
they are using tubes designed for SD resolutions. The Sony XBR
series introduced higher resolution CRTs last year. Other
manufacturers may have followed. When consumers have access to HD DVD
and HD test pattern DVDs become available, I predict a lot of people
will learn some unhappy facts about what resolutions their monitors
are really capable of. This will also inroduce a demand for fully-
capable HD res monitors.

How much better is 1920 than 1400? Open question - most people have
never seen real 1920.
Anonymous
August 22, 2004 1:10:19 AM


Chris Thomas wrote:

> Yes, but in practice hardly anyone is transmitting HD DTV with 1920
> resolution. Generally, most broadcasters are BW limiting to
> something like 1400. ...

Chris,

PBS, NBC, CBS, HDNet, Showtime, HBO, etc. are all transmitted in 1080i, I
believe. This means that they are transmitted in 1080 x 1920, does it
not? The problem is that 720p or 768p displays only have (16/9) times
their vertical line count in horizontal pixels, so the 1080i signal must be
scaled and will therefore be displayed at a lesser resolution.

If we have a display that has a 1080 x 1920 native resolution, it has the
capacity to display the 720p, the 1080i, and also the 1080p (when
available).

This is my understanding of the situation.

Jerry
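(Editorial aside: the 16:9 arithmetic Jerry is appealing to can be checked directly. For square-pixel 16:9 formats, the horizontal pixel count is the vertical line count scaled by 16/9; the helper below is purely illustrative.)

```python
# For square-pixel 16:9 video formats, horizontal resolution is the
# vertical line count scaled by the 16:9 aspect ratio.
def horizontal_pixels(vertical_lines: int) -> int:
    return round(vertical_lines * 16 / 9)

print(horizontal_pixels(720))   # 1280
print(horizontal_pixels(1080))  # 1920
```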
Anonymous
August 22, 2004 1:10:20 AM


In article <4127ed88@news101.his.com>, jsheldonNOTTTHIS@his.com
says...
> Chris,
>
> PBS, NBC, CBS, HDNet, Showtime, HBO, etc are all transmitted in 1080i I
> believe. This means that they are transmitted in 1080 X 1920, does it
> not?

No, it does not. When an HD video signal is encoded, it is compressed
using the MPEG2 lossy compression scheme. Lossy is a technical term
meaning the original signal cannot be identically reconstructed from
the compressed signal. This is different from what is done with,
say, a ZIP file, because those would be useless if the original input
could not be recovered bit-identical. The advantage of lossy
compression is that much higher levels of compression can be achieved.

The MPEG2 standard includes several "profiles", or different
compression schemes. One preserves 1920 horizontal pixels. This
compression is typically used for in-studio use. However, for
broadcast, the alternate profile called "1440 high" is typically
used. This uses only 3/4 of the bandwidth of the 1920 profile, and
provides 1440 horizontal pixels when recovered. One reason is to
save transmission bandwidth for say a second subchannel. But the
main reason is to allow some headroom. If a piece of equipment in
the broadcast chain has even the slightest BW restriction, 1920 will
end up horribly distorted, while 1440 will survive.

Unfortunately, a full explanation of MPEG2 requires something like a
semester-long college-level engineering course with a lot of math. In
short, many of the commonly used HD mobile video cameras only support
1440, and most broadcasters today are using "1440 high" compression
for transmission. There is no easy way to tell what MPEG2 profile has
been applied to a given signal. The most direct way is to convert
the signal to analog and use a spectrum analyzer to see what the
highest frequency components present in the recovered signal are.
This is beyond the means of the average consumer.

/Chris, AA6SQ
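(Editorial aside: the spectrum-analyzer idea can be approximated in software once you have decoded samples: FFT a scanline and see where the energy stops. A hypothetical sketch with NumPy; the synthetic tone merely stands in for real recovered video.)

```python
import numpy as np

# A 1920-sample scanline that was band-limited to the equivalent of
# 1440 samples has no energy above 3/4 of the Nyquist frequency.
n = 1920
t = np.arange(n)
line = np.sin(2 * np.pi * 200 * t / n)   # tone well below the cutoff

spectrum = np.abs(np.fft.rfft(line))
peak_bin = int(np.argmax(spectrum))
cutoff_bin = n * 3 // 4 // 2             # 3/4 of Nyquist, in FFT bins
print(peak_bin, peak_bin < cutoff_bin)   # 200 True
```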
August 22, 2004 2:44:54 AM


> Yes, but in practice hardly anyone is transmitting HD DTV with 1920
> resolution. Generally, most broadcasters are BW limiting to
> something like 1400. Most current CRTs won't resolve 1920, since
> they are using tubes designed for SD resolutions. The Sony XBR
> series introduced higher resolution CRTs last year. Other
> manufacturers may have followed. When consumers have access to HD DVD
> and HD test pattern DVDs become available, I predict a lot of people
> will learn some unhappy facts about what resolutions their monitors
> are really capable of. This will also introduce a demand for fully-capable HD res monitors.

All this makes me wonder what the Japanese used to watch HDTV on when their analogue system started about 13 years ago.
August 22, 2004 3:47:16 AM


First, learn the difference between pixels and lines of resolution.


"rppb" <rppb@comcast.net> wrote in message
news:ZSOVc.292576$a24.66717@attbi_s03...
> Is it true that your set has to have a display of 1920 pixel native
> horizontal resolution to take full advantage of HDTV? If so, how much better
> will the picture be?
>
>
Anonymous
August 22, 2004 3:57:16 AM


Chris Thomas (cthomas@mminternet.com) wrote in alt.tv.tech.hdtv:
> The MPEG2 standard includes several "profiles", or different
> compression schemes. One preserves 1920 horizontal pixels. This
> compression is typically used for in-studio use. However, for
> broadcast, the alternate profile called "1440 high" is typically
> used. This uses only 3/4 of the bandwidth of the 1920 profile, and
> provides 1440 horizontal pixels when recovered.

Almost no station uses 1440x1080 as the MPEG settings. None in either
of the DMAs I receive do, based on the examination I have made of the MPEG
data carried in the ATSC stream.

Stations generally limit the bandwidth before the MPEG encode stage via
filtering.

--
Jeff Rife |
SPAM bait: | http://www.nabs.net/Cartoons/Dilbert/SalesToFriends.gif
AskDOJ@usdoj.gov |
spam@ftc.gov |
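(Editorial aside: for anyone curious how such an examination works, the coded picture size sits right after the MPEG2 sequence header start code 0x000001B3: 12 bits of horizontal size, then 12 bits of vertical size. A minimal sketch over a demultiplexed video elementary stream; the fragment below is synthetic.)

```python
# The MPEG2 sequence header starts with 0x000001B3, followed by
# 12 bits of horizontal size and 12 bits of vertical size.
def coded_size(es: bytes):
    i = es.find(b"\x00\x00\x01\xb3")
    if i < 0 or i + 7 > len(es):
        return None
    b = es[i + 4:i + 7]
    horizontal = (b[0] << 4) | (b[1] >> 4)
    vertical = ((b[1] & 0x0F) << 8) | b[2]
    return horizontal, vertical

# Synthetic header fragment claiming 1920x1080:
frag = b"\x00\x00\x01\xb3\x78\x04\x38"
print(coded_size(frag))  # (1920, 1080)
```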
Anonymous
August 22, 2004 4:20:10 AM


Chris Thomas wrote:

> No, it does not. When an HD video signal is encoded, it is compressed
> using the MPEG2 lossy compression scheme. Lossy is a technical term
> meaning the original signal cannot be identically reconstructed from
> the compressed signal. This is different from what is done with,
> say, a ZIP file, because those would be useless if the original input
> could not be recovered bit-identical. The advantage of lossy
> compression is that much higher levels of compression can be achieved.
>
> The MPEG2 standard includes several "profiles", or different
> compression schemes. One preserves 1920 horizontal pixels. This
> compression is typically used for in-studio use. However, for
> broadcast, the alternate profile called "1440 high" is typically
> used. This uses only 3/4 of the bandwidth of the 1920 profile, and
> provides 1440 horizontal pixels when recovered. One reason is to
> save transmission bandwidth for say a second subchannel. But the
> main reason is to allow some headroom. If a piece of equipment in
> the broadcast chain has even the slightest BW restriction, 1920 will
> end up horribly distorted, while 1440 will survive.

Interesting. Since there are few HD TVs out there that can handle the
full 1080x1920 pixel resolution, it makes sense for the broadcasters to
compress to 1440 horizontal. Once full blown 1080x1920 fixed pixel TVs
become common (not for a few years), I expect people will complain but
to no avail. I can see down the road a "Superbit"-equivalent HD DVD/Blu-ray
release with a full 1920x1080p encoding - at a higher price, of course.

> Unfortunately, a full explanation of MPEG2 requires something like a
> semester-long college-level engineering course with a lot of math. In
> short, many of the commonly used HD mobile video cameras only support
> 1440, and most broadcasters today are using "1440 high" compression
> for transmission. There is no easy way to tell what MPEG2 profile has
> been applied to a given signal. The most direct way is to convert
> the signal to analog and use a spectrum analyzer to see what the
> highest frequency components present in the recovered signal are.
> This is beyond the means of the average consumer.
>
> /Chris, AA6SQ

Compression algorithms are not my area of expertise, but I do a fair
amount of DSP software algorithm work for radars and other sensors. I will
have to read up on the MPEG formats sometime.

Alan
Anonymous
August 22, 2004 8:34:45 AM


> Interesting. Since there are few HD TVs out there that can handle the
>full 1080x1920

I was under the impression that it was 1920x1080, and not the other way around.
We should all recognize that they are NOT the same thing, as each number
indicates a specific "line-bearing" - either horizontal or vertical.

>pixel resolution

Can there be such a thing? I thought resolution on LCD and DLP sets was
measured by millions of pixels on screen, not by line-presence as it would be
with tube and CRT-lens based RPTVs.

>it makes sense for the broadcasters to
>compress to 1440 horizontal

It has more to do with the need for bandwidth and the limitations of older
broadcast equipment than it does with the capability of modern TVs. Most HD
and many ED sets can downconvert 1080i when given it.
Anonymous
August 22, 2004 10:30:49 AM


In article <cO-dnSrYs7L-grXcRVn-rQ@comcast.com>,
Alan Figgatt <afiggatt@comcast.net> writes:
> Chris Thomas wrote:
>
>> No, it does not. When an HD video signal is encoded, it is compressed
>> using the MPEG2 lossy compression scheme. Lossy is a technical term
>> meaning the original signal cannot be identically reconstructed from
>> the compressed signal. This is different from what is done with,
>> say, a ZIP file, because those would be useless if the original input
>> could not be recovered bit-identical. The advantage of lossy
>> compression is that much higher levels of compression can be achieved.
>>
>> The MPEG2 standard includes several "profiles", or different
>> compression schemes. One preserves 1920 horizontal pixels. This
>> compression is typically used for in-studio use. However, for
>> broadcast, the alternate profile called "1440 high" is typically
>> used. This uses only 3/4 of the bandwidth of the 1920 profile, and
>> provides 1440 horizontal pixels when recovered. One reason is to
>> save transmission bandwidth for say a second subchannel. But the
>> main reason is to allow some headroom. If a piece of equipment in
>> the broadcast chain has even the slightest BW restriction, 1920 will
>> end up horribly distorted, while 1440 will survive.
>
> Interesting. Since there are few HD TVs out there that can handle the
> full 1080x1920 pixel resolution, it makes sense for the broadcasters to
> compress to 1440 horizontal.
>
1440x1080 is NOT part of the ATSC spec. Don't expect that your TV
set (or STB) will be able to decode any form of 1440x1080. It is indeed
quite possible for a 1920x1080i signal to be pre-filtered to a bandwidth
that would be roughly equivalent to that which would be provided by a
1440H sampling structure. (That is, the horizontal frequency response would
be filtered such that the amount of information doesn't exceed what a
1440H sampling structure can represent.) It is also quite possible that
the vertical frequency response (even on 720p) be limited to less detail
than representable by the scanning structure. This 'smoothing' effect can
mitigate some uglifying that results from additional aliasing that crops
up due to the non-linear effects of truncating the DCT coefficients. (That
'aliasing' is part of what appears to make the DCT/quantization schemes
like DV or MPEG cause 'stairstepping' or 'aliasing' even though the theoretical
video frequency response is pre-filtered to avoid the aliasing effects.)

When doing the pre-filtering, and given any problems with having adequate
payload to properly represent a video signal, that pre-filtering will have
an effect that will generally IMPROVE the video quality. This is because
the MPEG2 encoder will see less 'randomness' in the signal, and the DCT
coefficients will be better concentrated towards DC. The bad effects of
trying to encode too much detail will tend to disappear. So, one should
NOT be disappointed by the lack of a 1440x1080 mode, because there will
definitely be improvement of overly complex video scenes by doing some
prefiltering down to an appropriate level. Frankly, that choice of dropping
the MPEG encoding to 1440H from 1920H might be less optimal, because in
SOME cases, the dynamic removal of VERTICAL detail in the frequency domain
might solve more of the encoding problems than removal of horizontal detail
in the frequency domain.

So, the pre-filtering can actually IMPROVE the video quality, and might even
IMPROVE the true visible detail by eliminating the confounding effects of
aliasing (non-linear processing) that is natural with MPEG/DV25/DV50 or other
DCT based redundancy removal techniques. I am not sure that the tradeoffs
where the LOT type schemes are used are quite the same. I'd suspect that
the lapped transform schemes would be more resistant to the ugly aliasing
effects, with only a slight (probably invisible) increase of 'ringing' effects.
(LOTs definitely help with the blocky uglifying effects.)

When doing the analog pre-filtering, it is important that the filter maintain
a time delay vs. frequency such that transients do not spread. Even a
'flat' frequency response with a messed up *group delay* will 'spread' with
an effect that is different than, but similar to in some ways to a narrower
frequency response. Such 'pre-filtering' as supplied by pro-gear will just
be done 'correctly' and would be selectable by a configuration parameter
or would be an artifact of the design itself (e.g. the original HDCAM.)
(One common example that looks similar to bad group delay might be the
manifestation by most consumer VCR gear of the 'spread' that appears with
large black to white areas and noting the spreading of the black into
the white area, even though the VCR freq response doesn't expose that
weird response characteristic. For FM VCRs, actually, that spreading might
come from inadequate handling of the FM signal, causing a nonlinear effect
that is similar to bad group delay in linear filters.) One difference
between my really pro video gear and the cheapo FM luma VCRs (e.g. SVHS, Hi8)
is that such black to white transitions are so perfect that they look like
the direct output of a signal generator.

If one was doing a filter (for the first time) that would 'look good' for
video, a good first start would be to look at the bessel filters. Once the
filter transfer function droops vs. frequency, then the need for absolutely
flat group delay at that frequency becomes a little less important. Adding a
notch at a frequency where the video signal might be at -20dB in the transfer
function MIGHT be helpful to limit the bandwidth at some expense of video
fidelity.

John
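(Editorial aside: a digital analogue of John's group-delay point is easy to demonstrate. A symmetric, linear-phase FIR low-pass has exactly constant group delay, so transitions are not smeared asymmetrically. A windowed-sinc sketch; the tap count and cutoff are arbitrary illustrations, not broadcast values.)

```python
import numpy as np

# Windowed-sinc FIR low-pass; symmetric taps <=> linear phase <=>
# perfectly flat group delay of (num_taps - 1) / 2 samples.
def lowpass_fir(num_taps: int, cutoff: float) -> np.ndarray:
    n = np.arange(num_taps) - (num_taps - 1) / 2
    taps = np.sinc(2 * cutoff * n) * np.hamming(num_taps)
    return taps / taps.sum()             # unity gain at DC

taps = lowpass_fir(31, 0.2)
print(bool(np.allclose(taps, taps[::-1])))  # True: symmetric
```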
Anonymous
August 22, 2004 12:19:02 PM


>In article <4127ed88@news101.his.com>, jsheldonNOTTTHIS@his.com
>says...
>> Chris,
>>
>> PBS, NBC, CBS, HDNet, Showtime, HBO, etc are all transmitted in 1080i I
>> believe. This means that they are transmitted in 1080 X 1920, does it
>> not?
>
>No, it does not. When an HD video signal is encoded, it is compressed
>using the MPEG2 lossy compression scheme. Lossy is a technical term
>meaning the original signal cannot be identically reconstructed from
>the compressed signal. This is different from what is done with,
>say, a ZIP file, because those would be useless if the original input
>could not be recovered bit-identical. The advantage of lossy
>compression is that much higher levels of compression can be achieved.
>
>The MPEG2 standard includes several "profiles", or different
>compression schemes. One preserves 1920 horizontal pixels. This
>compression is typically used for in-studio use. However, for
>broadcast, the alternate profile called "1440 high" is typically
>used. This uses only 3/4 of the bandwidth of the 1920 profile, and
>provides 1440 horizontal pixels when recovered. One reason is to
>save transmission bandwidth for say a second subchannel. But the
>main reason is to allow some headroom. If a piece of equipment in
>the broadcast chain has even the slightest BW restriction, 1920 will
>end up horribly distorted, while 1440 will survive.
>
>Unfortunately, a full explanation of MPEG2 requires something like a
>semester-long college-level engineering course with a lot of math. In
>short, many of the commonly used HD mobile video cameras only support
>1440, and most broadcasters today are using "1440 high" compression
>for transmission. There is no easy way to tell what MPEG2 profile has
>been applied to a given signal. The most direct way is to convert
>the signal to analog and use a spectrum analyzer to see what the
>highest frequency components present in the recovered signal are.
>This is beyond the means of the average consumer.
>
>/Chris, AA6SQ

So what you are basically saying is that MPEG2 essentially resizes the image in
the form that you are describing. But how is that possible without
distorting the original widescreen aspect ratio? For that to be the case,
the 1080 portion would need to be resized too, wouldn't it?

In either case, what you are describing is not compression but simply a
resizing of the image. When you convert a still image to a .jpg or compressed
.tif you are not altering the dimensions of the original image in any way.
What's the point of a compression method that does so with video?
Anonymous
August 22, 2004 12:23:10 PM


>Can there be such a thing? I thought resolution on LCD and DLP sets was
>measured by millions of pixels on screen, not by line-presence as it would be
>with tube and CRT-lens based RPTVs.
>

Resolution on LCDs is measured by the number of pixels horizontally and
vertically. Some advertisers like to make things seem impressive by giving the
total number of pixels. Kind of like with digital cameras.
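(Editorial aside: the "millions of pixels" figure advertisers quote is just width times height; a trivial check:)

```python
# Advertised "total pixels" is just the product of the two counts.
def total_pixels(width: int, height: int) -> int:
    return width * height

print(total_pixels(1280, 720))   # 921600
print(total_pixels(1920, 1080))  # 2073600
```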
Anonymous
August 22, 2004 10:25:23 PM


John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
> 1440x1080 is NOT part of the ATSC spec. Don't expect that your TV
> set (or STB) will be able to decode any form of 1440x1080.

Actually, it can.

There are two different resolution settings on the MPEG encoding, and it
would be possible (although silly) to keep the 1920x1080 setting in one
that would keep the STB happy, but only put 1440x1080 pixels in the
"meaningful" part.

Since many, many stations send 1920x1080 with 1920x1088 as the "meaningful"
part (which is just a mistake on their part), I'm pretty sure that most
STBs can handle this sort of setup.

--
Jeff Rife | "Hey, Brain, what do you wanna do tonight?"
SPAM bait: |
AskDOJ@usdoj.gov | "The same thing we do every night, Pinky...
spam@ftc.gov | try to take over the world."
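(Editorial aside: the 1088 figure is no accident. MPEG2 codes the picture in 16x16 macroblocks, so the coded height rounds up to a multiple of 16, and the padding rows are meant to be cropped on display. A quick sketch of the rounding:)

```python
# MPEG2 codes pictures in 16x16 macroblocks, so coded dimensions
# round up to a multiple of 16; display size crops the padding rows.
def coded_dimension(display: int, block: int = 16) -> int:
    return ((display + block - 1) // block) * block

print(coded_dimension(1080))  # 1088
print(coded_dimension(1920))  # 1920
```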
Anonymous
August 23, 2004 2:06:10 AM


"Jeff Rife" <wevsr@nabs.net> wrote in message
news:MPG.1b92e75a23b37a8f9897d0@news.nabs.net...
> John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
> > 1440x1080 is NOT part of the ATSC spec. Don't expect that your TV
> > set (or STB) will be able to decode any form of 1440x1080.
>
> Actually, it can.
>
> There are two different resolution settings on the MPEG encoding, and it
> would be possible (although silly) to keep the 1920x1080 setting in one
> that would keep the STB happy, but only put 1440x1080 pixels in the
> "meaningful" part.
>
> Since many, many stations send 1920x1080 with 1920x1088 as the "meaningful"
> part (which is just a mistake on their part), I'm pretty sure that most
> STBs can handle this sort of setup.
>
(I am not meaning to be cranky; I am trying to make sense and be very
accurate while thinking about another issue that is totally disconnected
from this...)

Okay -- you are creating a technicality by using the 1920H scheme, but only
populating a portion of the screen. Even then, I doubt that the STB would
normally have a feature that would expand the center portion (or left or
right) of the screen to fully fill the screen, thereby correcting the
aspect ratio (unless it is meant as a generic zoom feature, or there would
be a semi-standard behavior by the TV stations that they would only
populate the center portion of the screen... Whether or not to expand the
image would be dependent upon the requirements for correct aspect ratio.)

However, it is still true that MPEG2 formatted at 1920H x 1080V (and
1280H x 720V) are currently the only HDTV formats guaranteed to be
decodable by the STBs. If you look at the MPEG2 syntax, that would still
show the signal to be of the 'standard' type as prescribed by ATSC.

In some cases, the microcode in the MPEG2 decoders might handle some
nonstandard formats, and the kind of 'nonstandard' MPEG2 streams would be
very dependent upon the logic and microcode in the design of the units. I
doubt that there is truly much flexibility in the decoders, but I would
expect that some common-sense extensions might be internally implemented
(even if the capability isn't externally realizable.) As an example, if a
family of encoders used by TV stations has a common flaw (or TV stations
have a common setup error for their encoding), then it is likely that the
decoders will be designed to work properly with the nonstandard syntax.
The consumer decoders aren't designed in a vacuum, and I suspect that the
development labs check their equipment while watching the streams from
actual TV stations. Of course, the real-world observations aren't the ONLY
tests, but such verification has been done since the days when AM radio
was in its infancy. 1440H x 1080V is likely TOO nonstandard to be an
externally supported standard -- unless there is a de facto deviant
standard. But of course, a TV station producing a theoretically 100%
compliant 1920H x 1080V stream could certainly populate only the center
1440H of a 1920H signal, and a feature could be provided by a consumer TV
manufacturer that would allow expanding that image (or leaving it the
same, if the aspect ratio would be correct.)

What will happen over time is that the 'mistakes' made in existing
real-world encoders will be accommodated by the decoders, and various
oddball encodings might be accommodated. It would be best to avoid any
assumption that anything other than the standard will be accommodated. It
is also very likely that accommodations for nonstandard scanning will
mostly support 'off by one or off by eight' errors, perhaps a few extra
or a few missing scanlines.
(Be VERY restrictive and compliant WRT the protocol that you transmit,
and be very accepting and flexible WRT the protocol that you can receive.)

The digital world is different from the old-time composite video schemes,
where 'approximately correct' scanning schemes had to be accommodated for
low-quality or CCTV systems. Nowadays, perfect scanning for HDTV or SDTV
doesn't require a rack of electronics. Nowadays, the idea of having to
accommodate non-interlaced (or 'random' interlaced) CCTV cameras only
applies because of legacy situations where the true scanning standards
used to be impossible to implement without a studio facility. (The birth
of digital semiconductor logic helped to support standard scanning in
consumer gear, even if the absolute accuracy or the time-base stability
in electro-mechanical systems might not fully adhere to specifications.)

(WRT the digital schemes: there are indeed some weird or unpleasant cases
of handling issues like 'interlace' or the transforms of signals whose
scanning structure isn't divisible by 8x8... These are examples of cases
where ugly handling of end conditions has to be well agreed upon. Over
time, starting with JPEG, which didn't handle interlace, hackery allowed
JPEG to support interlace at the cost of temporal issues or loss of
compression effectiveness... Eventually, schemes like DV25 handled the
problems of interlace while still maintaining effective compression.)

So, even with the new schemes, over time, there are needed improvements.
Being able to accommodate 'common sense' variants of the 'standard'
schemes can be useful, but the only standards that really need to be
supported are those that are actually used. :-) So, you'll be unlikely
to see a 1440Hx1080V decoder per se, at least unless an ad-hoc standard
is actually used by a video source.

The problem with supporting all consistent scanning structures by default
is that testing becomes more and more of a nightmare. It is much more
important to support the 'standard' and ubiquitous formats very well
instead of potentially wasting time supporting relatively unused modes.
Everyone is limited by funding, and so tends to focus on the testing and
design that is targeted towards the real-world userbase.

John
Anonymous
August 23, 2004 4:52:36 AM


John Dyson (dyson@iquest.net) wrote in alt.tv.tech.hdtv:
> Okay -- you are creating a technicality by using the 1920H scheme,
> but only populating a portion of the screen. Even then, I doubt that
> the STB would normally have a feature that would expand the center
> portion (or left or right) of the screen to fully fill the screen,
> thereby correcting the aspect ratio

Nope, it would work. HDTV has 1:1 aspect ratio (square) pixels but most
MPEG decoders in the STBs will be fully compliant, so you could just use
non-square pixels. It should feed the output correctly, since the output
is often scaled from the original source anyway.

So, the MPEG decoder would take in 1440x1080 and feed the output 1920x1080
with stretched pixels.

This sort of flexibility is required in the MPEG decoder because MPEG is
such a "loose" spec, with more than one way to skin each different type of
cat.

--
Jeff Rife | "I don't have to be Ray Liotta: movie star,
SPAM bait: | anymore. I can be Ray Liotta: Maya's boyfriend.
AskDOJ@usdoj.gov | All I want to do is regular, boring, ordinary
spam@ftc.gov | couple things."
| "Then you, sir, have hit the soul-mate lottery."
| -- Ray Liotta and Nina Van Horn, "Just Shoot Me"
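(Editorial aside: the stretch Jeff describes is plain horizontal resampling. A naive linear-interpolation sketch of one 1440-sample scanline widened to 1920; real decoders use better polyphase filters.)

```python
import numpy as np

# Widen one scanline from 1440 to 1920 samples by linear
# interpolation (real decoders use polyphase resampling filters).
def stretch_line(line: np.ndarray, out_width: int) -> np.ndarray:
    src = np.arange(len(line), dtype=float)
    dst = np.linspace(0, len(line) - 1, num=out_width)
    return np.interp(dst, src, line)

row = np.arange(1440, dtype=float)       # stand-in pixel values
wide = stretch_line(row, 1920)
print(len(wide), wide[0], wide[-1])      # 1920 0.0 1439.0
```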
Anonymous
August 23, 2004 9:50:10 AM


In article <MPG.1b93421d453aff409897d4@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> writes:
> John Dyson (dyson@iquest.net) wrote in alt.tv.tech.hdtv:
>> Okay -- you are creating a technicality by using the 1920H scheme,
>> but only populating a portion of the screen. Even then, I doubt that
>> the STB would normally have a feature that would expand the center
>> portion (or left or right) of the screen to fully fill the screen,
>> thereby correcting the aspect ratio
>
> Nope, it would work. HDTV has 1:1 aspect ratio (square) pixels but most
> MPEG decoders in the STBs will be fully compliant, so you could just use
> non-square pixels. It should feed the output correctly, since the output
> is often scaled from the original source anyway.
>
> So, the MPEG decoder would take in 1440x1080 and feed the output 1920x1080
> with stretched pixels.
>
How do you know that you should display the pixels in stretched mode,
unless you have added a flag that is RECOGNIZED by the decoder in that
specific mode? Hint: if the special mode isn't in the ATSC standard, then
it will likely NOT be tested for. The more general MPEG standard is specious
WRT the specific and limited criteria of the ATSC standard.
Along with the mode not being tested for, the necessary changes in external
state will not occur if it isn't a priori specified.

>
> This sort of flexibility is required in the MPEG decoder because MPEG is
> such a "loose" spec,
>
Note that MPEG is NOT the standard for OTA HDTV, but ATSC is. ATSC
refers to specific modes of MPEG that are supported. If you look at
the specification for real world HDTV decoders, you'll find that they
aren't necessarily specified to support MPEG per se, but to support
ATSC and other world HDTV standards, which then POINT TO MPEG as a
syntax. ATSC and other world HDTV standards specify the specific modes
of MPEG that it supports. ATSC and various other HDTV
decoders support only the modes that they are specified to support.
They do NOT generally support a general, wide range MPEG specification,
but support specific modes for MPEG (usually including European, Chinese,
American, Japanese choices of MPEG features.) In some cases, some
eccentric modes are supported, but those are exceptions to the rule
that the most fully tested behavior and the exposed features are
specified and specific.

Full implementation of all features in MPEG just doesn't happen in
real-world consumer (or even generally in pro) products. A research
device might be designed to be flexible, but that would be exceptional.

Similarly, when you look at the DVD spec (given generally more public
information today), you'll notice that it refers to specific modes
of MPEG. The same situation is valid for ATSC. DVD decoders aren't
specified to decode MPEG, but use the MPEG syntax with the modes that
the decoders are designed for.

MPEG is a lot like a Chinese menu, and the various specific standards
select the various items from various columns.

John
Anonymous
August 24, 2004 7:52:30 AM


Sending out 'oddball' resolutions (360x480, 480x480, etc.)
has been pretty standard in the satellite/TV world for
years now. The STB restores the proper aspect-ratio to
the 'non-square' pixels.

Since the MPEG-decoders are also used in satellite boxes,
isn't it safe to assume most silicon decoder vendors
would support a similar feature (1440x1080 -> 1920) in
their decoder?

It's already a de facto part of Microsoft's WMV9. The
'1080p' WM9-DVDs are actually encoded (picture bitmaps)
at 1440x1080, then stretched to a 16:9 aspect ratio
in the player platform. (Usually this is done by the
PC's VGA card.) Based on what a Microsoft rep said,
the 'display aspect ratio' and the 'picture coding
dimensions' are independently coded and transmitted
in the bitstream.

Though he didn't say what freedom (if any)
a content provider has in altering these parameters.
Anonymous
August 24, 2004 1:42:14 PM


In article <2YyWc.10659$ei1.1489@newssvr27.news.prodigy.com>,
hello <hello@goodbye.com> writes:
> Sending out 'oddball' resolutions (360x480, 480x480, etc.)
> has been pretty standard in the satellite/TV world for
> years now. The STB restores the proper aspect-ratio to
> the 'non-square' pixels.
>
The STBs are designed to handle the 'oddball' modes. ATSC
STBs don't really need to encounter the really oddball modes
(1440) due to upconversion to standard rates. If there is
a WM9 capable STB, then it will support the modes that it
needs to.

Think again about 'testing' in a real world production environment
(I mean integration and system test, not end-of-production line
testing.) Adding irregular modes does increase the complexity beyond
just the development issues.

>
> Since the MPEG-decoders are also used in satellite boxes,
> isn't it safe to assume most silicon decoder vendors
> would support a similar feature (1440x1080 -> 1920) in
> their decoder?
>
No, not unless the mode is specified by a standard (or a de facto
standard that is used in the application.) For example, just
because a DVD might use an oddball mode, that doesn't mean that
an ATSC receiver will support it. Most often, that 'oddball' mode
will be upconverted before the interface. There is certainly an
issue with interfacing between devices.

>
> It's already a defacto part of Microsoft's WMV9. The
> '1080p' WM9-DVDs are actually encoded (picture bitmaps)
> at 1440x1080, then stretched to a 16:9 aspect-ratio
> in the player-platform.
>
You'll still not see it in an ATSC STB, unless it becomes
ubiquitous.

I cannot really say 'any further' due to agreements. (Hint: I just
had an employment change.)

John
Anonymous
August 24, 2004 5:47:37 PM

Archived from groups: alt.tv.tech.hdtv (More info?)

John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
> How do you know that you should display the pixels in stretched mode,
> unless you have added a flag that is RECOGNIZED by the decoder in that
> specific mode?

Because the aspect ratio of the pixels is encoded into every MPEG stream.
Even the ATSC streams say "the pixels are 1:1" (well, most of the time...
there are modes where the pixels *aren't* 1:1).
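For the curious, the flag being described here is the 4-bit aspect_ratio_information field in the MPEG-2 sequence header. A simplified sketch of pulling it (plus the coded dimensions) out of the four bytes that follow the 0x000001B3 sequence-header start code; an illustration of the bit layout, not a full parser:

```python
# Field layout per the MPEG-2 video spec (ISO/IEC 13818-2):
# aspect_ratio_information: 1 = square samples, 2 = 4:3 display,
#                           3 = 16:9 display,   4 = 2.21:1 display
def parse_sequence_header(b):
    """b: the 4 bytes immediately after the 0x000001B3 start code."""
    horizontal_size = (b[0] << 4) | (b[1] >> 4)      # 12 bits
    vertical_size = ((b[1] & 0x0F) << 8) | b[2]      # 12 bits
    aspect_ratio_information = b[3] >> 4             # 4 bits
    frame_rate_code = b[3] & 0x0F                    # 4 bits (4 = 29.97)
    return (horizontal_size, vertical_size,
            aspect_ratio_information, frame_rate_code)

# A 1920x1080, 16:9, 29.97 fps sequence header carries these bytes:
w, h, ar, fr = parse_sequence_header(bytes([0x78, 0x04, 0x38, 0x34]))
```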

> Note that MPEG is NOT the standard for OTA HDTV, but ATSC is.

That really doesn't matter, though, because after you strip the ATSC info,
you end up with one or more 100% MPEG2-compliant streams in the data.
Most of the "ATSC" spec is telling a receiver how to decide which streams
of video and audio to feed to the MPEG decoder (along with guide data,
etc., that is just tossed away by the time the MPEG decoder gets it).

> Full implementation of all features in MPEG just doesn't happen in
> real-world consumer (or even, generally, pro) products.

Actually, it does. You *have* to have full MPEG decoding capabilities
because you just don't know exactly what is going to happen. Again, the
fact that every single ATSC receiver gracefully handles 1920x1088 even
though it isn't in the ATSC spec should tell you something.

--
Jeff Rife | "What's goin' on down here?"
SPAM bait: | "Oh, we're playing house."
AskDOJ@usdoj.gov | "But, that boy is all tied up."
spam@ftc.gov | "...Roman Polanski's house."
| -- Lois and Stewie Griffin, "Family Guy"
Anonymous
August 24, 2004 5:50:08 PM

Archived from groups: alt.tv.tech.hdtv (More info?)

John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
> The STBs are designed to handle the 'oddball' modes.

No, the MPEG decoders in them are designed to handle those "oddball modes",
along with any other legal MPEG2 stream.

--
Jeff Rife | "This? This is ice. This is what happens to
SPAM bait: | water when it gets too cold. This? This is
AskDOJ@usdoj.gov | Kent. This is what happens to people when
spam@ftc.gov | they get too sexually frustrated."
| -- Chris Knight, "Real Genius"
Anonymous
August 24, 2004 9:40:13 PM

Archived from groups: alt.tv.tech.hdtv (More info?)

John S. Dyson wrote:

> [snip]
>
> You'll still not see it in an ATSC STB, unless it becomes
> ubiquitous.
>
> I cannot really say 'any further' due to agreements. (Hint: I just
> had an employment change.)
>
> John
>
The lowest-priced 8-VSB receiver today (the Hisense at WalMart) can
handle WM9. With business plans such as USDTV becoming the norm, all
receivers being designed today must consider including it. Anyone who
buys a receiver that is not 5th-gen and cannot handle WM9 will be
sorry soon.
Anonymous
August 25, 2004 2:08:06 AM

Archived from groups: alt.tv.tech.hdtv (More info?)

In article <14LWc.32316$9Y6.15415@newsread1.news.pas.earthlink.net>,
Bob Miller <robmx@earthlink.net> writes:
> John S. Dyson wrote:
>> [snip]
>
> The lowest price 8-VSB receiver today (Hisense at WalMart) can handle
> WM9. With business plans such as USDTV becoming the norm all receivers
> being designed today must consider including such. Anyone who buys a
> receiver that is not 5th gen and can handle WM9 will be sorry soon.
>
The standards that are supported will be those implemented -- you won't
find many units that support much more than the specification.

John
Anonymous
August 25, 2004 2:09:01 AM

Archived from groups: alt.tv.tech.hdtv (More info?)

In article <MPG.1b9549dc79a665f99897d9@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> writes:
> John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
>> The STBs are designed to handle the 'oddball' modes.
>
> No, the MPEG decoders in them are designed to handle those "oddball modes",
> along with any other legal MPEG2 stream.
>
The MPEG decoders that I am speaking of are part of the STB. (Hint, I know
more about future product than I am admitting to.) Not all decoders support
all of the oddball modes.

John
Anonymous
August 25, 2004 2:09:02 AM

Archived from groups: alt.tv.tech.hdtv (More info?)

John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
> The MPEG decoders that I am speaking of are part of the STB. (Hint, I know
> more about future product than I am admitting to.) Not all decoders support
> all of the oddball modes.

Then, they will be more expensive, because a general-purpose MPEG decoder
is cheaper in quantity than one that is less general but still has to
handle the toughest of MP@HL.

--
Jeff Rife |
SPAM bait: | http://www.nabs.net/Cartoons/ArloNJanis/CircularSaw.gif
AskDOJ@usdoj.gov |
spam@ftc.gov |
Anonymous
August 25, 2004 2:20:06 AM

Archived from groups: alt.tv.tech.hdtv (More info?)

In article <MPG.1b9549444bf8b4019897d8@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> writes:
> John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
>> How do you know that you should display the pixels in stretched mode,
>> unless you have added a flag that is RECOGNIZED by the decoder in that
>> specific mode?
>
> Because the aspect ratio of the pixels is encoded into every MPEG stream.
> Even the ATSC streams say "the pixels are 1:1" (well, most of the time...
> there are modes where the pixels *aren't* 1:1).
>
If the high-level protocol doesn't support stretching, then there is
NO guarantee that it will be supported. Note that MPEG is mostly just a syntax,
while ATSC is the standard (WRT STBs.) The ATSC spec might refer to MPEG,
but might not support 4:2:2 or 4:4:4, or spatial, SNR, or temporal
scalability, or NUMEROUS other features in the MPEG2 spec. Perhaps you
haven't seen the MPEG2 spec and all of its complexity? Since only about
10% of the complexity is needed for ATSC, it would be insane to
support it all.
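To give a feel for how narrow the ATSC selection is, here is a condensed rendering of the familiar ATSC A/53 compression-format table, written from memory (treat the details as approximate and check them against A/53 itself):

```python
# ATSC A/53 video formats, condensed: (width, height) -> allowed
# display aspect ratios and scan/rate families ("p24" covers
# 23.976/24, "p30" covers 29.97/30, "p60"/"i30" likewise).
ATSC_FORMATS = {
    (1920, 1080): {"aspect": ["16:9"], "scan": ["p24", "p30", "i30"]},
    (1280, 720):  {"aspect": ["16:9"], "scan": ["p24", "p30", "p60"]},
    (704, 480):   {"aspect": ["4:3", "16:9"],
                   "scan": ["p24", "p30", "p60", "i30"]},
    (640, 480):   {"aspect": ["4:3"],
                   "scan": ["p24", "p30", "p60", "i30"]},
}

def is_atsc_format(width, height):
    return (width, height) in ATSC_FORMATS

is_atsc_format(1440, 1080)  # False -- which is exactly the point above
```

Everything MPEG2 allows outside this short list is a column of the menu that an ATSC-only decoder never has to order from.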

>
>> Note that MPEG is NOT the standard for OTA HDTV, but ATSC is.
>
> That really doesn't matter, though, because after you strip the ATSC info,
> you end up with one or more 100% MPEG2-compliant streams in the data.
>
Again -- you don't have ANY guarantee that the componentry that you are
using will support more than the governing standard. MPEG2 is the standard
that defines a part of the syntax. MPEG2 is a lot like a Chinese menu. You
don't usually pick all items from all rows, and the specific selection
is specified by ATSC (and other specs.)

>
> Most of the "ATSC" spec is telling a receiver how to decide which streams
> of video and audio to feed to the MPEG decoder (along with guide data,
> etc., that is just tossed away by the time the MPEG decoder gets it).
>
The ATSC spec tells which 'Chinese menu' items need to be supported
by the MPEG decoder (and other things like that.)


>
>> Full implementation of all features in MPEG just doesn't happen in
>> real-world consumer (or even, generally, pro) products.
>
> Actually, it does. You *have* to have full MPEG decoding capabilities
> because you just don't know exactly what is going to happen.
>
Wrong -- please refer to MPEG2 and the various options. Not all of those
options will be supported in every MPEG2 decoder chip. Not all scanning
modes will be supported in every MPEG2 decoder chip. Refer to the Chinese
menu concept.

>
> Again, the
> fact that every single ATSC receiver gracefully handles 1920x1088 even
> though it isn't in the ATSC spec should tell you something.
>
1088 is a special case (per a list of supported modes in a
real-world product -- two years from now.) 1084 is not a supported mode.
1024 MIGHT be a supported mode because of support for certain VESA
scanning structures (I don't have the spec sitting in front of me right
now.)

Your example is specious WRT the concept of generally supporting all
modes. Again, 1088 is definitely a special case. I have certainly
left open the possibility of supporting special cases.

John
Anonymous
August 25, 2004 2:20:07 AM

Archived from groups: alt.tv.tech.hdtv (More info?)

John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
> Your example is specious WRT the concept of generally supporting all
> modes. Again, 1088 is definitely a special case.

Based on what chips are in the various STBs *now* and what the PCI HDTV
cards can decode (and what chips *they* have), I'm fairly confident that
many more "special cases" are handled far more often than you think.

And, 1088 isn't special at all. It's just the fact that the MPEG spec
requires vertical encoded pixel count to be a multiple of 16, while the
actual "needed" pixel count can be any number.

As long as you stick with color depths that are ATSC standard, I'm willing
to bet that most STBs can handle any MPEG stream of 1920x1080 or less,
and would resize it for output to any output mode the box handles. One
reason I'm sure of this is because most ATSC STBs sold are also satellite
receivers, and use the same MPEG decoder, and satellite didn't want to lock
in too much.
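The multiple-of-16 point above in code form; the 16 comes from MPEG2's 16x16 macroblocks, so this is just a round-up:

```python
def coded_height(display_lines, macroblock=16):
    """Round a display dimension up to the next macroblock multiple,
    as an MPEG-2 encoder must do for the coded picture size."""
    return -(-display_lines // macroblock) * macroblock  # ceiling divide

coded_height(1080)   # 1088 -- the familiar 1920x1088 coded frame
coded_height(720)    # 720  -- already a multiple of 16
coded_height(480)    # 480
```

(For interlaced frame pictures the vertical size must, as I recall, be a multiple of 32 rather than 16; 1088 happens to satisfy both, which is another reason it shows up everywhere.)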

--
Jeff Rife | "Wheel of morality,
SPAM bait: | Turn, turn, turn.
AskDOJ@usdoj.gov | Tell us the lesson
spam@ftc.gov | That we should learn"
| -- Yakko, "Animaniacs"
Anonymous
August 25, 2004 6:44:40 AM

Archived from groups: alt.tv.tech.hdtv (More info?)

In article <MPG.1b95b6db6a929c0c9897de@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> writes:
> John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
>> The MPEG decoders that I am speaking of are part of the STB. (Hint, I know
>> more about future product than I am admitting to.) Not all decoders support
>> all of the oddball modes.
>
> Then, they will be more expensive, because a general-purpose MPEG decoder
> is cheaper in quantity than one that is less general but still has to
> handle the toughest of MP@HL.
>
But, the extent of the various modes that are supported is often
limited to what is needed. However, you are getting closer to my
position by specifying a profile instead of the entire MPEG spec.
Next, it is important to review the modes that are really necessary
to support, and then evaluate the cost of supporting the modes that
aren't necessary. The cost isn't just the cost of silicon. The
APIs for the chips (as exposed to the OEM) are often not even registers
but are actual CPU APIs. So, in SOME cases, there might even be
hw features that aren't 'supported' or effectively not implemented
because of the lack of testing.

When you look at REAL world specs for the end-product, you'll notice
that they require the support of a SUBSET of a given MPEG profile.
It is possible that the entire MPEG profile is supported by a given
chip, but even at the MPEG encode/decode level, there isn't necessarily
a discrete chip anymore. More and more things will be mangled together
as time progresses.

I find that this discussion isn't productive, other than noting that the
entire MPEG spec isn't usually implemented in hardware. Even an entire
profile (of the MP@?L ilk) isn't necessarily implemented in a given piece
of hardware.

There might be a day where it is more common to support an entire full
featured profile, but an ATSC oriented MPEG2 decoder isn't necessarily
going to support the nooks and crannies for oddball CD player schemes.
(It might, and it might not -- but the feature wouldn't necessarily be
exposed, even to the OEM. The feature might/might not exist in the HW,
but is of little consequence.)

John
Anonymous
August 25, 2004 7:04:21 AM

Archived from groups: alt.tv.tech.hdtv (More info?)

In article <MPG.1b95b91d1406abad9897df@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> writes:
> John S. Dyson (toor@iquest.net) wrote in alt.tv.tech.hdtv:
>> Your example is specious WRT the concept of generally supporting all
>> modes. Again, 1088 is definitely a special case.
>
> Based on what chips are in the various STBs *now* and what the PCI HDTV
> cards can decode (and what chips *they* have), I'm fairly confident that
> many more "special cases" are handled far more often than you think.
>
I happen to be privy to such info. It would be wrong to divulge it in
any detail. Rather than being 'confident' of the info, I actually see
the real-world (but non-public) specs. It (1088) is implemented as a
mode as a sibling of 1080 (but for slightly different circumstances.)

>
> And, 1088 isn't special at all.
>
It could be outside of the ATSC spec per se. There are numerous special cases
that are handled, including the VESA modes. However, not all of the
variations in between are. So, in the sense of it being implemented in
a specific piece of hardware that also supports ATSC (not really necessary
for US HDTV), it is definitely a special add on (potentially akin to VESA
equivalent modes.)

You were implying that just because 1088 is supported,
all of the values in between are too. In fact, 1088 is indeed implemented
as a special case in one design that I know of.

So even 1008 isn't supported, even though 1024 might be.

Hint: there is a lot more to the MPEG2 decoder stuff than just the MPEG2
decoder itself. Even if a decoder might support a given mode, that
doesn't mean that the mode makes any sense in the context of the rest of
the system. For example, think about a highly integrated product that handles
various 'trick modes' and 'text modes' in between various SDTV/HDTV signals.
Fully general capability isn't very easy, and assuming certain structures (e.g.
interpolation filters, etc.) allows for more reasonable feature sets.

Again, your implication of infinite (or even substantial) flexibility of
the MPEG2 decoders in consumer gear is generally wrong. There are some
devices that are designed and tested to be fully general purpose, but
consumer equipment (even high end stuff) isn't necessarily so. The
hardware mix and feature sets don't really easily support that. If
all you are doing is MPEG2 decoding, then our discussion ALLOWS your
suggestion about the MPEG2 decoder supporting everything, but even then
it isn't always true.

I am not interested in betting, but all I know are the specs of the chips
(for example) and the exposed APIs. (I have to be very careful not to
expose any proprietary info, but everything that I have described is exactly
true, without embellishment.)

Again, there might be some chips whose design is for general purpose
applications, but that isn't the same thing as the highly integrated STB
designs. It is quite naive to believe that high volume consumer equipment
manufacturers don't use the most cost reduced designs with the most
integration possible for their applications. This often means the reduction
of non-used features.

Even when a scanning structure (HxV) looks like 'odd' numbers, those
numbers (and the capability to support them) are actually chosen
a priori, and it is serendipity when something can be changed ad hoc
after the fact.

John