Should I use 720p or 1080i?

Archived from groups: alt.tv.tech.hdtv

I am going to be picking up my HD TiVo today and am curious whether it's
best to use 720p or 1080i for HDTVs. 1080 sounds far better, but then
again, is 720 more like 1440, or is simply having progressive better than
interlaced? Or is it simply a matter of opinion? I figured I'd stick
with one in my menu since I'll be using 480i on another TV, and I just
wonder which to go with. I'm not sure if it depends on what you're
watching or on the TV you have, but as for the TVs, I'll be using this
with two TVs:
Sony KF-42WE610
Sony KF-50XBR800
  1.

    Why are you asking us? Are your eyes "broken"?
    Believe it or not, your opinion counts more than anyone's here.

    "MarkW" <markwco(removenospam)@comcast.net> wrote in message
    news:6oli90d6taheg95i05pu5fvlmqbvbocm47@4ax.com...
    > I am going to be picking up my HD TIVO today and am curious if it's
    > best to use 720p or 1080 for HDTVs? 1080 sounds far better but then
    > again is 720 more like 1440 or is simply have progressive better than
    > interlaced? Or is it simply a matter of opinion? I figured I'd stick
    > with one in my menu since I'll be using 480i on another TV and just
    > wonder which to go with. I'm not sure if it depends on what you're
    > watching or the TV you have but as for the TV's, I'll be using this
    > with two TV's
    > Sony KF-42WE610
    > Sony KF-50XBR800
  2.


    1080i is not better quality than 720p. There is a 2x factor between
    "i" and "p" - that is, interlaced and progressive (non-interlaced).
    Interlaced only shows every other line on each pass, alternating
    between the even and odd lines: higher resolution, but only half the
    picture at any given moment. Progressive shows all the lines on every
    pass: lower resolution, but the whole picture all the time. Thus 1080i
    uses the same bandwidth as 540p, just presenting the information in a
    different way. So in "i" mode, 720p would be 1440i. In order of
    increasing quality, here is the list of the most common modes (with a
    quick pixel-rate sketch after the list):

    480i (aka standard non-hd)
    480p
    1080i
    720p
    1080p
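
    A quick back-of-the-envelope on the raw (uncompressed) pixel rates for
    those modes, sketched in Python. The horizontal counts are nominal,
    an "i" mode carries 60 fields but only 30 full frames per second, and
    the raw rate alone doesn't settle the quality argument:

        modes = {
            "480i":  (704, 480, 30),    # 60 fields/s = 30 full frames/s
            "480p":  (704, 480, 60),
            "1080i": (1920, 1080, 30),  # 60 fields/s = 30 full frames/s
            "720p":  (1280, 720, 60),
            "1080p": (1920, 1080, 60),
        }
        for name, (w, h, frames) in modes.items():
            print(f"{name}: {w * h * frames / 1e6:.0f} Mpixels/s")
        # 480i ~10, 480p ~20, 1080i ~62, 720p ~55, 1080p ~124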

    It looks like Samsung will have a 1080p rear projection DLP by Christmas.
  3.

    George Thorogood (thorogood@mailinator.com) wrote in alt.tv.tech.hdtv:
    > Thus 1080i uses the same bandwidth
    > as 540p.

    This is misleading. 1080i uses more bandwidth than 720p.

    > Just presenting the information in a different way. So in "i"
    > mode, 720p would be 1440i.

    Not even close. 1280x1440/60i would require twice the scan lines, which
    would increase the effective, Kell-factored resolution to about the same as
    1280x1080/30p.

    The effective resolution for 1920x1080/60i is about the same as 1920x810/30p.

    Also, not all 720p is the same. It can be 1280x720/24p, 1280x720/30p or
    1280x720/60p. ABC and ESPN use the last one, which is mostly a waste of
    time for showing movies, since the extra frame rate is basically wasted with
    repeated frames. 1920x1080/60i does a better job with movie content because
    you have more pixels and the amount of temporal information is low enough
    that it is easily handled.

    A 1920x1080/60p display using de-interlacing will look a lot better on a
    source that started as a movie when given a 1920x1080/60i transmission
    instead of a 1280x720/60p transmission.
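
    As a sanity check on those figures, the arithmetic is just a scan-line
    count times an interlace factor. The ~0.75 factor below is the
    assumption baked into the numbers above; real-world Kell factors vary:

        INTERLACE_FACTOR = 0.75  # assumed, per the figures above

        def effective_lines(scan_lines, interlaced=True):
            return scan_lines * INTERLACE_FACTOR if interlaced else scan_lines

        print(effective_lines(1080))  # ~810:  1920x1080/60i ~ 1920x810/30p
        print(effective_lines(1440))  # ~1080: 1280x1440/60i ~ 1280x1080/30p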

    --
    Jeff Rife | "I feel the need...the need for
    SPAM bait: | expeditious velocity"
    AskDOJ@usdoj.gov | -- Brain
    uce@ftc.gov |
  4.


    It depends on the native format of your TV. Very few displays will do both.
    If your TV does 1080i, then set the TIVO to that. If it does 720p, then set
    the TIVO to that.


    John.
  5.

    In article <jeqi901sh5sjvoj261ontdqo63g7cqgapr@4ax.com>,
    George Thorogood <thorogood@mailinator.com> writes:
    > 1080i is not better quality than 720p. There is a 2x factor between
    > "i" and "p" - that is, interlaced and progressive (non-interlaced).
    >
    No... You see the stationary detail in both phases of the interlace for
    1080i. Both interlaced and progressive are sampled, so both bear the
    overhead of the sampling.

    >
    > Interlaced
    > only shows every other line on each pass.
    >
    The items of detail don't change on every pass.

    In essence, interlace is a tradeoff that trades-away temporal resolution
    so as to provide better spatial resolution for a given frame scanning
    structure.

    When comparing 1080i vs. 720p, it is also important to NOT forget the
    1280H pixels vs. the 1920H pixels. It is true that all of the detail
    isn't always recorded/transmitted, but that is also true of all kinds
    of detail given the MPEG2 encoding scheme.

    It would be best (fairest) to claim that 720p and 1080i have essentially
    the same vertical resolution for typical entertainment (non sports)
    video material. On the other hand, 1080i has higher horizontal resolution
    than 720p (in practice and in theory both.)

    Given the above, except for sports, 1080i is GENERALLY the best format
    (especially for filmed material). This is most true for film, where
    information can be reconstructed very nicely to give ALMOST 1080p-type
    (though not full) performance. 720p just doesn't do quite as well.

    John
  6.

    In article <c7c2i1$u6i$8@news.iquest.net>, toor@iquest.net says...

    > No... You see the stationary detail in both phases of the interlace for
    > 1080i.

    And you get "judder" or "jitter."

    > In essence, interlace is a tradeoff that trades-away temporal resolution
    > so as to provide better spatial resolution for a given frame scanning
    > structure.

    Interlace is an obsolete, 70-year-old "technology" used to thin out the
    information stream to make it fit through a limited-bandwidth pipeline.
    Modern compression technology has eliminated the need for interlace.

    > When comparing 1080i vs. 720p, it is also important to NOT forget the
    > 1280H pixels vs. the 1920H pixels.

    You keep repeating this falsehood for some reason, and I'll keep
    repeating the correction. As it exists now, 1080i is 1440 by 1080
    pixels. 1080i has the edge in horizontal resolution, but not by much.
    720p has the edge in color resolution, in vertical resolution, and in
    temporal resolution.

    > Given the above, except for sports, 1080i is GENERALLY the best format
    > (especially for filmed material). This is most true for film, where
    > information can be reconstructed very nicely to give ALMOST 1080p-type
    > (though not full) performance.

    You've obviously never had to reconstruct an image shot interlaced in-
    camera. De-interlacers don't work well at all. The difficulty involved
    with trying to undo interlacing is one of the reasons 480i looks really
    bad on the new HDTVs when an attempt is made to convert it to
    progressive. It's also one of the reasons the production industry is
    converting to 24 frames-per-second progressive.
  7.

    In article <jeqi901sh5sjvoj261ontdqo63g7cqgapr@4ax.com>,
    thorogood@mailinator.com says...

    > In order of increasing quality, here is the
    > list of the most common modes:
    >
    > 480i (aka standard non-hd)
    > 480p
    > 1080i
    > 720p
    > 1080p

    I like your list. In addition to considerations involving picture
    quality, bandwidth considerations also enter into the equation.
    Of all the formats, 1080i is the biggest bandwidth hog. 1080/24p and
    1080/30p are much more efficient in terms of bandwidth requirements. As
    you know, a progressive video stream is easier to compress than an
    interlaced stream.
  8.

    Mark, the first response was the best suggestion. Try both on both TVs
    and see which works best for you. I hooked up 4 720p digital TVs, and
    my source quality was much, much better set at 1080i. All the theories,
    explanations, and arguments you hear will never provide an answer, but
    your own eyes should. Use the same criteria as for whether you prefer
    blondes or redheads, or whether you should use a DVI connection or
    component.


    --
    toto


  9.

    In article <MPG.1b0457ac71a6ecc2989702@news.gwtc.net>,
    Steve Rimberg <rsteve@world1.net> writes:
    > In article <c7c2i1$u6i$8@news.iquest.net>, toor@iquest.net says...
    >
    >> No... You see the stationary detail in both phases of the interlace for
    >> 1080i.
    >
    > And you get "judder" or "jitter."
    >
    Not really -- dynamic filters are cool (effectively, there is already
    dynamic filtering going on in MPEG2 encoding.) When 'flicker' starts
    to become possible, you increase the amount of interlace filtering.
    Since the detail is aliased anyway, removing the motion artifacts is
    actually almost information neutral.

    >
    >> In essence, interlace is a tradeoff that trades-away temporal resolution
    >> so as to provide better spatial resolution for a given frame scanning
    >> structure.
    >
    > Interlace is an obsolete, 70-year-old "technology" used to thin out the
    > information stream to make it fit through a limited-bandwidth pipeline.
    >
    Interlace exists, and it works. It might be suboptimal, but much less
    suboptimal than 720p60 on filmed material where 1080i30 certainly can/does
    look better. In the future, a little de-interlacing will help those
    old film/tape archives continue to be valuable.

    >
    > Modern compression technology has eliminated the need for interlace.
    >
    So what? If you want the real 'ideal', then the filmed stuff should
    be broadcast at 1080p24. 720p60 is silly on most material, considering
    that it is filmed (or film-look.)


    >> When comparing 1080i vs. 720p, it is also important to NOT forget the
    >> 1280H pixels vs. the 1920H pixels.
    >
    > You keep repeating this falsehood for some reason, and I'll keep
    > repeating the correction. As it exists now, 1080i is 1440 by 1080
    > pixels.
    >
    You keep on forgetting the necessary rolloff (which isn't as necessary
    when the sampling structure is 1920H instead of the very, very limited
    1280H.) Either you don't realize, or are purposefully forgetting, that
    the rolloff for 1280H usually has to start at about effectively 800 or
    fewer TVL to avoid uglifying. Even if all of the 1920H doesn't exist,
    the needed rolloff for avoiding sampling effects would typically start
    being significant at the 1000-1100 TVL level.

    >
    > 1080i has the edge in horizontal resolution, but not by much.
    >
    > 720p has the edge in color resolution, in vertical resolution, and in
    > temporal resolution.
    >
    If you look at real numbers, you'll find that for 'resolution' figures,
    even in the best case, 720p is a toss-up. Temporal resolution is essentially
    the ONLY advantage of 720p60. Also, if you have really ever seen 720p,
    the effect of the early rolloff for the 1280H anti-aliasing makes it
    look like a 'fantastic 480p' instead of giving that 'window' 1080i or
    1080p effect.

    >
    >> Given the above, except for sports, 1080i is GENERALLY the best format
    >> (especially for filmed material). This is most true for film, where
    >> information can be reconstructed very nicely to give ALMOST 1080p-type
    >> (though not full) performance.
    >
    > You've obviously never had to reconstruct an image shot interlaced in-
    > camera. De-interlacers don't work well at all.
    >
    Plaeeze!!! 24fps material (be it 1080p24 or film) is easy to reconstruct
    when it is played out on 1080i30 formats. The problem with DVDs is that
    the proper flags aren't always used.

    If you start with 60fps material, then 720p will work better for motion.
    If you start with 24fps material (most scripted stuff), 1080i or 1080p
    (depending upon display) is a better match. 720p is silly for 24fps
    material, with no advantages.

    If you start with 'video look', then it is best to match the recording
    standards, where 1080i is probably the best all around except for motion,
    and does give the best video look. For sports or scientific work (where
    freeze frame is a primarily used feature), then 720p can be useful.

    One more note: for broadcast HDTV, 1080i is a tight fit. High motion
    and 1080i is a disaster sometimes, but not because of 1080i encoding
    itself on the ATSC channel. It is because of the all-too-common
    subchannels that force the 1080i to fit in 15 Mbps.... That is very
    marginal for 1080i. This gives a double bonus for sports on 720p60,
    where it tends to fit into the ATSC channel, even with a subchannel
    or two.

    John
  10.

    In article <MPG.1b04599fca3ca7b2989703@news.gwtc.net>,
    Ron Malvern <rmlvrn@nospam.com> writes:
    > In article <jeqi901sh5sjvoj261ontdqo63g7cqgapr@4ax.com>,
    > thorogood@mailinator.com says...
    >
    >> In order of increasing quality, here is the
    >> list of the most common modes:
    >>
    >> 480i (aka standard non-hd)
    >> 480p
    >> 1080i
    >> 720p
    >> 1080p
    >
    > I like your list. In addition to considerations involving picture
    > quality, bandwidth considerations also enter into the equation.
    > Of all the formats, 1080i is the biggest bandwidth hog. 1080/24p and
    > 1080/30p are much more efficient in terms of bandwidth requirements.
    >
    It bothers me that the TV stations/networks don't use 1080p24 more often
    for their playout. However, I suspect that the mode-change issues
    caused them to decide to use 1080i30.

    Using 1080p24 would keep the 720p60 zealots from worrying about the
    temporal resolution that JUST DOESN'T exist in most scripted material.
    Using 1080p24 would also allow for easier (superficially higher quality)
    implementation of a flicker-free 1080p72 display. And, the limited
    resolution issues of 720p60 would be unimportant.

    John
  11.

    In article <c7eeo5$1kae$2@news.iquest.net>, toor@iquest.net says...

    > It bothers me that the TV stations/networks don't use 1080p24 more often
    > for their playout.

    It bothers me as well. 1080/24p is the best of the HDTV formats now
    available, not only because of its resolution advantages, but also
    because it has lesser bandwidth requirements. It is also more elegantly
    manipulable for special effects during the post-production process. And
    finally, it is more easily converted to the other formats in use around
    the world (Europe's proposed 1080/25p and 1080/50i, along with existing
    PAL 50i).
    Some cable networks are considering 1080/24p as a transmission format:
    A&E is one such network, for its History Channel and other offerings. I
    would be immensely grateful if they take that step forward. I've been
    encouraging them to do so. They are one of our best customers.

    > However, I suspect that the mode-change issues had
    > caused them to decide to use 1080i30.

    Most broadcasters ended up using 1080i because their most trusted
    supplier, Sony, wouldn't sell them anything else. Sony was grossly
    behind the technology curve when these decisions were made, but the
    company can be quite stubborn about these things. Sony apparently had
    an R&D investment in 1080i that it had to amortize.

    > Using 1080p24 would keep the 720p60 zealots from worrying about the
    > temporal resolution that JUST DOESN'T exist in most scripted material.

    It does if the camera original was shot at 60p. But you are right when
    you say there is no advantage when the camera original is 24p film.

    > Using 1080p24 would also allow for easier (superficially higher quality)
    > implementation of a flicker-free 1080p72 display.

    Yes it would, and I'm looking forward to some experimentation along that
    line with displays. I haven't seen any yet. But from my own experience
    with computers, 72 frames per second seems to be a display rate that
    satisfies a greater number of viewers than does 60.
  12.

    Ron Malvern (rmlvrn@nospam.com) wrote in alt.tv.tech.hdtv:
    > In article <c7eeo5$1kae$2@news.iquest.net>, toor@iquest.net says...
    >
    > > It bothers me that the TV stations/networks don't use 1080p24 more often
    > > for their playout.
    >
    > It bothers me as well. 1080/24p is the best of the HDTV formats now
    > available, not only because of its resolution advantages, but also
    > because it has lesser bandwidth requirements.

    Well, for film-source material anyway. I don't think anybody would recommend
    anything less than 30p for sports or other live action, and then you get
    into the 1080/60i vs. 720/60p issues, where 60p still only helps on a few
    select shots, while the extra pixels on 1920x1080 tend to help all the time.

    --
    Jeff Rife |
    SPAM bait: | http://www.nabs.net/Cartoons/Dilbert/TokenRing.gif
    AskDOJ@usdoj.gov |
    uce@ftc.gov |
  13.

    "Steve Rimberg" <rsteve@world1.net> wrote in message
    news:MPG.1b0457ac71a6ecc2989702@news.gwtc.net...
    > You keep repeating this falsehood for some reason, and I'll keep
    > repeating the correction. As it exists now, 1080i is 1440 by 1080
    > pixels. 1080i has the edge in horizontal resolution, but not by much.
    > 720p has the edge in color resolution, in vertical resolution, and in
    > temporal resolution.

    Why do you say this? The 16:9 standards are of course 1920x1080 and
    1280x720. Are you suggesting that most current HDTV screens cannot, in
    practice, achieve more than 1440 horizontal pixels of resolution? Even if
    true: (a) technology will presumably improve this over time; and (b) the
    full 1920-pixel horizontal resolution is still useful when zooming.
  14.

    "John S. Dyson" <toor@iquest.net> wrote in message
    news:c7eeo5$1kae$2@news.iquest.net...
    >
    > It bothers me that the TV stations/networks don't use 1080p24 more often
    > for their playout. However, I suspect that the mode-change issues
    > caused them to decide to use 1080i30.
    >
    > Using 1080p24 would keep the 720p60 zealots from worrying about the
    > temporal resolution that JUST DOESN'T exist in most scripted material.
    > Using 1080p24 would also allow for easier (superficially higher quality)
    > implementation of a flicker-free 1080p72 display. And, the limited
    > resolution issues of 720p60 would be unimportant.

    Totally agreed. 1080p24 would look fantastic for movies, providing the
    best possible quality and efficiency of any HD standard, without having
    to worry about end-user de-interlacing (which in practice may be fairly
    difficult to do correctly even if the original source is film, unless
    HDTV handles this considerably better than DVD does). The idea of a
    72Hz monitor is also hugely appealing. Maybe after more 1080p monitors
    become available, this will be considered more seriously.
  15.

    "Jeff Rife" <wevsr@nabs.net> wrote in message
    news:MPG.1b04cb1493300d3298b3fd@news.nabs.net...
    > Ron Malvern (rmlvrn@nospam.com) wrote in alt.tv.tech.hdtv:
    > > In article <c7eeo5$1kae$2@news.iquest.net>, toor@iquest.net says...
    > >
    > > > It bothers me that the TV stations/networks don't use 1080p24 more
    > > > often for their playout.
    > >
    > > It bothers me as well. 1080/24p is the best of the HDTV formats now
    > > available, not only because of its resolution advantages, but also
    > > because it has lesser bandwidth requirements.
    >
    > Well, for film-source material anyway. I don't think anybody would
    > recommend anything less than 30p for sports or other live action, and
    > then you get into the 1080/60i vs. 720/60p issues, where 60p still only
    > helps on a few select shots, while the extra pixels on 1920x1080 tend
    > to help all the time.

    Agreed. In fact, I wouldn't want anything less than the 48-60fps range
    (whether progressive or interlaced) for sports or other dynamic motion;
    even 30fps is on the low end of framerates. While interlace has its
    problems, there is substantial benefit to having the image at least
    half-change every 1/60th of a second, which keeps it from looking too
    choppy.
  16.

    "toto" <toto.15uyby@news.satelliteguys.us> wrote in message
    news:toto.15uyby@news.satelliteguys.us...
    > I hooked up 4 720p digital TVs, and
    > my source quality was much, much better set at 1080i.

    There is one reason that 1080i may look better on even a 480p set (or
    720p) than the "native" 480p signal on those sets, and that is color
    resolution. Since HDTV signals encode color at half the resolution of
    brightness (luma) in each direction, if you display a 1080 signal on a
    lower-resolution screen, there will be an improvement in color
    resolution compared to a lower-resolution signal on the same monitor.
    The luma will be resolution-limited by the display device, but the
    chroma can be displayed at nearly its full resolution on even a
    480-line display.

    Usually scaling a high-res signal down isn't too much of a problem (with
    appropriate filtering), but scaling a lower-resolution signal up often leads
    to obvious problems.
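
    The arithmetic, as a trivial Python sketch (this assumes the usual
    4:2:0 subsampling, i.e. chroma at half the luma resolution in each
    direction):

        def chroma_res(luma_w, luma_h):
            # 4:2:0: chroma is sampled at half the luma resolution each way
            return luma_w // 2, luma_h // 2

        print(chroma_res(1920, 1080))  # (960, 540): still >= a 480-line screen
        print(chroma_res(704, 480))    # (352, 240): well under a 480-line screen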
  17.

    In article <MPG.1b04cb1493300d3298b3fd@news.nabs.net>,
    Jeff Rife <wevsr@nabs.net> wrote:

    > Well, for film-source material anyway. I don't think anybody would recommend
    > anything less than 30p for sports or other live action, and then you get
    > into the 1080/60i vs. 720/60p issues, where 60p still only helps on a few
    > select shots, while the extra pixels on 1920x1080 tend to help all the time.

    Wait, are there any sets capable of making use of 1080p?

    Also, the issue seems to be that a lot of local stations are not using
    the full bandwidth for various reasons, the most prominent of them being
    to multicast with secondary channels.
  18.

    poldy (poldy@kfu.com) wrote in alt.tv.tech.hdtv:
    > Wait, are there any sets capable of making use of 1080p?

    My computer monitor does it fine right now, and by the end of the year
    there will be 1920x1080 DLP sets that will do progressive scan.

    > Also, the issue seems to be that a lot of local stations are not using
    > the full bandwidth for various reasons, the most prominent of them being
    > to multicast with secondary channels.

    I don't know about "a lot", but of the 5 local stations that do HD (ABC,
    CBS, NBC, PBS, WB), only PBS and ABC multicast. Since ABC does 720p,
    the 3Mbps weather radar doesn't do much to the quality of the HD. PBS
    does suffer a bit.

    Otherwise, the stations are using full bitrate (at least 18Mbps) for the
    single stream. Even the local Fox does this with their 480p (which gives
    a very nice picture, although not HD).

    I have heard that some stations do stupid multicasting in that they send
    both the HD and a 480i signal of the *same* programming. Since every
    ATSC receiver can receive and decode the HD version, it's just a stupid
    waste of bandwidth.

    --
    Jeff Rife | "She just dropped by to remind me that my life
    SPAM bait: | is an endless purgatory, interrupted by profound
    AskDOJ@usdoj.gov | moments of misery."
    uce@ftc.gov | -- Richard Karinsky, "Caroline in the City"
  19.

    In article <MPG.1b05a2c1e3a5d46198b400@news.nabs.net>,
    Jeff Rife <wevsr@nabs.net> wrote:

    > > Wait, are there any sets capable of making use of 1080p?
    >
    > My computer monitor does it fine right now, and by the end of the year
    > there will be 1920x1080 DLP sets that will do progressive scan.

    So that would be a very small segment of the installed base which would
    have such sets.

    How would 1080p downconverted to 1080i look?
  20.

    poldy (poldy@kfu.com) wrote in alt.tv.tech.hdtv:
    > How would 1080p downconverted to 1080i look?

    It depends on which flavor of 1080p. 1920x1080/24p sourced from film
    material would look pretty much the same on either a 1080/60p or 1080/60i
    display. The 1080/60p display would be slightly better, but not much,
    because (again) there just isn't enough temporal information in 24p to
    come close to stressing 60i. The same would be true of just about any
    flavor of 1080p sourced from 24p film.

    Native 1920x1080/30p would again look about the same on 1080/60p or
    1080/60i displays, with the advantage (a bit more than the 24p sourced
    material) to the 60p display.

    If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
    unlikely to show up in *any* transmission), then that's the only one that
    would look much worse on a 1080i display.
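
    To picture why only true 60p suffers, here is a toy Python sketch, with
    a frame modeled as a list of scan lines (an illustration of the line
    bookkeeping only, not any particular de-interlacer):

        def p60_to_i60(frames_60p):
            # one field per source frame: half of each frame's lines are dropped
            return [frame[n % 2::2] for n, frame in enumerate(frames_60p)]

        def p30_to_i60(frames_30p):
            # two fields per source frame: every line survives across the pair
            fields = []
            for frame in frames_30p:
                fields += [frame[0::2], frame[1::2]]
            return fields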

    --
    Jeff Rife | "Five thousand dollars, huh? I'll bet we could
    SPAM bait: | afford that if we pooled our money together...
    AskDOJ@usdoj.gov | bought a gun...robbed a bank...."
    uce@ftc.gov | -- Drew Carey
  21.

    This has been a very interesting thread so far.
    Could somebody be kind enough to explain a little bit more about the
    "rolloff" and the factors contributing to it?

    "John S. Dyson" <toor@iquest.net> wrote in message
    news:c7eej4$1kae$1@news.iquest.net...
    > In article <MPG.1b0457ac71a6ecc2989702@news.gwtc.net>,
    > Steve Rimberg <rsteve@world1.net> writes:
    > > In article <c7c2i1$u6i$8@news.iquest.net>, toor@iquest.net says...
    > >
    > >> No... You see the stationary detail in both phases of the interlace
    for
    > >> 1080i.
    > >
    > > And you get "judder" or "jitter."
    > >
    > Not really -- dynamic filters are cool (effectively, there is already
    > dynamic filtering going on in MPEG2 encoding.) When the 'flicker'
    > starts being possible, then you increase the amount of interlace
    > filtering. Since the detail is aliased anyway, removing the motion
    > artifacts is actually almost information neturl.
    >
    > >
    > >> In essence, interlace is a tradeoff that trades-away temporal
    resolution
    > >> so as to provide better spatial resolution for a given frame scanning
    > >> structure.
    > >
    > > Interlace is an obsolete, 70 year old "technology" used to thin out the
    > > information stream to make it fit through a limited bandwidth pipeline.
    > >
    > Interlace exists, and it works. It might be suboptimal, but much less
    > suboptimal than 720p60 on filmed material where 1080i30 certainly can/does
    > look better. In the future, a little de-interlacing will help those
    > old film/tape archives continue to be valuable.
    >
    > >
    > > Modern compression technology has eliminated the need for interlace.
    > >
    > So what? If you want the real 'ideal', then the filmed stuff should
    > be broadcast at 1080p24. 720p60 is silly on most material, considering
    > that it is filmed (or film-look.)
    >
    >
    >
    > >> When comparing 1080i vs. 720p, it is also important to NOT forget the
    > >> 1280H pixels vs. the 1920H pixels.
    > >
    > > You keep repeating this falsehood for some reason, and I'll keep
    > > repeating the correction. As it exists now, 1080i is 1440 by 1080
    > > pixels.
    > >
    > You keep on forgetting the necessary rolloff (which isn't as necessary
    > when the sampling structure is 1920H instead of the very very limited
    > 1280H.) Either you don't realize, or are purposefully forgetting that
    > the rolloff for 1280H usualy has to start at about effectivelly 800 or
    > less TVL to avoid uglifying. Even if all of the 1920H doesn't exist,
    > the needed rolloff for avoiding sampling effects would typically start
    > being significant at the 1000-1100TVL level.
    >
    > >
    > > 1080i has the edge in horizontal resolution, but not by much.
    > >
    > > 720p has the edge in color resolution, in vertical resolution, and in
    > > temporal resolution.
    > >
    > If you look at real numbers, you'll find that for 'resolution' figures,
    > even in the best case, 720p is a toss-up. Temporal resolution is
    essentially
    > the ONLY advantage of 720p60. Also, if you really have ever seen 720p,
    > the effects of the early rolloff for the 1280H anti-aliasing makes it
    > look like a 'fantastic 480p' instead of that 'window' 1080i or 1080p
    > effect.
    >
    > >
    > >> Given the above, except for sports (esp for filmed material), 1080i is
    > >> the GENERALLY best format. This is most true for film, where
    information
    > >> can be reconstructed very nicely to give ALMOST 1080p type (not full,
    however)
    > >> performance.
    > >
    > > You've obviously never had to reconstruct an image shot interlaced in-
    > > camera. De-interlacers don't work well at all.
    > >
    > Plaeeze!!! 24fps material (be it 1080p24 or film) is easy to reconstruct
    > when it is played out on 1080i30 formats. The problem with DVDs is that
    > the proper flags aren't always used.
    >
    > If you start with 60fps material, then 720p will work better for motion.
    > If you start with 24fps material (most scripted stuff), 1080i or 1080p
    > (depending upon display) is a better match. 720p is silly for 24fps
    > material, with no advantages.
    >
    > If you start with 'video look', then it is best to match the recording
    > standards, where 1080i is probably the best all around except for motion,
    > and does give the best video look. For sports or scientific work (where
    > freeze frame is a primarily used feature), then 720p can be useful.
    >
    > One more note: for broadcast HDTV, 1080i is a tight fit. High motion
    > and 1080i is a disaster sometimes, but not because of 1080i encoding
    > itself on the ATSC channel. It is because of the all-too-common
    > subchannels that force the 1080i to fit in 15mpbs.... That is very
    > marginal for 1080i. This gives a double bonus for sports on 720p60,
    > where it tends to fit into the ATSC channel, even with a subchannel
    > or two.
    >
    > John
    >
  22.

    I have a front-projection CRT display that can display both 1080i and 720p.
    It is connected to an HD cable box which outputs both 1080i and 720p
    depending on the original source format. (Thus there is no conversion
    happening at my end.) Here are my observations:
    1. 720p may require less bandwidth due to better MPEG compression of a
    progressive signal; however, it requires a display capable of synching to a
    MUCH higher frequency than 1080i.
    2. All of the material that I have viewed looks better at 1080i than at
    720p.
    3. Both 1080i and 720p blow 480p (progressive-scan DVD quality) away.
    4. 1080i looks better with HD video sources than with film sources. (This
    may be because my equipment doesn't compensate well for the 3/2
    pull-down artifacts you get when you convert 24fps film to a 30fps
    signal; see the sketch after this list.)
    5. I watched all of the recent major sporting events (Super Bowl, the NCAA
    tournament, etc.) in HD and have zero complaints about picture quality.
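
    For reference, the 3/2 pull-down mentioned in point 4 maps 24 film
    frames onto 60 video fields like this (a toy Python sketch; a real
    pulldown also alternates top and bottom fields):

        def pulldown_32(frames):
            fields = []
            for i, frame in enumerate(frames):
                fields += [frame] * (2 if i % 2 == 0 else 3)  # 2,3,2,3,... cadence
            return fields

        print(pulldown_32(list("ABCD")))
        # ['A','A','B','B','B','C','C','D','D','D'] -- 4 frames -> 10 fields,
        # so 24 frames/s -> 60 fields/s; the uneven cadence is what the
        # equipment has to detect and undo.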


    "MarkW" <markwco(removenospam)@comcast.net> wrote in message
    news:6oli90d6taheg95i05pu5fvlmqbvbocm47@4ax.com...
    > I am going to be picking up my HD TIVO today and am curious if it's
    > best to use 720p or 1080 for HDTVs? 1080 sounds far better but then
    > again is 720 more like 1440 or is simply have progressive better than
    > interlaced? Or is it simply a matter of opinion? I figured I'd stick
    > with one in my menu since I'll be using 480i on another TV and just
    > wonder which to go with. I'm not sure if it depends on what you're
    > watching or the TV you have but as for the TV's, I'll be using this
    > with two TV's
    > Sony KF-42WE610
    > Sony KF-50XBR800
  23.

    In article <8X7nc.110201$207.1414989@wagner.videotron.net>,
    "jean dumont" <jean.dumont@videotron.ca> writes:
    > This has been a very interesting thread so far.
    > Could somebody be kind enough to explain a little bit more about the
    > "rolloff" and the factors contributing to it?
    >
    Okay, I am pure 'techie', but I'll REALLY try to explain.

    VERY NONTECHNICAL:
    In a video sense, high frequency rolloff is a 'decrease' of detail,
    or sharpness for sharp edges. When the edges are very sharp, the
    rolloff will tend to spread out the edge, and make it look a little
    wider. When details are spread out, then the even smaller details
    are smoothed into nothingness.

    Slightly more technical:
    There are different ways of doing the 'rolloff' necessary for different
    purposes. One (straightforward) way of doing the 'rolloff' is to
    electronically smooth the signal with a linear-phase filter (one with
    constant time delay vs. frequency; otherwise the spreading will be
    more severe than it needs to be.) Another way to process the signal is
    to keep the fine details, but to intensively 'chop off' the strength
    of the high frequencies (sharp edges) and spread only the strongest
    transitions. This can help to preserve SOME of the fine detail, but
    also helps to shape the frequency spectrum.

    Even more technical:
    MPEG2 encoding, and even most digitized video, requires pre-filtering
    before the digitization process, because if too much high-frequency
    detail is presented, then large-area beat products appear (like
    moire effects.) At the smaller scale, these effects look like
    stairstepping. MPEG encoding by itself, even when presented with
    properly bandlimited video, can actually produce these aliasing
    effects when it has to chop too much detail from the signal (to
    fit within the proper bandwidth.) So, MPEG encoding can actually
    produce ALIASING!!!
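
    A one-dimensional analogy of that aliasing, in Python (audio-style
    numbers, but the same math applies to spatial detail):

        import math

        fs = 100.0   # sampling rate; the Nyquist limit is 50
        f_in = 70.0  # detail above the limit that should have been filtered out
        samples = [math.sin(2 * math.pi * f_in * n / fs) for n in range(100)]

        # The samples are indistinguishable from a 30 "Hz" alias (100 - 70)
        # of opposite phase -- which is why pre-filtering matters:
        alias = [-math.sin(2 * math.pi * 30.0 * n / fs) for n in range(100)]
        print(max(abs(a - b) for a, b in zip(samples, alias)))  # ~0: identical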

    Anyway, this is more than you asked for, and probably not enough of
    what you really need. However, I am a pure techie, with the tech
    mentality for the last 40yrs (and I am slightly older than that.)
    I am unable to think in 'non technical' terms without duress :-).

    John
  24.

    Thanks for taking the time to explain John.

    So, if I understand you correctly, rolloff is used to prevent artifacts
    that would occur during the MPEG2 compression process, or are there
    other purposes for it?

    Would the MPEG-produced aliasing account for the rainbow effect that we
    see sometimes with fine details (ties, jackets, etc...) over digital
    cable? I always assumed this was a comb-filter issue (thinking the
    signal might have been compressed down to composite along the
    distribution line).

    "John S. Dyson" <toor@iquest.net> wrote in message
    news:c7jbi0$3bn$2@news.iquest.net...
    > In article <8X7nc.110201$207.1414989@wagner.videotron.net>,
    > "jean dumont" <jean.dumont@videotron.ca> writes:
    > > This has been a very interesting thread so far.
    > > Could somebody be kind enough to explain a little bit more about the
    > > "rolloff" and the factors contributing to it?
    > >
    > Okay, I am pure 'techie', but I'll REALLY try to explain.
    >
    > VERY NONTECHNICAL:
    > In a video sense, high frequency rolloff is a 'decrease' of detail,
    > or sharpness for sharp edges. When the edges are very sharp, the
    > rolloff will tend to spread out the edge, and make it look a little
    > wider. When details are spread out, then the even smaller details
    > are smoothed into nothingness.
    >
    > Slightly more technical:
    > There are different ways of doing 'rolloff' necessary for different
    > purposes. One (straightforward) way of doing the 'rolloff' is to
    > electronically smooth the signal with a linear phase filter (one with
    > constant time delay vs. frequency, otherwise the spreading will be
    > more severe than need be.) Other ways to process the signal is to
    > keep the fine details, but to intensively 'chop off' the strength
    > of the high frequency (sharp edges) and spread only the strongest
    > transistions. This can help to preserve SOME of the fine detail,
    > but also help to shape the frequency spectrum.
    >
    > Even more technical:
    > MPEG2 encoding and even most digitized video requires pre filtering
    > before the digitization process, because if too much high frequency
    > detail is presented, then large area beat products appear (like
    > moire effects.) At the smaller scale, these effects look like
    > stairstepping. MPEG encoding by itself, even if presented with
    > properly bandlimited video, it can actually produce these aliasing
    > effects when it has to chop too much detail from the signal (to
    > fit within proper bandwidth.) So, MPEG encoding can actually produce
    > ALIASING!!!
    >
    > Anyway, this is more than you asked for, and probably not enough of
    > what you really need. However, I am a pure techie, with the tech
    > mentality for the last 40yrs (and I am slightly older than that.)
    > I am unable to think in 'non technical' terms without duress :-).
    >
    > John
    >
  25.

    "Brian K. White" <nospam@foxfire74.com> wrote in message news:<Vr9nc.2579$GL4.94@fe2.columbus.rr.com>...
    > I have a front projection CRT display that can display both 1080i and 720p.
    > It is connected to a HD cable box which outputs both 1080i and 720p
    > depending on the original source format. (Thus there is no conversion
    > happening at my end.) Here are my observations:
    > 1. 720p may required less bandwidth due to better MPEG compression of a
    > progressive signal, however, it requires a display capable of synching to a
    > MUCH higher frequency than 1080i.
    > 2. All of the material that I have viewed looks better at 1080i than at
    > 720p.
    > 3. Both 1080i and 720p blow 480p (progressive scan DVD quality) away.
    > 4. 1080i looks better with HD video sources than with film sources. (This
    > may be due to the fact that my equipment doesn't compensate well for the 3/2
    > pull-down artifacts you get when you convert a 24fps film to a 30fps
    > signal.)
    > 5. I watched all of the recent major sporting events (Super Bowl, the NCAA
    > tournament, etc) in HD and have zero complaints about picture quality.
    >
    >
    > "MarkW" <markwco(removenospam)@comcast.net> wrote in message
    > news:6oli90d6taheg95i05pu5fvlmqbvbocm47@4ax.com...
    > > I am going to be picking up my HD TIVO today and am curious if it's
    > > best to use 720p or 1080 for HDTVs? 1080 sounds far better but then
    > > again is 720 more like 1440 or is simply have progressive better than
    > > interlaced? Or is it simply a matter of opinion? I figured I'd stick
    > > with one in my menu since I'll be using 480i on another TV and just
    > > wonder which to go with. I'm not sure if it depends on what you're
    > > watching or the TV you have but as for the TV's, I'll be using this
    > > with two TV's
    > > Sony KF-42WE610
    > > Sony KF-50XBR800


    I'm just going to jump in here and say something.
    I think the only format that makes sense is 1920x1080 (24p).
    Now all I want to see are 16:9 CRT VGA screens. It shouldn't be hard
    at all.
    I don't know why they aren't around, because you know people get all
    excited when they see that 16:9 ratio. And it could play
    1920x1080 (24p). It wouldn't cost that much either. Less than $300.
  26.

    In article <e5bnc.90531$j11.1722178@weber.videotron.net>,
    "jean dumont" <jean.dumont@videotron.ca> writes:
    > Thanks for taking the time to explain John.
    >
    > So, if I understand you correctly, rolloff is used to prevent artifacts that
    > would occur during the Mpeg2 compression process or is there other purposes
    > for it?
    >
    The 'rolloff' or 'anti-aliasing' is helpful to avoid artifacts during
    the MPEG2 compression process and EVEN the digitization process. If you
    sample an analog signal that has too much detail, the result will contain
    lots of weird, sometimes incomprehensible effects. When there is only
    slightly too much detail for the sampling rate, the effect will look
    like moire, extremely harsh details, perhaps mild banding. When there
    is far too much detail, large-scale artifacts and banding, along with
    extreme moire, can appear.

    >
    > Would the Mpeg produced aliasing account for the rainbow effect that we see
    > sometimes with fine details (ties, jackets etc...) over digital cable?
    >
    That 'rainbow' effect with luma interfering with chroma is due to
    the fact that composite video was in the signal path. The composite
    to component (or s-video) converters often leave the interfering
    luma to mix with the chroma, and that does produce color flashing.

    However, in a very technical sense, that effect is a cousin to the
    moire or other effects from undersampling. (Given an ideal composite
    decoder, when luma detail actually does spill into the chroma spectrum,
    that would be somewhat similar in some regards to other sampling
    effects.)

    >
    > I always assumed this was a comb filter issue (thinking the signal might
    > have been compressed down to composite along the distribution line).
    >
    You were likely correct regarding that issue.

    Raw MPEG2 (or undersampling) artifacts tend not to look like composite
    decoding artifacts. You'll not see chroma magically appearing based
    upon luma signals in MPEG2. One possible artifact might include too
    much luma or chroma causing artifacts in the other domain -- because
    of too little payload capability (squeezing too much data into too
    small a pipeline causes the MPEG2 encoder to make quality tradeoffs.)

    John
  27.

    Thanks for the explanations John.

    I presume that if the sampling rate were high enough, no rolloff would
    be necessary.
    If this is true, then the bottom line is that 19 Mbits/s is not a high
    enough bitrate to allow for the full resolution of 1920x1080i or even
    1280x720p? Why not allocate enough bandwidth to do without rolloff?


    "John S. Dyson" <toor@iquest.net> wrote in message
    news:c7jic8$56b$2@news.iquest.net...
    > In article <e5bnc.90531$j11.1722178@weber.videotron.net>,
    > "jean dumont" <jean.dumont@videotron.ca> writes:
    > > Thanks for taking the time to explain John.
    > >
    > > So, if I understand you correctly, rolloff is used to prevent artifacts
    that
    > > would occur during the Mpeg2 compression process or is there other
    purposes
    > > for it?
    > >
    > The 'rolloff' or 'anti-aliasing' is helpful to avoid artifacts during
    > the MPEG2 compression process and EVEN the digitization process. If you
    > sample an analog signal that has too much detail, the result will contain
    > lots of weird, sometimes incomprehensible effects. When there is only
    > slightly too much detail for the sampling rate, the effect will look
    > like moire, extremely harsh details, perhaps mild banding. When there
    > is far too much detail, the large scale artifacts and banding, along
    > with extreme moire can appear.
    >
    > >
    > > Would the Mpeg produced aliasing account for the rainbow effect that we
    see
    > > sometimes with fine details (ties, jackets etc...) over digital cable?
    > >
    > That 'rainbow' effect with luma interfering with chroma is due to
    > the fact that composite video was in the signal path. The composite
    > to component (or s-video) converters often leave the interfering
    > luma to mix with the chroma, and that does produce color flashing.
    >
    > However, in a very techincal sense, that effect is a cousin to the
    > moire or other effects from undersampling. (Given an ideal composite
    > decoder, when luma detail actually does spill into the chroma spectrum,
    > that would be somewhat similar in some regards to other sampling
    > effects.)
    >
    > >
    > > I always assumed this was a comb filter issue (thinking the signal might
    > > have been compressed down to composite along the distribution line).
    > >
    > You were likely correct regarding that issue.
    >
    > Raw MPEG2 (or undersampling) artifacts tend not to look like composite
    > decoding artifacts. You'll not see chroma magically appearing based
    > upon luma signals in MPEG2. One possible artifact might include too
    > much luma or chroma causing artifacts in the other domain -- because
    > of too little payload capability (squeezing too much data into too
    > small a pipeline causes the MPEG2 encoder to make quality tradeoffs.)
    >
    > John
  28.

    In article <MPG.1b063deaa821c11e98b405@news.nabs.net>,
    Jeff Rife <wevsr@nabs.net> wrote:

    > If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
    > unlikely to show up in *any* transmission), then that's the only one that
    > would look much worse on a 1080i display.

    There are zero prospects for this as a broadcast format, right?

    Unless they reorganized the spectrum, which everybody wants to use for
    other services. So it's not likely that they'd allocate more bandwidth
    for TV.

    And neither HD DVD nor Blu-ray has indicated which resolutions would be
    supported in their formats. If they can produce formats and content
    that reproduce all the 1080p formats, you would hope they'd do it,
    rather than settling for just the ATSC formats, which are designed for
    a more bandwidth-limited medium.
  29.

    poldy (poldy@kfu.com) wrote in alt.tv.tech.hdtv:
    > > If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
    > > unlikely to show up in *any* transmission), then that's the only one that
    > > would look much worse on a 1080i display.
    >
    > There are zero prospects for this as a broadcast format right?

    For OTA, in the US, for the foreseeable future, yes, the chance is zero.

    It could be seen via cable or DBS, but unless people build displays that
    accept it as input, there's no reason to transmit it.

    > Unless they re-organized the spectrum, which everybody wants to use for
    > other services. So not likely that they'd allocate more bandwidth for
    > TV.

    *Technically*, a change in the ATSC standard could allow 1080/60p in the
    current spectrum. But, 1080/60p would have some real problems at 19Mbps...
    sort of like what DVD would look like if the absolute maximum bitrate was
    5Mbps.

    So, it ain't gonna happen.

    --
    Jeff Rife |
    SPAM bait: | http://www.netfunny.com/rhf/jokes/99/Apr/columbine.html
    AskDOJ@usdoj.gov |
    uce@ftc.gov |
  30.

    In article <poldy-934BFA.14265508052004@netnews.comcast.net>,
    poldy <poldy@kfu.com> writes:
    > In article <MPG.1b063deaa821c11e98b405@news.nabs.net>,
    > Jeff Rife <wevsr@nabs.net> wrote:
    >
    >> If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
    >> unlikely to show up in *any* transmission), then that's the only one that
    >> would look much worse on a 1080i display.
    >
    > There are zero prospects for this as a broadcast format right?
    >
    > Unless they re-organized the spectrum, which everybody wants to use for
    > other services. So not likely that they'd allocate more bandwidth for
    > TV.
    >
    Most likely, we won't see 1080p60 as a broadcast format because
    of the reasons that you state, and the diminishing beneficial
    returns except on very large screens. For scripted material,
    it is most likely originally 24p anyway, so 30i transport with
    24p source material can allow for fairly good reconstruction
    to the 24p quality.

    For sports material, it would be nice to have 1080p60, but the
    real benefits are likely minimal over and above 720p60.

    In reality, the biggest (by far) problem with 1080i30 in the
    US ATSC system is for sports, where there is too much movement
    and the available signal payload capability is insufficient.
    The interlace itself isn't the culprit. If there were perhaps
    25 Mbps instead of only (most likely) 15-16 Mbps, then the
    uglification of sports in 1080i30 wouldn't be so problematic.

    SOOO, 720p60 makes a really good tradeoff: there is less spatial
    detail visible (especially for live broadcasts), but the payload
    needs are less, given the much-reduced amount of data and numerous
    technical factors, and 15 Mbps does a good job for 720p60.

    Where 1080i30 dies horribly is in NFL broadcasts where there
    is a lot of moving crowd in the background while the camera
    is moving. This artifacting is NOT due to interlace per se,
    but interlace MIGHT contribute to it. The artifacting is
    due to inadequate payload capability.
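
    Some rough numbers behind that "tight fit", as a Python sketch
    (assuming 8-bit 4:2:0 video, i.e. 12 bits per pixel before
    compression; the figures are approximate):

        def compression_ratio(w, h, frames_per_sec, mbps):
            raw_bits = w * h * frames_per_sec * 12  # uncompressed 4:2:0
            return raw_bits / (mbps * 1e6)

        print(f"1080i30 at 15 Mbps: {compression_ratio(1920, 1080, 30, 15):.0f}:1")
        print(f"720p60  at 15 Mbps: {compression_ratio(1280, 720, 60, 15):.0f}:1")
        # ~50:1 vs. ~44:1 -- and the progressive stream is also easier for
        # MPEG2 to compress, which is why 15 Mbps is comfortable for 720p60
        # but marginal for high-motion 1080i30.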

    John
  31.

    In article <dCfnc.101577$j11.1917930@weber.videotron.net>,
    "jean dumont" <jean.dumont@videotron.ca> writes:
    > Thanks for the explanations John.
    >
    > I presume that if the sampling rate were high enough, no rolloff would
    > be necessary.
    >
    Well - the problem is that, for various reasons, the sampling rate
    actually implies the sampling structure. The 1920H pixels and
    1080V pixels define the sampling structure, and the entire structure
    is updated every 1/30 or 1/60 of a second (or in Europe, 1/25 or 1/50.)
    (Just thought of using the term 'softening' for 'rolloff.' Maybe
    that will help :-)).

    So, the 1920H pixel count describes one of the parameters of the sampling
    structure, and if there is detail that overruns that structure
    (and actually, mathematically, that happens more often than one
    might think), then the artifacts occur.

    The frequency spectrum of the signal can wrap around the sampling
    frequency much easier than one might think. So, the pre-filtering
    before digitization and encoding is quite important. Actually,
    some equipment can ASSUME that the input spectrum won't wrap,
    but a general purpose digitization device will have to do some
    filtering.

    (My numbers below are necessarily very approximate, but do give
    a sense of the disadvantage in detail that a 1280H sampling
    structure gives. Even if there isn't a lot of detail at the 1920H
    level, and the detail has diminished significantly by the 1440H
    range, the necessary rolloff doesn't have to be significant
    at important frequencies the way the 1280H structure would require.)

    It probably doesn't take a large amount of filtering for video
    to mitigate the artifacting (audio is probably much more critical.)
    However, video filters are necessarily 'gentle' (they have slow
    changes in response versus frequency), and so to make sure that
    the video signal is -20dB at the sampling frequency, the
    rolloff might have to be significant at 1/2 or lower frequency. (e.g.,
    to make sure that the video response is -20dB at 1280H to help mitigate
    the aliasing effects, it might be necessary to start rolling
    off at relatively low spatial frequencies (maybe 500H equivalent),
    and be perhaps 6 or 10dB down at 800-900H equivalent.) The rolloff
    point is proportionally higher for the 1920H structure, which
    requires significantly less 'softening' of the signal.
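
    To see roughly where those numbers come from, here is a sketch that
    assumes a gentle, Gaussian-shaped response reaching -20dB at the 1280H
    sampling structure (a real filter differs, but the scaling is similar):

        def atten_db(tvl, sample_tvl=1280.0, db_at_sample=20.0):
            # Gaussian-shaped rolloff: attenuation grows with frequency squared
            return db_at_sample * (tvl / sample_tvl) ** 2

        for tvl in (500, 800, 900, 1280):
            print(f"{tvl} TVL: -{atten_db(tvl):.1f} dB")
        # 500 TVL: -3.1 dB (rolloff already visible); 800-900 TVL: -7.8 to
        # -9.9 dB; 1280 TVL: -20 dB. With a 1920H structure the same curve
        # leaves 800-900 TVL only about 4 dB down.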


    >
    > If this is true, then the bottom line is that 19 Mbits/s is not a high
    > enough bitrate to allow for the full resolution of 1920x1080i or even
    > 1280x720p? Why not allocate enough bandwidth to do without rolloff?
    >
    That 19 Mbps is related to the ENCODED bandwidth, and only indirectly related
    to the sampling structure. The reason why MPEG2 has trouble fitting
    into 19 Mbps (especially at 15 Mbps or so) is that MPEG2 can only
    remove so much spatial and temporal detail (and coefficients) before the
    image quality suffers.

    John
  32.

    Jumpin' Murphy! Thank you, group! Finally an intelligent post with
    useful information and no flaming!

    Jodster