720p conversion to 1080i

Anonymous
January 11, 2005 6:57:40 PM

Archived from groups: alt.tv.tech.hdtv

I was hoping that someone would explain to me if and why this is a bad idea.
I'm asking specifically because most rear projection crts convert to either
540p or 1080i. with DVDs or HD programming, I'm not anticipating any issues
with that, but with games (granted, there are only a few currently
available), I'm wondering if this will actually make them look worse.
should I just spend the extra $500-700 for a rplcd or dlp? if so, how would
1080i be handled as 720p?

in a related theme, would I just be better off going for the 720p native sets
anyway, as there are no burn-in worries? is it safe to watch
letter/windowboxed movies with rpcrts? I'm not much of a stretch/zoom
fan...

thanks for the help, guys.


Anonymous
January 11, 2005 9:54:44 PM


Khee Mao wrote:
>
> I was hoping that someone would explain to me if and why this is a bad idea.
> I'm asking specifically because most rear projection crts convert to either
> 540p or 1080i. with DVDs or HD programming, I'm not anticipating any issues
> with that, but with games (granted, there are only a few currently
> available), I'm wondering if this will actually make them look worse.
> should I just spend the extra $500-700 for a rplcd or dlp? if so, how would
> 1080i be handled as 720p?
>
> in a related theme, would I just be better off going for the 720p native sets
> anyway, as there are no burn-in worries? is it safe to watch
> letter/windowboxed movies with rpcrts? I'm not much of a stretch/zoom
> fan...
>
> thanks for the help, guys.


I'm not sure what your exact question is, except for a possible burn-in issue.

I might add that a 720p HD signal takes a more powerful processor to handle
than a 1080i signal, both in earlier HDTV monitors and in current less
expensive HDTV sets. In the past, 720p processing was a more costly feature
to include in an HDTV set.

I would not run the HD set in 'torch' (high contrast) mode, and I would use
the stretch & zoom CRT option IF the same game is run frequently...
Anonymous
January 11, 2005 9:54:45 PM


"Dennis Mayer" <Polaris1@execpc.com> wrote in message
news:41E47554.663F74E1@execpc.com...

>
>
> I'm not sure what your exact question is.....
> except for a possible burn issue...
>

sorry. to clarify: is converting a 720p signal to 1080i lossy? if so, how
bad?

>
> I would not run the HD set in 'Torch (high contrast) mode and do
> use
> Stretch & Zoom CRT option IF the same 'Game' is run
> frequently...

the stretch and zoom I was more concerned with in regard to watching SDTV.
would I need to have this turned on to prevent letterbox/windowbox ghosts
even if 'torch mode' was turned off?
Anonymous
January 12, 2005 5:53:45 AM


Khee Mao wrote:

> sorry. to clarify: is converting a 720p signal to 1080i lossy?
> if so, how bad?

Yes, it's lossy in both directions. It's always going to be a little
less sharp than material that originated in the desired resolution.
And since the material out there is currently divided about fifty-fifty
between the two, it's guaranteed that whichever kind of set you buy,
some part of the HD images you watch will be in imperfect condition.
Anonymous
January 12, 2005 7:24:52 PM


There is virtually no loss in converting from a progressive format to an
interlaced one, since each progressive frame consists of data from a single
instant in time. Going from interlaced to progressive is a problem, since
the odd and even scan lines show motion offset 1/60th sec in time.
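The asymmetry described here can be sketched in a few lines of Python. This is a toy model (frames as lists of scan lines, simple "bob" line-doubling standing in for real deinterlacing), not any actual converter's algorithm:

```python
# Toy sketch: why progressive -> interlaced is clean, but interlaced ->
# progressive forces the converter to invent data. Names are illustrative.

def progressive_to_interlaced(frames):
    """Each progressive frame yields one field. All lines in a field come
    from the same instant, so no time-domain mixing occurs."""
    fields = []
    for n, frame in enumerate(frames):
        parity = n % 2                   # alternate odd/even fields
        fields.append(frame[parity::2])  # every other line of ONE frame
    return fields

def interlaced_to_progressive(fields):
    """Rebuilding full frames requires inventing the missing lines. 'Bob'
    deinterlacing just repeats each field line, halving effective vertical
    resolution; smarter methods must guess from the field 1/60 s earlier."""
    frames = []
    for field in fields:
        frame = []
        for line in field:
            frame.append(line)
            frame.append(line)           # duplicated, not real data
        frames.append(frame)
    return frames

frames = [[f"f{n}l{i}" for i in range(4)] for n in range(2)]
print(progressive_to_interlaced(frames))
```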
"Paul Kienitz" <paul-NOZPAM@paulkienitz.net> wrote in message
news:1105527225.742999.186190@f14g2000cwb.googlegroups.com...
> Khee Mao wrote:
>
>> sorry. to clarify: is converting a 720p signal to 1080i lossy?
>> if so, how bad?
>
> Yes, it's lossy in both directions. It's going to be always a little
> less sharp than material that originated in the desired resolution.
> And since the material out there is currently divided about fifty-fifty
> between the two, it's guaranteed that whichever kind of set you buy,
> some part of the HD images you watch will be in imperfect condition.
>
Anonymous
January 12, 2005 9:57:40 PM


If you convert from progressive to interlace, you can't display both the odd
and even lines from the same frame; otherwise you'd have to skip every other
frame, or the video would play at half speed. To prevent this, you have to
go back to interlace proper, where odd and even lines come from different
frames, and thus the motion artifacts. Whether or not this is noticeable is
another matter.

Tim

"Frank Provasek" <provasek@sbcglobal.net> wrote in message
news:obcFd.4546$KJ2.3222@newsread3.news.atl.earthlink.net...
> There is virtually no loss in converting from a progressive format to an
> interlaced one, since each
> progressive frame consists of same time-domain data. Going from
> interlaced to progressive is a
> problem, since the odd and even scan lines show motion offset 1/60th sec
> in time.
> "Paul Kienitz" <paul-NOZPAM@paulkienitz.net> wrote in message
> news:1105527225.742999.186190@f14g2000cwb.googlegroups.com...
>> Khee Mao wrote:
>>
>>> sorry. to clarify: is converting a 720p signal to 1080i lossy?
>>> if so, how bad?
>>
>> Yes, it's lossy in both directions. It's going to be always a little
>> less sharp than material that originated in the desired resolution.
>> And since the material out there is currently divided about fifty-fifty
>> between the two, it's guaranteed that whichever kind of set you buy,
>> some part of the HD images you watch will be in imperfect condition.
>>
>
>
Anonymous
January 13, 2005 4:41:11 AM


"Tim Watkins" <timandca@nospam.comcast.net> wrote in message
news:wa2dndyWZa_qJHjcRVn-pQ@comcast.com...
> If you convert from progressive to interlace, you can't display both the
> odd and even lines from the same frame. Otherwise, you'd have to skip
> every other frame or the video would play at half speed. To prevent this,
> you'll have go back to the interlace, where odd and even lines are from
> different frames, and thus the motion artifacts. Whether or not this is
> noticeable is another matter.
>
> Tim
>
The topic here was quality loss in going from 720p to 1080i, and while of
course you have interlacing on the 1080i side, as long as you make a 1080i
frame out of a frame (720p/30fps), or a 1080i FIELD from a frame
(720p/60fps), it's a pretty clean conversion.

Going the other way, from interlaced to progressive, is not as nice. Making
FRAMES from FIELDS involves guessing what half the scanning lines contain
based on what they contained 1/60 sec earlier.
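Frank's "clean" path can likewise be sketched as toy Python (nearest-line scaling stands in for a proper resampling filter; function names are mine, not any real converter's API):

```python
# Sketch of 720p/60 -> 1080i/60: each progressive frame supplies exactly
# one 1080-line field, so every field is internally consistent in time.

def scale_lines(frame, out_lines):
    """Nearest-line vertical rescale; real hardware interpolates."""
    in_lines = len(frame)
    return [frame[i * in_lines // out_lines] for i in range(out_lines)]

def p60_to_i60(frames_720p):
    fields = []
    for n, frame in enumerate(frames_720p):
        full = scale_lines(frame, 1080)  # resample 720 -> 1080 lines
        fields.append(full[n % 2::2])    # keep 540 lines, alternating parity
    return fields

fields = p60_to_i60([list(range(720)), list(range(720))])
print(len(fields[0]), len(fields[1]))  # each field carries 540 lines
```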
Anonymous
January 13, 2005 6:10:26 AM


"Frank Provasek" <provasek@sbcglobal.net> wrote in message
news:obcFd.4546$KJ2.3222@newsread3.news.atl.earthlink.net...
> There is virtually no loss in converting from a progressive format to an
> interlaced one, since each
> progressive frame consists of same time-domain data. Going from
> interlaced to progressive is a
> problem, since the odd and even scan lines show motion offset 1/60th sec
> in time.

There can be a significant loss, as you're throwing away half of your data!
On the other hand, there needn't be noticeable or troublesome ARTIFACTS to
contend with, which is what I think you meant.
Anonymous
January 13, 2005 5:28:52 PM


I have a Panasonic DVR that will output 16:9 at 480i or 480p. I had been
sending it to my Sony HDTV at 480p -- the set then converts all signals to
768p for display -- but last night changed it to 480i, to let the TV set do
the processing to progressive, as some here have suggested. Seems to me the
picture is a little smoother -- less grain. Haven't tried it with a DVD
yet.

mack
austin


"Frank Provasek" <provasek@sbcglobal.net> wrote in message
news:XkkFd.5613$C52.3611@newsread2.news.atl.earthlink.net...
>
> "Tim Watkins" <timandca@nospam.comcast.net> wrote in message
> news:wa2dndyWZa_qJHjcRVn-pQ@comcast.com...
>> If you convert from progressive to interlace, you can't display both the
>> odd and even lines from the same frame. Otherwise, you'd have to skip
>> every other frame or the video would play at half speed. To prevent
>> this, you'll have go back to the interlace, where odd and even lines are
>> from different frames, and thus the motion artifacts. Whether or not
>> this is noticeable is another matter.
>>
>> Tim
>>
> The topic here was quality loss in going from 720p to 1080i, and while of
> course you
> have interlacing on the 1080i side, as long as you make a 1080i frame out
> of a frame (720p/30fps),
> or a 1080i FIELD from a frame (720p/60fps) it's a pretty clean conversion.
>
> Going the other way from interlaced to progressive is not as nice. Making
> FRAMES from
> FIELDS involves guessing what half the scanning lines contain based on
> what they contained
> 1/60 sec earlier.
>
Anonymous
January 13, 2005 6:21:21 PM


"Matthew Vaughan" <matt-no-spam-109@NOSPAM.hotmail.com> wrote in message
news:CElFd.1634$m31.17811@typhoon.sonic.net...
> "Frank Provasek" <provasek@sbcglobal.net> wrote in message
> news:obcFd.4546$KJ2.3222@newsread3.news.atl.earthlink.net...
>> There is virtually no loss in converting from a progressive format to an
>> interlaced one, since each
>> progressive frame consists of same time-domain data. Going from
>> interlaced to progressive is a
>> problem, since the odd and even scan lines show motion offset 1/60th sec
>> in time.
>
> There can be a significant loss, as you're throwing away half of your
> data! On the other hand, there needn't be noticeable or troublesome
> ARTIFACTS to contend with, which is what I think you meant.
Well, 1080i has a higher DATA rate than 720p.

But since most HD material is sourced from 35mm film, which is 24
frames/sec, 720p is REPEATING film frames 60% of the time, while 1080i
repeats film frames only 20% of the time. Since 1080i also has 50% greater
horizontal resolution than 720p, that's why 1080i has been adopted by NBC,
CBS, PBS, WB, UPN, HBO, Showtime, Starz, HDNet, etc.

The Fox and the Mouse (ABC) use 720p.
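The repeat percentages cited here follow from 3:2 pulldown arithmetic; a small Python sketch (the function name is mine, not any standard API) makes the calculation explicit:

```python
# Check of the repeat percentages above, under the usual 3:2 pulldown
# assumption for 24 fps film sources.

def repeat_fraction(film_fps, out_rate, units_per_frame):
    """Fraction of output frames/fields that repeat data already shown.
    units_per_frame: distinct output units one film frame can supply
    (1 for progressive frames, 2 for the two fields of a frame)."""
    fresh = film_fps * units_per_frame  # distinct units available per sec
    return (out_rate - fresh) / out_rate

print(repeat_fraction(24, 60, 1))  # 720p/60: 0.6 -> 60% repeated frames
print(repeat_fraction(24, 60, 2))  # 1080i/60: 0.2 -> 20% repeated fields
```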
Anonymous
January 14, 2005 9:24:15 PM


Frank Provasek wrote:

> "Tim Watkins" <timandca@nospam.comcast.net> wrote in message
> news:wa2dndyWZa_qJHjcRVn-pQ@comcast.com...
>
> The topic here was quality loss in going from 720p to 1080i, and
> while of course you
> have interlacing on the 1080i side, as long as you make a 1080i
> frame out of a frame (720p/30fps),
> or a 1080i FIELD from a frame (720p/60fps) it's a pretty clean
> conversion.
>
> Going the other way from interlaced to progressive is not as nice.
> Making FRAMES from
> FIELDS involves guessing what half the scanning lines contain based
> on what they contained 1/60 sec earlier.

With nonfilm source, going from a 720 frame to a 1080 field means a loss of
a quarter of the scan lines (720 down to 540), and going the other direction
means interpolation. Interpolation never ends up as sharp as it would have
been if you'd scanned it at that resolution in the first place. And it
can result in artifacts where some parts of the screen have crisper
edges than others, unless they are blurred to match. With film source
(or a nonfilm source where the image currently isn't moving, if your
convert-ware is smart enough, which it probably isn't), it's sort of
the other way around: going from a 1080 frame to a 720 frame loses 1/3 of
the lines, and going from a 720 frame to a 1080 frame is interpolated.
In no case is the result as good as the original, though in some cases
(particularly 1080 film to 720) it may be almost as good as it would
have been if it had been scanned at the final resolution in the first
place.
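For what it's worth, the raw line-count arithmetic behind these paths can be tabulated (a rough sketch; it ignores horizontal resampling between 1280 and 1920 samples per line, and all filtering, so perceived sharpness loss differs):

```python
# Raw vertical line-count ratios for the conversion paths discussed above.
paths = {
    "720p frame -> 1080i field": 540 / 720,   # a field carries 540 lines
    "1080i frame -> 720p frame": 720 / 1080,  # deinterlaced, then downscaled
}
for path, kept in paths.items():
    print(f"{path}: {kept:.0%} of the source scan lines survive")
```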

Converting between 720 and 1080 is somewhat like converting between PAL
and NTSC. When's the last time you saw PAL/NTSC conversion results
that looked as good as unconverted material?
Anonymous
January 15, 2005 2:06:50 AM


Frank Provasek wrote:

> Converting from 25 frames/sec to 30 frames/sec (or back) is what
> makes PAL/NTSC conversion ugly.

If that were the only source of the problem, then everything would
become crystal clear in stationary scenes.
Anonymous
January 15, 2005 6:30:26 AM


"Paul Kienitz" <paul-NOZPAM@paulkienitz.net> wrote
>
> Converting between 720 and 1080 is somewhat like converting between PAL
> and NTSC. When's the last time you saw PAL/NTSC conversion results
> that looked as good as unconverted material?
>
Converting from 25 frames/sec to 30 frames/sec (or back) is what
makes PAL/NTSC conversion ugly.
Anonymous
January 15, 2005 8:07:50 PM


"Paul Kienitz" <paul-NOZPAM@paulkienitz.net> wrote in message
news:1105772810.838252.147870@z14g2000cwz.googlegroups.com...
> Frank Provasek wrote:
>
>> Converting from 25 frames/sec to 30 frames/sec (or back) is what
>> makes PAL/NTSC conversion ugly.
>
> If that were the only source of the problem, then everything would
> become crystal clear in stationary scenes.
>
True, and that is very fatiguing to the eye, so a gaussian blur is applied
to stationary scenes to match the motion blur...
Anonymous
January 16, 2005 2:24:34 AM


Frank Provasek wrote:

> >> Converting from 25 frames/sec to 30 frames/sec (or back) is what
> >> makes PAL/NTSC conversion ugly.
> >
> > If that were the only source of the problem, then everything would
> > become crystal clear in stationary scenes.
> >
> True, and that is very fatiguing to the eye, so a gaussian blur is
> applied to stationary scenes to match the motion blur...

*cough* The motion blur depends on the SPEED of motion, so there is no
such thing as a gaussian blur that "matches" it. Oh, and 24 FPS film
has a huge amount of motion blur just from the long frame exposure
times, and I never noticed that fatiguing anyone's eye.


Y'know, it occurs to me to wish that since they're using both 720p and
1080i routinely, it would be nice if they would switch formats
depending on the content of the show, rather than just depending on
which network it's on... like, using 720p for sports and 1080i for
film. Unfortunately, I don't think the typical TV can switch between
them in midstream. At least mine probably can't... there's one local
subchannel that switches from 480i to 1080i at 8 PM every evening, and
if I'm watching it when it switches, I have to change to a different
channel and come back before it recognizes the format.
Anonymous
January 16, 2005 9:20:03 PM


"Paul Kienitz" <paul-NOZPAM@paulkienitz.net> wrote in message
news:1105860274.389672.144090@f14g2000cwb.googlegroups.com...
> Frank Provasek wrote:
>
>> >> Converting from 25 frames/sec to 30 frames/sec (or back) is what
>> >> makes PAL/NTSC conversion ugly.
>> >
>> > If that were the only source of the problem, then everything would
>> > become crystal clear in stationary scenes.
>> >
>> True, and that is very fatiguing to the eye, so a gaussian blur is
>> applied to stationary scenes to match the motion blur...
>
> *cough* The motion blur depends on the SPEED of motion, so there is no
> such thing as a gaussian blur that "matches" it. Oh, and 24 FPS film
> has a huge amount of motion blur just from the long frame exposure
> times, and I never noticed that fatiguing anyone's eye.
>
Motion blur in video standards conversion refers to synthesized frames
created to fill in where frames are deleted or extra frames are supplied.
The algorithms sometimes create a frame from two frames 1/25 sec apart,
giving an "exposure" of roughly 1/12 sec.

Normal 35mm motion picture film uses an exposure of 1/50 sec or faster.
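A toy illustration of that synthesized-frame smear (pixels as plain numbers, a simple 50/50 blend standing in for a real standards converter's interpolator):

```python
# Sketch: averaging two source frames captured 1/25 s apart smears that
# whole interval of motion into one synthesized output frame.

def blend_frames(frame_a, frame_b, weight=0.5):
    """Pixel-wise mix of two neighbouring source frames."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

# A bright pixel moving one position per frame leaves a double image:
a = [0, 255, 0, 0]
b = [0, 0, 255, 0]
print(blend_frames(a, b))  # [0.0, 127.5, 127.5, 0.0]
```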
Anonymous
January 16, 2005 11:17:12 PM


Frank Provasek wrote:

> >> > If that were the only source of the problem, then everything
> >> > would become crystal clear in stationary scenes.
> >>
> >> True, and that is very fatiguing to the eye, so a gaussian blur
> >> is applied to stationary scenes to match the motion blur...
> >
> > *cough* The motion blur depends on the SPEED of motion, so there
> > is no such thing as a gaussian blur that "matches" it. Oh, and 24
> > FPS film has a huge amount of motion blur just from the long frame
> > exposure times, and I never noticed that fatiguing anyone's eye.
>
> Motion blur in video standards conversion refers to synthesized
> frames to fill in where
> frames are deleted or extra frames supplied. The algorithms
> sometimes create a frame from
> two frames 1/25 sec apart, giving an "exposure" of 1/12 sec.
>
> Normal 35mm motion picture film uses an exposure of 1/50 sec or
faster.

Have you ever looked at the individual frames in a fast moving film
scene? The blur of one frame very commonly ends near where the blur of
the next frame starts. That's actually the better way to do it,
because using shorter exposures in fast moving scenes makes judder more
noticeable. Shorter exposures, as have to be used in strongly lit
scenes, highlight the inadequacies of 24 FPS frame rates pretty
obviously sometimes.

...Which is not the main issue I called bullshit on: that there is no
such thing as a gaussian blur that "matches" motion blur. The width of
a motion blur, in pixels, depends entirely on how fast the objects in
the picture are moving. The artificial blur applied to pal/ntsc
conversion is INDEPENDENT of motion blur!
January 17, 2005 2:17:01 PM


In article <1105935432.286827.94080@f14g2000cwb.googlegroups.com>, paul-
NOZPAM@paulkienitz.net says...
> Frank Provasek wrote:

> ...Which is not the main issue I called bullshit on: that there is no
> such thing as a gaussian blur that "matches" motion blur. The width of
> a motion blur, in pixels, depends entirely on how fast the objects in
> the picture are moving. The artificial blur applied to pal/ntsc
> conversion is INDEPENDENT of motion blur!

/shrug

Eventually all TVs will be 1080p (or better), and capable of displaying
both 1080i, and 720p (with simple resolution scaling) eliminating the
need to 'convert' between them.

Further, I -hope- eventually TVs will be able to accept and display
content at its source fps rate, so we won't be converting between 24, 50
and 60 fps. The broadcaster broadcasts at the rate it was recorded at,
and the TV displays what's coming in at -that- rate.

(I suppose, to prevent a 'jar' in the aesthetics of the picture,
commercials should be converted and encoded to match the same resolution
and fps rate as the surrounding content...)
Anonymous
January 17, 2005 6:05:54 PM


42 wrote:

> Eventually all TVs will be 1080p (or better), and capable of
> displaying both 1080i, and 720p (with simple resolution scaling)
> eliminating the need to 'convert' between them.

Yes, the best way for a TV to be able to display both resolutions well
is to be better than either. If it got up to 2160p every scaling issue
would be completely gone. And since cinema projectors will probably go
to 2160p in the near future, this isn't so unlikely a scenario...
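The arithmetic behind that remark is easy to verify: 2160 is an exact integer multiple of both 1080 (x2) and 720 (x3), so either format maps onto a 2160-line panel without fractional resampling. A quick Python check:

```python
# 2160p can display both HD formats with whole-number scaling, so no
# interpolation between neighbouring source lines is needed.
for lines in (720, 1080):
    factor = 2160 / lines
    print(f"{lines}p -> 2160p: each source line maps to exactly "
          f"{factor:g} display lines")
```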

> Further, I -hope- eventually TVs will be able to accept, and display
> content at its source fps rate, so we won't be converting between 24,
> 50 or 60 fps rates. The broadcaster broadcasts at what rate it was
> recorded in, and the TV displays whats coming in at -that- rate.

And why not? Computer monitors have been capable of variable frame
rates for years and years. I would hope that in five years any old
cheap TV could display PAL and other 50 FPS formats.

> (I suppose to prevent there being a 'jar' in the aesthetics of the
> picture, commericals should be converted and encoded to match the
> same resolution and fps rate as the surrounding content...)

If the TV is based on liquid crystals or something similar, where the
light output at any one pixel is continuous, the eye should not be able
to detect a change of frame rate. If it's based on CRTs or DLP it
might make a difference you could see.
Anonymous
January 17, 2005 8:06:28 PM


Frank Provasek wrote:

> You have been wasting my time for a week denying that
> conversion looks bad because
> of the frame rate problems, because "if that were the only
> source of the problem, then everything would
> become crystal clear in stationary scenes." Which if you look at a
> raw output IT DOES,

You seem to be trying to claim that I denied motion blur from frame
rate change was a major problem. Which I did not. The issue is that
you denied the existence of any OTHER problem besides motion blur,
saying it is the sole issue.

> and since a picture that resharpens
> and blurs when motion stops and starts is MORE annoying, a
> gaussian blur is applied to the motionless
> scenes to SUBJECTIVELY even out the "look." Then you argue
> about THAT, but later say
> " The artificial blur applied to pal/ntsc conversion is
> INDEPENDENT of motion blur!"
>
> Yes ..that is exactly what I said...it's an artificial GAUSSIAN
> blur to SUBJECTIVELY match the motion blur.

No, it's not what you said, you said that the gaussian blur is put in
to MATCH the motion blur. You now say that the blur is meant to
"subjectively" resemble the motion blur, which to me looks like backing
off quite a bit from your original wording. This is much more
plausible... but don't claim that's what either of us previously said.
Your use of the word "independent" seems to be in a much narrower sense
than mine... though it gets closer once you add "subjective". And
since you already made one response to my objection to the idea that
they match, I wonder why you didn't clarify this last time, if a rough
subjective resemblance was all you meant to assert.

All of which is a distraction from the actual point of discussion,
which is whether there is a blurring issue OTHER THAN motion blur. My
assertion was that there's a separate sharpness issue caused by the
interpolative resolution change, which requires some blurring to cover
it up, at least in cases where the original image isn't antialiased to
perfection (which it generally is not, especially if the source comes
directly from a video camera). You've left that question out entirely.

> Definition of Gaussian Blur
>
> http://www.maths.abdn.ac.uk/~igc/tch/mx4002/notes/node9...

I don't think the exact flavor of static blur is a very relevant issue.
(And you don't need to explain how it's defined, I know that stuff.)
Suffice it to say that any resemblance to motion blur is subjective
indeed.

> Then you argue that motion pictures blur more than video due to
> the long exposure time. And claim
> "the blur of one frame very commonly ends near where the blur of
> the next frame starts"
>
> IMPOSSIBLE (on a standard film camera) which typically uses a 170 or
> 180 degree shutter.

Well, all I can tell you is what I see on the finished film, which is
that the blurs often appear more continuous than not when viewed frame
by frame. And that even mild motion results in huge amounts of
blurring in a typical film scene. (It doesn't look too bad because the
eye naturally tends to blur motion similarly, at least in dimmer
light.) And if that blur is missing, the result looks bizarre, as in
"28 Days Later".

If you know about film camera shutters, perhaps you can tell me: I know
that early cameras used a simple chopper wheel as a shutter, so the
"degrees" of shutter was literal... and I don't think these shutters
were in the optical centers of lenses as is the case in, like, a
Hasselblad. So with that design, the length of PARTIAL exposure of the
film exceeded the nominal degrees of the shutter, because it took time
for the edge of the blade to cross the lens aperture. Might that still
be the case with modern cameras? I wouldn't have assumed so, because
it seems crude, but what I see on film suggests that they may still be
using that kind of exposure, perhaps because it improves the subjective
quality of motion blur by having the ends fade in and out instead of
switching on and off abruptly (which is indeed how it looks to me) --
much like the way that gaussian blur looks better than defocus blur.
If this is the case with modern cameras, it would explain why I see the
blur extending further than the nominal degrees of shutter. Do you
know if that's the case?
Anonymous
January 17, 2005 8:40:38 PM


> Frank Provasek wrote:
>
>> >> > If that were the only source of the problem, then everything
>> >> > would become crystal clear in stationary scenes.
>> >>
>> >> True, and that is very fatiguing to the eye, so a gaussian blur
>> >> is applied to stationary scenes to match the motion blur...
>> >
>> > *cough* The motion blur depends on the SPEED of motion, so there
>> > is no such thing as a gaussian blur that "matches" it. Oh, and 24
>> > FPS film has a huge amount of motion blur just from the long frame
>> > exposure times, and I never noticed that fatiguing anyone's eye.
>>
>> Motion blur in video standards conversion refers to synthesized
>> frames to fill in where frames are deleted or extra frames supplied.
>> The algorithms sometimes create a frame from two frames 1/25 sec
>> apart, giving an "exposure" of 1/12 sec.
>>
>> Normal 35mm motion picture film uses an exposure of 1/50 sec or
>> faster.
>
> Have you ever looked at the individual frames in a fast moving film
> scene? The blur of one frame very commonly ends near where the blur
> of the next frame starts. That's actually the better way to do it,
> because using shorter exposures in fast moving scenes makes judder
> more noticeable. Shorter exposures, as have to be used in strongly
> lit scenes, highlight the inadequacies of 24 FPS frame rates pretty
> obviously sometimes.
>
> ...Which is not the main issue I called bullshit on: that there is
> no such thing as a gaussian blur that "matches" motion blur. The
> width of a motion blur, in pixels, depends entirely on how fast the
> objects in the picture are moving. The artificial blur applied to
> pal/ntsc conversion is INDEPENDENT of motion blur!
>
>
You have been wasting my time for a week denying that conversion looks bad
because of the frame rate problems, because "if that were the only source
of the problem, then everything would become crystal clear in stationary
scenes." Which, if you look at a raw output, IT DOES, and since a picture
that resharpens and blurs when motion stops and starts is MORE annoying, a
gaussian blur is applied to the motionless scenes to SUBJECTIVELY even out
the "look." Then you argue about THAT, but later say "The artificial blur
applied to pal/ntsc conversion is INDEPENDENT of motion blur!"

Yes... that is exactly what I said... it's an artificial GAUSSIAN blur to
SUBJECTIVELY match the motion blur.

Definition of Gaussian Blur

http://www.maths.abdn.ac.uk/~igc/tch/mx4002/notes/node9...

Then you argue that motion pictures blur more than video due to the long
exposure time, and claim "the blur of one frame very commonly ends near
where the blur of the next frame starts."

IMPOSSIBLE on a standard film camera, which typically uses a 170 or 180
degree shutter: the film is blocked half the time so that the next frame
can be positioned. The long motion blurs you describe are common (and
physically possible) on electronic systems ONLY, especially on material
produced with tube cameras, which have lag not applicable to film EVER.
Anonymous
January 18, 2005 1:02:52 AM


42 wrote:
> In article <1105935432.286827.94080@f14g2000cwb.googlegroups.com>,
> paul- NOZPAM@paulkienitz.net says...
>> Frank Provasek wrote:
>
>> ...Which is not the main issue I called bullshit on: that there is no
>> such thing as a gaussian blur that "matches" motion blur. The width
>> of a motion blur, in pixels, depends entirely on how fast the
>> objects in the picture are moving. The artificial blur applied to
>> pal/ntsc conversion is INDEPENDENT of motion blur!
>
> /shrug
>
> Eventually all TVs will be 1080p (or better), and capable of
> displaying both 1080i, and 720p (with simple resolution scaling)
> eliminating the need to 'convert' between them.
>
> Further, I -hope- eventually TVs will be able to accept, and display
> content at its source fps rate, so we won't be converting between 24,
> 50 or 60 fps rates. The broadcaster broadcasts at what rate it was
> recorded in, and the TV displays whats coming in at -that- rate.
>
> (I suppose to prevent there being a 'jar' in the aesthetics of the
> picture, commericals should be converted and encoded to match the same
> resolution and fps rate as the surrounding content...)

Hmm... my Mits converts my OTA FOX signal to 1080i. I guess it's better
than rejecting it altogether...

That's what I get for buying too early. No 720p or 1080p support, no
DVI... of course, it is a good reason to upgrade!
January 18, 2005 8:49:29 AM


In article <1106003154.562482.108600@z14g2000cwz.googlegroups.com>,
paul-NOZPAM@paulkienitz.net says...
> 42 wrote:
>
> > Eventually all TVs will be 1080p (or better), and capable of
> > displaying both 1080i, and 720p (with simple resolution scaling)
> > eliminating the need to 'convert' between them.
>
> Yes, the best way for a TV to be able to display both resolutions well
> is to be better than either. If it got up to 2160p every scaling issue
> would be completely gone. And since cinema projectors will probably go
> to 2160p in the near future, this isn't so unlikely a scenario...
>
> > Further, I -hope- eventually TVs will be able to accept, and display
> > content at its source fps rate, so we won't be converting between 24,
> > 50 or 60 fps rates. The broadcaster broadcasts at what rate it was
> > recorded in, and the TV displays whats coming in at -that- rate.
>
> And why not, computer monitors have been capable of variable frame
> rates for years and years. I would hope that in five years any old
> cheap TV could display PAL and other 50 FPS formats.

Yeah... except it could have happened 5 years -ago-... not sure why TVs
are lagging here. And until it's widely adopted the broadcasters won't
support it.

> > (I suppose to prevent there being a 'jar' in the aesthetics of the
> > picture, commericals should be converted and encoded to match the
> > same resolution and fps rate as the surrounding content...)
>
> If the TV is based on liquid crystals or something similar, where the
> light output at any one pixel is continuous, the eye should not be able
> to detect a change of frame rate. If it's based on CRTs or DLP it
> might make a difference you could see.

Thus, unless you see the near-future extinction of CRT and DLP... :) 
January 18, 2005 9:14:48 AM


In article <G7idnVsT8NAUO3HcRVn-iw@comcast.com>, alpertl@xxcomcast.net
says...
> 42 wrote:
> > In article <1105935432.286827.94080@f14g2000cwb.googlegroups.com>,
> > paul- NOZPAM@paulkienitz.net says...
> >> Frank Provasek wrote:
> >
> >> ...Which is not the main issue I called bullshit on: that there is no
> >> such thing as a gaussian blur that "matches" motion blur. The width
> >> of a motion blur, in pixels, depends entirely on how fast the
> >> objects in the picture are moving. The artificial blur applied to
> >> pal/ntsc conversion is INDEPENDENT of motion blur!
> >
> > /shrug
> >
> > Eventually all TVs will be 1080p (or better), and capable of
> > displaying both 1080i, and 720p (with simple resolution scaling)
> > eliminating the need to 'convert' between them.
> >
> > Further, I -hope- eventually TVs will be able to accept, and display
> > content at its source fps rate, so we won't be converting between 24,
> > 50 or 60 fps rates. The broadcaster broadcasts at what rate it was
> > recorded in, and the TV displays whats coming in at -that- rate.
> >
> > (I suppose to prevent there being a 'jar' in the aesthetics of the
> > picture, commericals should be converted and encoded to match the same
> > resolution and fps rate as the surrounding content...)
>
> Hmm....my Mits converts my OTA FOX signal to 1080i. I guess it's better
> than rejecting it all together......

Definitely.

My point was largely that the issue of converting from 720p to 1080i and
back again is an ephemeral problem at most. Soon it will all go to 1080p
or better, and the issues really just go away. I don't really want
manufacturers to spend a lot of time refining the 720p->1080i (or back
again) conversion algorithms... just get on with delivering affordable
1080p sets and convert to that.

> That's what I get for buying too early. No 720p or 1080p support, no
> DVI........of course, it is a good reason to upgrade!

It's a computer now. It was obsolete before you plugged it in. :) 
Anonymous
January 19, 2005 1:32:09 AM


42 wrote:
> In article <G7idnVsT8NAUO3HcRVn-iw@comcast.com>, alpertl@xxcomcast.net
> says...
>
> It's a computer now. It was obsolete before you plugged it in. :) 

Ya know, if you think about it, it's almost as expensive as computer
equipment was when memory ran $100+ a meg using 9 single DRAM chips and
20 meg hard drives were huge in size and cost...

Oy, these tech hobbies are always expensive. I'm surprised the wife has put
up with me for so long... don't feed the baby, daddy needs a new video
card!!