Advice Please: Video Storage Amounts Per Gigabyte

Archived from groups: comp.sys.ibm.pc.hardware.video

Can someone point me in the direction of where I can get info
concerning how much uncompressed video can realistically be stored on a
73G hard drive at good and high resolutions?

Are there any charts that can give me a rough idea?

Thanks a lot.

Darren Harris
Staten Island, New York.
  1.

    <Searcher7@mail.con2.com> wrote in message
    news:1112726280.017229.231370@f14g2000cwb.googlegroups.com...
    > Can someone point me in the direction of where I can get info
    > concerning how much uncompressed video can realistically be stored on a
    > 73G hard drive at good and high resolutions?

    If you're insisting on uncompressed video, then the only questions to answer
    are how many pixels, the frame rate, and the number of bits of information
    provided for each pixel. For instance, 640 x 480 at 60 Hz, 24 bits/pixel
    (8 each R, G, and B), if you eliminate the blanking time from consideration,
    is a raw data rate of

    640 x 480 x 60 x 24 bits (3 bytes) = 55.3 Mbytes/sec

    So a 73 Gbyte drive would be able to hold at most

    73/0.0553 = 1320 seconds of video, or about 22 minutes' worth.
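
    If it helps to plug in other numbers, here's that arithmetic as a
    small Python sketch (my own illustration, assuming decimal units,
    i.e. 1 GB = 10^9 bytes, as in the figures above):

        # Uncompressed data rate, and how long a given drive lasts at it.
        def raw_rate_bytes_per_sec(width, height, fps, bits_per_pixel=24):
            return width * height * fps * bits_per_pixel / 8

        def drive_minutes(drive_gb, rate_bytes_per_sec):
            return drive_gb * 1e9 / rate_bytes_per_sec / 60

        rate = raw_rate_bytes_per_sec(640, 480, 60)
        print(round(rate / 1e6, 1))            # 55.3 (Mbytes/sec)
        print(round(drive_minutes(73, rate)))  # 22 (minutes)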

    But no one stores high-resolution video without SOME form of compression.
    Standard TV video is normally interlaced, which can be viewed as a
    crude form of compression giving a 2:1 reduction. Full 24-bit RGB
    also isn't a very efficient means of storing color video; instead, you
    store 8 bits of luminance for each pixel, and then subsample the color
    information, which can give you something like another 4:1 or so
    reduction. But the big savings comes from applying a halfway decent
    video compression scheme, as is done with HDTV. In broadcast HD,
    a 1920 x 1080, 60 Hz interlaced video stream winds up at under
    20 Mbits/second; figure out how long you can run THAT into a
    73 GB disc, and you get a fairly pleasing large number out as a
    result.
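
    (Working that out: 73 Gbytes is about 584,000 Mbits, and
    584,000 Mbits / 20 Mbits/sec = 29,200 seconds - a bit over eight
    hours of HD on the drive.)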

    Bob M.
  2.

    Bob Myers wrote:

    >
    > <Searcher7@mail.con2.com> wrote in message
    > news:1112726280.017229.231370@f14g2000cwb.googlegroups.com...
    >> Can someone point me in the direction of where I can get info
    >> concerning how much uncompressed video can realistically be stored on a
    >> 73G hard drive at good and high resolutions?
    >
    > If you're insisting on uncompressed video, then the only questions
    > to answer are how many pixels, the frame rate, and the number of
    > bits of information provided for each pixel. For instance, 640 x 480
    > at 60 Hz, 24 bits/pixel (8 each R, G, and B), if you eliminate the
    > blanking time from consideration, is a raw data rate of
    >
    > 640 x 480 x 60 x 24 bits (3 bytes) = 55.3 Mbytes/sec
    >
    > So a 73 Gbyte drive would be able to hold at most
    >
    > 73/0.0553 = 1320 seconds of video, or about 22 minutes' worth.
    >
    > But no one stores high-resolution video without SOME form of compression.
    > Standard TV video is normally interlaced, which can be viewed as a
    > crude form of compression giving a 2:1 reduction. Full 24-bit RGB
    > also isn't a very efficient means of storing color video; instead, you
    > store 8 bits of luminance for each pixel, and then subsample the color
    > information, which can give you something like another 4:1 or so
    > reduction. But the big savings comes from applying a halfway decent
    > video compression scheme, as is done with HDTV. In broadcast HD,
    > a 1920 x 1080, 60 Hz interlaced video stream winds up at under
    > 20 Mbits/second; figure out how long you can run THAT into a
    > 73 GB disc, and you get a fairly pleasing large number out as a
    > result.

    The downside there is that you're dealing with lossy compression, which
    means that you run into generation loss.

    > Bob M.

    --
    --John
    to email, dial "usenet" and validate
    (was jclarke at eye bee em dot net)
  3.

    "J. Clarke" <jclarke.usenet@snet.net.invalid> wrote in message
    news:d2van62eum@news1.newsguy.com...
    > > In broadcast HD,
    > > a 1920 x 1080, 60 Hz interlaced video stream winds up at under
    > > 20 Mbits/second; figure out how long you can run THAT into a
    > > 73 GB disc, and you get a fairly pleasing large number out as a
    > > result.
    >
    > The downside there is that you're dealing with lossy compression, which
    > means that you run into generation loss.

    Well, yes, but only if the video is decompressed and then
    recompressed on subsequent generations. If you just want
    to preserve the original content, you'd never decompress
    except to send it to a display. You'd just copy off the
    original compressed stream.


    Bob M.
  4.

    Bob Myers wrote:
    > "J. Clarke" <jclarke.usenet@snet.net.invalid> wrote in message
    > news:d2van62eum@news1.newsguy.com...
    > > > In broadcast HD,
    > > > a 1920 x 1080, 60 Hz interlaced video stream winds up at under
    > > > 20 Mbits/second; figure out how long you can run THAT into a
    > > > 73 GB disc, and you get a fairly pleasing large number out as a
    > > > result.
    > >
    > > The downside there is that you're dealing with lossy compression,
    > > which means that you run into generation loss.
    >
    > Well, yes, but only if the video is decompressed and then
    > recompressed on subsequent generations. If you just want
    > to preserve the original content, you'd never decompress
    > except to send it to a display. You'd just copy off the
    > original compressed stream.
    >
    >
    > Bob M.

    Thanks a lot.

    Of course there is the initial loss of detail, since there is no such
    thing as lossless compression (for normal images/videos). :-)

    22 minutes is a good reference point for me to figure out a few things,
    but just in case, do you have any idea of the maximum number of
    minutes above and beyond that 22 minutes that I can fit on the drive
    *with compression*, allowing for no visually perceptible loss in
    detail (transferring directly to the display, of course), and what
    specific form of compression that would be?

    Thanks a lot.

    Darren Harris
    Staten Island, New York.
  5.

    Bob Myers wrote:
    > <Searcher7@mail.con2.com> wrote in message
    > news:1112726280.017229.231370@f14g2000cwb.googlegroups.com...
    > > Can someone point me in the direction of where I can get info
    > > concerning how much uncompressed video can realistically be stored
    > > on a 73G hard drive at good and high resolutions?
    >
    > If you're insisting on uncompressed video, then the only questions
    > to answer are how many pixels, the frame rate, and the number of
    > bits of information provided for each pixel. For instance, 640 x 480
    > at 60 Hz, 24 bits/pixel (8 each R, G, and B), if you eliminate the
    > blanking time from consideration, is a raw data rate of
    >
    > 640 x 480 x 60 x 24 bits (3 bytes) = 55.3 Mbytes/sec
    >
    > So a 73 Gbyte drive would be able to hold at most
    >
    > 73/0.0553 = 1320 seconds of video, or about 22 minutes' worth.
    >
    > But no one stores high-resolution video without SOME form of
    > compression. Standard TV video is normally interlaced, which can be
    > viewed as a crude form of compression giving a 2:1 reduction. Full
    > 24-bit RGB also isn't a very efficient means of storing color video;
    > instead, you store 8 bits of luminance for each pixel, and then
    > subsample the color information, which can give you something like
    > another 4:1 or so reduction. But the big savings comes from applying
    > a halfway decent video compression scheme, as is done with HDTV. In
    > broadcast HD, a 1920 x 1080, 60 Hz interlaced video stream winds up
    > at under 20 Mbits/second; figure out how long you can run THAT into
    > a 73 GB disc, and you get a fairly pleasing large number out as a
    > result.
    >
    > Bob M.


    BTW. What is "blanking time"?

    The idea is to run this full screen on a 19 inch monitor (at the
    minimum 24fps), so how do I determine the number of pixels?

    And speaking of pixels, besides luminance (whatever that is) and
    color (R,G,B), are there any other characteristics to consider for
    each pixel?

    And if I were to use some form of compression, what is the best I can
    achieve without having to experience any noticeable drop in image
    quality? (Perhaps there is a standard "minutes per gigabyte" chart for
    something like this?)

    Thanks a lot.

    Darren Harris
    Staten Island, New York.
  6.

    <Searcher7@mail.con2.com> wrote in message
    news:1115254257.574193.192560@g14g2000cwa.googlegroups.com...
    > BTW. What is "blanking time"?

    Just about all displays, and especially CRTs, require some "dead time"
    (i.e., periods during the video signal when no active image information
    is being transmitted) to let them take care of various necessary chores.
    In the CRT, for instance, it's the need for "retrace" time - the time
    required for the beam to be returned from right to left, or bottom to
    top, to be ready to begin scanning the next line or frame. The requirement
    for blanking time as a percentage of the overall horizontal or vertical
    periods generally goes up as the scan rate in that direction goes up -
    for high horizontal rates, blanking times between 25 and 30% of the
    horizontal total time are typical. Even non-CRT monitors generally
    require SOME "blanking" (non-active) time, although typically far
    less (amounting to perhaps 5-10% of the total time).
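
    A concrete example, using the classic VGA timing: the 640 x 480
    active pixels sit inside a total raster of 800 x 525, so the pixel
    clock is 800 x 525 x 60 = ~25.2 MHz even though only
    640 x 480 x 60 = ~18.4 Mpixels/sec carry image data - roughly 27%
    of the signal time is blanking.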


    > The idea is to run this full screen on a 19 inch monitor (at the
    > minimum 24fps), so how do I determine the number of pixels?

    If you're talking about HDTV or other digital TV systems, the number
    of pixels in the video signal is fixed by the appropriate standard.
    For instance, standard (U.S.) HDTV formats are 1280 x 720 and
    1920 x 1080 pixels. For analog (standard broadcast at present, for
    example), there is really no fixed number of pixels per scan line - analog
    video doesn't know anything about "pixels" - just a fixed number of
    active lines per frame or field. But if you are storing progressive-scan
    (non-interlaced) SDTV, 640 x 480 pixels is a good starting assumption.

    HOWEVER - this applies to the storage requirements question only.
    To run this to a "19 inch monitor," you will likely need to convert the
    video to a different format/timing at playback time. Without knowing
    more about that display, I can't say anything regarding what THAT
    video stream is going to look like.

    > And speaking of pixels, besides luminance (whatever that is) and
    > color (R,G,B), are there any other characteristics to consider for
    > each pixel?

    I'm not sure what you're asking here. Certainly the RGB amplitude
    information is all there is, basically, in a standard color video signal,
    although for TV it isn't really represented as separate R, G, and B
    signals.

    > And if I were to use some form of compression, what is the best I can
    > achieve without having to experience any noticeable drop in image
    > quality? (Perhaps there is a standard "minutes per gigabyte" chart for
    > something like this?)

    It's impossible to say without knowing what you would consider an
    acceptable level of image quality. As one data point, though, broadcast
    HDTV typically operates at a compression ratio (bits in the transmitted
    signal vs. bits in the original uncompressed video stream) of 50:1 or
    even higher. For instance, a 1280 x 720, 24-bit RGB, 60 FPS video
    signal has a bit rate of at least

    1280 x 720 x 24 x 60 = 1.327 Gbit/sec

    and yet this goes out over the air at absolutely no more than about 20
    Mbit/sec (the maximum permissible under this system in a 6 MHz
    channel), for a compression ratio of about 66:1. (It's not quite this
    simple, but this will do for an example.) So if you consider broadcast
    HDTV to be of acceptable quality, and can use a comparable compression
    scheme, you can work out your storage requirements (roughly) by
    figuring your initial uncompressed data rate and a compression ratio in
    this ballpark.
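
    The same ratio check, as a couple of lines of Python (using the
    ~20 Mbit/sec over-the-air cap quoted above):

        uncompressed = 1280 * 720 * 24 * 60     # bits/sec, ~1.327e9
        broadcast = 20e6                        # bits/sec over the air
        print(round(uncompressed / broadcast))  # 66, i.e. about 66:1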

    Bob M.
  7.

    Bob Myers wrote:
    > <Searcher7@mail.con2.com> wrote in message
    > news:1115254257.574193.192560@g14g2000cwa.googlegroups.com...
    > > BTW. What is "blanking time"?
    >
    > Just about all displays, and especially CRTs, require some "dead time"
    > (i.e., periods during the video signal when no active image
    > information is being transmitted) to let them take care of various
    > necessary chores. In the CRT, for instance, it's the need for
    > "retrace" time - the time required for the beam to be returned from
    > right to left, or bottom to top, to be ready to begin scanning the
    > next line or frame. The requirement for blanking time as a percentage
    > of the overall horizontal or vertical periods generally goes up as
    > the scan rate in that direction goes up - for high horizontal rates,
    > blanking times between 25 and 30% of the horizontal total time are
    > typical. Even non-CRT monitors generally require SOME "blanking"
    > (non-active) time, although typically far less (amounting to perhaps
    > 5-10% of the total time).
    >
    > > The idea is to run this full screen on a 19 inch monitor (at the
    > > minimum 24fps), so how do I determine the number of pixels?
    >
    > If you're talking about HDTV or other digital TV systems, the number
    > of pixels in the video signal is fixed by the appropriate standard.
    > For instance, standard (U.S.) HDTV formats are 1280 x 720 and
    > 1920 x 1080 pixels. For analog (standard broadcast at present, for
    > example), there is really no fixed number of pixels per scan line -
    > analog video doesn't know anything about "pixels" - just a fixed
    > number of active lines per frame or field. But if you are storing
    > progressive-scan (non-interlaced) SDTV, 640 x 480 pixels is a good
    > starting assumption.
    >
    > HOWEVER - this applies to the storage requirements question only.
    > To run this to a "19 inch monitor," you will likely need to convert
    > the video to a different format/timing at playback time. Without
    > knowing more about that display, I can't say anything regarding what
    > THAT video stream is going to look like.
    >
    > > And speaking of pixels, besides luminance (whatever that is) and
    > > color (R,G,B), are there any other characteristics to consider for
    > > each pixel?
    >
    > I'm not sure what you're asking here. Certainly the RGB amplitude
    > information is all there is, basically, in a standard color video
    > signal, although for TV it isn't really represented as separate R, G,
    > and B signals.

    Ok. What I was asking is how many bits would be required to allow
    one pixel to store all possibilities of color. You said that RGB
    amplitude info is all there is. So I'm just wondering how many colors
    are possible. (Well, actually I should probably find out how many
    colors are necessary first. That is, how many different colors can be
    delineated by the human eye.)

    > > And if I were to use some form of compression, what is the best I
    > > can achieve without having to experience any noticeable drop in
    > > image quality? (Perhaps there is a standard "minutes per gigabyte"
    > > chart for something like this?)
    >
    > It's impossible to say without knowing what you would consider an
    > acceptable level of image quality. As one data point, though,
    > broadcast HDTV typically operates at a compression ratio (bits in
    > the transmitted signal vs. bits in the original uncompressed video
    > stream) of 50:1 or even higher. For instance, a 1280 x 720, 24-bit
    > RGB, 60 FPS video signal has a bit rate of at least
    >
    > 1280 x 720 x 24 x 60 = 1.327 Gbit/sec
    >
    > and yet this goes out over the air at absolutely no more than about
    > 20 Mbit/sec (the maximum permissible under this system in a 6 MHz
    > channel), for a compression ratio of about 66:1. (It's not quite
    > this simple, but this will do for an example.) So if you consider
    > broadcast HDTV to be of acceptable quality, and can use a comparable
    > compression scheme, you can work out your storage requirements
    > (roughly) by figuring your initial uncompressed data rate and a
    > compression ratio in this ballpark.

    66:1. It sounds as though that means that I am to multiply that 22
    minutes of video by 66, which gives me 1452 minutes (24.2 hours) of
    compressed video on a 73G drive. (What am I doing wrong?)

    Perhaps I should have just asked how much "movie quality video" can fit
    on a 73G drive, while exhibiting no noticeable degradation (as a result
    of whatever compression scheme is used) when shown on a 19-21 inch
    display.

    Thanks.

    Darren Harris
    Staten Island, New York.
  8.

    <Searcher7@mail.con2.com> wrote in message
    news:1115653884.700767.207010@z14g2000cwz.googlegroups.com...
    > Ok. What I was asking is how many bits would be required to allow
    > one pixel to store all possibilities of color. You said that RGB
    > amplitude info is all there is. So I'm just wondering how many colors
    > are possible. (Well, actually I should probably find out how many
    > colors are necessary first. That is, how many different colors can be
    > delineated by the human eye.)

    That turns out to be a much bigger question than you may
    realize. How human vision, and especially color, actually
    work is a subject that can (and has, many times) fill a book.
    Even a simplified, cursory examination of the subject is a
    fair-sized chapter for a book. (One example of such is a
    chapter in my own book, "Display Interfaces: Fundamentals
    and Standards," if you don't mind the quick plug. Actually,
    I wouldn't recommend that anyone buy the book just to get
    the answer to this question - it's not worth that. But you
    may be able to find a copy on the shelves of your local
    college's engineering library. Both the copies that actually
    sold had to wind up SOMEWHERE...)

    But without going too far into color theory, let me try to
    give you a few answers here. The last question you asked,
    "how many colors can be delineated by the human eye," is
    a complex one to answer, and pretty much everything
    I'd wind up saying in a short-form answer would end with
    "but it's really not this simple." Let's just say that the eye
    can discern millions of different shades - possibly into the
    low tens of millions, at least - and leave it at that for now.
    So I'm just going to throw out several hopefully relevant
    comments. If anyone wants to go into further detail on any
    of them, we certainly can do that later.

    1. It is impossible for any practical electronic imaging device,
    whether it's a display or a printer, to cover the entire
    range of colors that the eye can see. So we're really not
    even going to worry about that basic question. The
    question is really how many bits of information you need to
    be able to describe an image (or rather, to allocate to each
    pixel of an image) in order to make that image sufficiently
    realistic and without artifacts such that the eye will accept
    it as "realistic". Another word for this might be "photographic"
    - we generally accept quality color photos as "looking real,"
    so another relevant question is "how many bits per pixel
    do I need to make an image look as real as a photograph?"

    2. The short form of the answer to the question above is
    somewhere around 8-10 bits each of the primaries red,
    green, and blue, for electronic displays. That answer actually
    is one of those that deserves an "it isn't really that simple"
    after it, because for one thing it assumes that all three colors
    are equally important (and they're not), and for another
    people thinking in terms of "bits per color" will usually make
    the assumption that the values use linear encoding, which
    generally isn't the best choice here. But we'll just note that
    "24-bit color" (RGB at 8 each) is generally assumed to
    be "photorealistic" for most casual display work; 10 bits
    each will satisfy most of what's left, and 12 bits per primary
    will be overkill for all but the most demanding work.
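
    (For scale: 8 bits each of R, G, and B gives 2^24 = about 16.7
    million distinct code values; at 10 bits each that's 2^30 = about
    1.07 billion.)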

    3. Since the original question really had nothing to do with
    representing "all the colors we could see" but instead had
    to do with storing a digital representation of standard video,
    it's probably more important to ask "just how much color
    information, in terms of bits/pixel, is actually available in the
    best video signal we're ever going to see?" The above
    answer turns out to apply pretty well here, too. RGB stored
    at 8-10 bits each will generally suffice for the storage of
    pretty high quality video (and in fact a lot of digital video
    starts out as a 24 bit/pixel RGB representation). However,
    video generally is NOT stored this way, but instead takes
    advantage of another quirk of human vision to permit essentially
    the same perceived quality without storing so many bits.
    Human eyes are much better at discerning differences in
    "brightness" (luminance) over a small distance than they are
    at discerning differences in colors (of the same perceived
    "brightness") over that same distance. In more technical terms,
    our spatial acuity is better for luminance changes than for
    "chrominance" (color) changes. Most digital video systems
    take advantage of this by storing image information not in
    RGB form, but as separate luminance and color information,
    then storing fewer samples (pixels) of the color information
    than of the luminance. In other words, they essentially convert
    the color image into a luminance-only ("black and white") image
    of the same "resolution" (number of pixels), and then add to
    that samples of the "chrominance" information which are
    effectively shared by a number of adjacent pixels. One popular
    digital storage standard encoding, for example, is to store one
    sample of color information (these are the signals you will
    see referred to as "U" and "V," or in digital terms "Cb" and
    "Cr") for every FOUR samples (or pixels) of luminance (Y)
    information. If you started out with 24-bit RGB, and used
    this sort of encoding for your digital video storage (still at eight
    bits each for Y, Cb, and Cr), you'd wind up with

    32 bits of Y,
    8 bits of Cb, and
    8 bits of Cr (for a total of 48 bits)

    for every four pixels in the original image. That's a 50%
    savings in storage space (or transmission "bandwidth") over
    the original 24 bits/pixel version, with very little loss of image
    quality for the typical "video" sorts of images.
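
    A quick sketch of that bookkeeping for a group of four pixels at
    eight bits per sample (the one-chroma-sample-per-four-pixels scheme
    described above):

        pixels = 4
        rgb_bits = pixels * 24             # 96 bits as 24-bit RGB
        ycbcr_bits = pixels * 8 + 8 + 8    # 32 Y + 8 Cb + 8 Cr = 48 bits
        print(ycbcr_bits / rgb_bits)       # 0.5, i.e. a 50% savings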

    > 66:1. It sounds as though that means that I am to multiply that 22
    > minutes of video by 66, which gives me 1452 minutes (24.2 hours) of
    > compressed video on a 73G drive. (What am I doing wrong?)

    Nothing. If you could fit 22 minutes of uncompressed video in
    whatever your original format was onto said 73 gig drive, then yes,
    you could fit over 24 hours onto that same drive, using the exact same
    sort of compression (and compression ratio) as is used in the example
    shown for broadcast HDTV. This is exactly what makes the download
    of full-length movies from web sites possible without having to have
    God's own REALLY-high-speed network connection. Remember,
    though, that we have been making approximations all along here,
    so that "24.2 hours" should not be seen as a precise or guaranteed
    figure, but rather a ballpark estimate of what you might get given
    your original raw data calculation.
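
    (As a rough check: 55.3 Mbytes/sec divided by 66 is about 0.84
    Mbytes/sec, and 73 Gbytes at that rate is about 87,000 seconds -
    24.2 hours, matching your figure.)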


    > Perhaps I should have just asked how much "movie quality video" can fit
    > on a 73G drive, while exhibiting no noticeable degradation (as a result
    > of whatever compression scheme is used) when shown on a 19-21 inch
    > display.

    If you consider HDTV to be "movie quality video," and given that
    broadcast HDTV absolutely CANNOT require more than about
    19.2 Mbits/sec (roughly 2.4 Mbytes/sec), then we get:

    (73 Gbytes)/(2.4 Mbytes/sec) = about 30,400 seconds, or roughly
    8.4 hours of video, just storing what comes over the air exactly
    as it's transmitted.

    But that's HD; DVD-quality video requires considerably less
    space, due to the much smaller number of pixels per frame, and to
    match the quality of what you typically get over-the-air with
    standard analog TV (roughly equal to perhaps 450 x 340 pixels per
    frame, rather than the 704 x 480 of DVDs) the requirements would
    be even lower. So yes, tens of hours of video on a 73-gig drive is
    not unrealistic.
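
    Here's that "hours per bitrate" arithmetic as a short Python sketch.
    The 19.2 Mbit/sec figure is the broadcast-HD cap discussed above;
    the 5 Mbit/sec DVD figure is an assumed typical average rate, not a
    limit from any standard:

        # Hours of compressed video on a drive at a given average bitrate.
        def hours_on_drive(drive_gb, mbit_per_sec):
            mbits = drive_gb * 8e3             # Gbytes -> Mbits
            return mbits / mbit_per_sec / 3600

        print(round(hours_on_drive(73, 19.2), 1))  # 8.4 hours of HD
        print(round(hours_on_drive(73, 5.0), 1))   # 32.4 hours, DVD-ish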

    As a further example - consider how much space is available on
    a standard DVD, then realize that yes, they really DO store
    full-length movies on those! :-)

    Bob M.
  9.

    <Searcher7@mail.con2.com> wrote in message
    news:1115763750.102631.117550@f14g2000cwb.googlegroups.com...

    > So basically it makes no difference how many colors we can see, because
    > the display will be the weak link as far as color resolution goes, correct?

    Correct; no possible realizable display can produce the full
    range of colors that can be seen by human eyes. You also need
    to be concerned, though, with whether or not a given display
    provides sufficient control over its primaries (i.e., how well
    you can set the levels of red, green, and blue light being
    produced) so that you don't wind up with visible artifacts
    (i.e., "banding") in the image. This is where the bits/color
    issue comes in.

    > Do the same numbers correlate to video as well?

    Yes; I'm not distinguishing the source of the images (i.e.,
    computer-generated, camera-generated video, etc.) in
    this discussion.

    > Assuming we are still talking about video, I assume that no one would
    > be able to visually tell the difference between 10-bit color and 12-bit
    > color, correct?

    For the majority of applications and the majority of viewers
    this is correct (assuming that by "10-bit color" you mean 10
    bits per primary, and so forth).

    >
    > Ok. But I assume that there will be differences between standard video,
    > broadcast quality video, and high definition video, correct?

    Well, there's certainly a difference between the current analog
    broadcast standards and what is usually meant by "high definition."
    I'm not sure what you mean by "standard video" if not the sort
    that most analog TV stations put out today.

    > I assume that means no added compression after it is received onto
    > one's hard drive.

    Correct.


    >
    > I don't know if DVD videos look better on a PC monitor or a TV screen,
    > but since I know *all* of the screen real estate of a TV will be used,
    > for one hour of the best quality DVD video, how much hard drive space
    > would be needed? (I guess I can do the math from there, and try to
    > figure out what it would be for HD quality.)

    The amount of "screen real estate" used is irrelevant; either
    a TV screen or a PC monitor will still receive all of the video
    information, regardless of how it is displayed (TVs typically
    overscan, meaning that about 5% of the image is actually lost
    "behind the bezel"). The only important concerns in terms of
    the amount of information you need to store are (1) the number
    of pixels per frame, (2) the number of frames per second, and
    (3) the number of bits needed to be stored for each pixel. Item
    #3 can be affected by the color encoding used (e.g., RGB vs.
    YUV, plus the various color subsampling methods mentioned
    earlier), and any compression which might be applied.

    As to whether the video will look better on a PC monitor or
    TV screen - the two may (and often DO) look different, but
    "better" is a judgement call depending on just what quality
    factors are important to you. If we're talking about CRT displays,
    PC monitors generally provide better resolution (in the proper
    sense of the word, NOT just how many pixels are supposed to
    be provided in the image) but may not provide the correct color
    characteristics (primary colors and white point). TV displays,
    esp. CRT types, also tend to be brighter than PC monitors.


    Bob M.
  10.

    Thanks to Google screwing up, this is my third time writing and
    attempting to post this.

    > > Ok. But I assume that there will be differences between standard
    > > video, broadcast quality video, and high definition video, correct?
    >
    > Well, there's certainly a difference between the current analog
    > broadcast standards and what is usually meant by "high definition."
    > I'm not sure what you mean by "standard video" if not the sort
    > that most analog TV stations put out today.

    Oops! I meant analog, DVD, and HD.

    > > I don't know if DVD videos look better on a PC monitor or a TV
    > > screen, but since I know *all* of the screen real estate of a TV
    > > will be used, for one hour of the best quality DVD video, how much
    > > hard drive space would be needed? (I guess I can do the math from
    > > there, and try to figure out what it would be for HD quality.)
    >
    > The amount of "screen real estate" used is irrelevant; either
    > a TV screen or a PC monitor will still receive all of the video
    > information, regardless of how it is displayed (TVs typically
    > overscan, meaning that about 5% of the image is actually lost
    > "behind the bezel"). The only important concerns in terms of
    > the amount of information you need to store are (1) the number
    > of pixels per frame, (2) the number of frames per second, and
    > (3) the number of bits needed to be stored for each pixel. Item
    > #3 can be affected by the color encoding used (e.g., RGB vs.
    > YUV, plus the various color subsampling methods mentioned
    > earlier), and any compression which might be applied.

    Well, as far as "screen real estate", what I meant was 19" vs. 21" vs.
    27", etc.

    I assumed that more data would have to be involved in displaying the
    same video and quality on a 27" as opposed to a 19". So therefore, more
    hard drive space would be used for the same amount of minutes.

    > As to whether the video will look better on a PC monitor or
    > TV screen - the two may (and often DO) look different, but
    > "better" is a judgement call depending on just what quality
    > factors are important to you. If we're talking about CRT displays,
    > PC monitors generally provide better resolution (in the proper
    > sense of the word, NOT just how many pixels are supposed to
    > be provided in the image) but may not provide the correct color
    > characteristics (primary colors and white point). TV displays,
    > esp. CRT types, also tend to be brighter than PC monitors.

    I guess that is up for experimentation. But the goal is to achieve the
    most realistic picture, as far as on which display the video would look
    more "movie-like".

    Thanks.

    Darren Harris
    Staten Island, New York.
  11.

    <Searcher7@mail.con2.com> wrote in message
    news:1115937087.085734.278660@g43g2000cwa.googlegroups.com...

    > >
    > > The amount of "screen real estate" used is irrelevant; either
    > > a TV screen or a PC monitor will still receive all of the video
    > > information, regardless of how it is displayed (TVs typically
    > > overscan, meaning that about 5% of the image is actually lost
    > > "behind the bezel"). The only important concerns in terms of
    > > the amount of information you need to store are (1) the number
    > > of pixels per frame, (2) the number of frames per second, and
    > > (3) the number of bits needed to be stored for each pixel. Item
    > > #3 can be affected by the color encoding used (e.g., RGB vs.
    > > YUV, plus the various color subsampling methods mentioned
    > > earlier), and any compression which might be applied.
    >
    > Well, as far as "screen real estate", what I meant was 19" vs. 21" vs.
    > 27", etc.

    Still not relevant to the question of storage. One thing about
    video is that it's very, very standardized. The broadcast
    standard (analog) and current standard-definition DVDs both
    use about 480 lines per frame, and that doesn't change whether
    you view the result on a 19" monitor or a 55" projection TV -
    the lines you have are all you get to play with, period. The
    number of "pixels" per line (in analog terms, the bandwidth
    of the video signal) varies a bit between over-the-air broadcast
    and DVD, but not as much as you might think. About the
    worst case you'll ever run into, storage-wise, in "standard-definition"
    video is about 720 pixels per line x 480 lines (576 lines for the
    European broadcast standards), and that's that. Even the 640
    x 480 standard will provide a good deal more resolution (in the
    proper sense of the word - i.e., how much detail can be actually
    resolved per unit distance on the screen) than anything you'll
    get over the air. So use 720 x 480 or 640 x 480 as the
    starting point for your storage calculations, and don't worry
    about the screen size.
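
    For starting-point numbers, the uncompressed rates work out roughly
    as follows (24-bit color assumed; U.S. broadcast material runs at
    about 30 frames/sec):

        for width, height in [(720, 480), (640, 480)]:
            rate = width * height * 30 * 3     # bytes/sec at 30 fps
            print(width, height, round(rate / 1e6, 1), "Mbytes/sec")
        # prints: 720 480 31.1 Mbytes/sec
        #         640 480 27.6 Mbytes/sec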

    > I assumed that more data would have to be involved in displaying the
    > same video and quality on a 27" as opposed to a 19". So therefore, more
    > hard drive space would be used for the same amount of minutes.

    IF you could arbitrarily pick the pixel format and actually
    achieve the same resolution (in pixels per inch) on both displays,
    that would be the case, yes. But you can't do that and stick
    with anything resembling standard video.


    > I guess that is up for experimentation. But the goal is to achieve the
    > most realistic picture, as far as on which display the video would look
    > more "movie-like".

    Now you're asking about perceived quality, and that depends
    a whole lot on what the viewer in question considers important
    to getting a "movie-like" experience. For me, it would be the
    biggest screen possible, set to the proper (per TV standards)
    6500K white, and with enough brightness and contrast for the
    viewing environment such that the screen doesn't look faded and
    washed-out. Given that (well, OK, add progressive scan to the
    mix), even a 480-line source looks pretty good. True HDTV
    (1280 x 720 or 1920 x 1080) would look even better, of course,
    but on smaller screens (under about 40" diagonal, let's say) and
    at typical TV viewing distances, you'll be really hard pressed to
    see a whole lot of difference over properly-presented 480-line
    material. And HD takes a WHOLE lot more storage, which is
    one of the reasons that HD-DVDs are just now starting to turn
    up.

    Bob M.
  12.

    Bob Myers wrote:
    > <Searcher7@mail.con2.com> wrote in message
    > news:1115937087.085734.278660@g43g2000cwa.googlegroups.com...
    >
    > > >
    > > > The amount of "screen real estate" used is irrelevant; either
    > > > a TV screen or a PC monitor will still receive all of the video
    > > > information, regardless of how it is displayed (TVs typically
    > > > overscan, meaning that about 5% of the image is actually lost
    > > > "behind the bezel"). The only important concerns in terms of
    > > > the amount of information you need to store are (1) the number
    > > > of pixels per frame, (2) the number of frames per second, and
    > > > (3) the number of bits needed to be stored for each pixel. Item
    > > > #3 can be affected by the color encoding used (e.g., RGB vs.
    > > > YUV, plus the various color subsampling methods mentioned
    > > > earlier), and any compression which might be applied.
    > >
    > > Well, as far as "screen real estate", what I meant was 19" vs. 21" vs.
    > > 27", etc.
    >
    > Still not relevant to the question of storage. One thing about
    > video is that it's very, very standardized. The broadcast
    > standard (analog) and current standard-definition DVDs both
    > use about 480 lines per frame, and that doesn't change whether
    > you view the result on a 19" monitor or a 55" projection TV -
    > the lines you have are all you get to play with, period. The
    > number of "pixels" per line (in analog terms, the bandwidth
    > of the video signal) varies a bit between over-the-air broadcast
    > and DVD, but not as much as you might think. About the
    > worst case you'll ever run into, storage-wise, in "standard-definition"
    > video is about 720 pixels per line x 480 lines (576 lines for the
    > European broadcast standards), and that's that. Even the 640
    > x 480 standard will provide a good deal more resolution (in the
    > proper sense of the word - i.e., how much detail can be actually
    > resolved per unit distance on the screen) than anything you'll
    > get over the air. So use 720 x 480 or 640 x 480 as the
    > starting point for your storage calculations, and don't worry
    > about the screen size.

    So from what you are saying, the picture will look exactly the same on
    a 27" as it would on a 19" monitor.

    > > I assumed that more data would have to be involved in displaying the
    > > same video and quality on a 27" as opposed to a 19". So therefore,
    > > more hard drive space would be used for the same amount of minutes.
    >
    > IF you could arbitrarily pick the pixel format and actually
    > achieve the same resolution (in pixels per inch) on both displays,
    > that would be the case, yes. But you can't do that and stick
    > with anything resembling standard video.

    I assume that you are referring to quality and not format.

    > > I guess that is up for experimentation. But the goal is to achieve the
    > > most realistic picture, as far as on which display the video would look
    > > more "movie-like".
    >
    > Now you're asking about perceived quality, and that depends
    > a whole lot on what the viewer in question considers important
    > to getting a "movie-like" experience. For me, it would be the
    > biggest screen possible, set to the proper (per TV standards)
    > 6500K white, and with enough brightness and contrast for the
    > viewing environment such that the screen doesn't look faded and
    > washed-out. Given that (well, OK, add progressive scan to the
    > mix), even a 480-line source looks pretty good. True HDTV
    > (1280 x 720 or 1920 x 1080) would look even better, of course,
    > but on smaller screens (under about 40" diagonal, let's say) and
    > at typical TV viewing distances, you'll be really hard pressed to
    > see a whole lot of difference over properly-presented 480-line
    > material. And HD takes a WHOLE lot more storage, which is
    > one of the reasons that HD-DVDs are just now starting to turn
    > up.

    Okay, so what I have to do is determine that point at which there is no
    significant difference in picture quality as a result of increasing the
    data to work with.

    So since there is no big difference in picture quality between HD and
    480-line material (all factors taken into consideration), for storage
    space reasons it would be best to go with 480-line material. I just
    need to figure out how to convert that into numbers that I can work
    with in the equation: (480) x 60 x 24 bits = ?

    In fact, instead of 60fps, shouldn't the frame rate be 24fps? What
    would be the use in going higher?
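
    A rough sketch of that arithmetic, filling in the missing
    pixels-per-line term with the 720 suggested earlier and assuming
    24 bits/pixel:

        raw = 720 * 480 * 24 * 3                # bytes/sec at 24 fps, ~24.9 Mbytes/sec
        print(round(73e9 / raw / 60))           # ~49 min uncompressed
        print(round(73e9 / (raw / 66) / 3600))  # ~54 hours at a 66:1 ratio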

    Thanks.

    Darren Harris
    Staten Island, New York.