Radeon 9800 Pro, XP 2400+, 266MHz, 1GB PC2100 RAM: is this a good setup?

Archived from groups: alt.comp.hardware.overclocking

Radeon 9800 Pro, XP 2400+, 266MHz FSB, 1GB of PC2100 RAM: is this a good setup?

Will the 9800 Pro be able to use all its power, or should I just stick with
the 9600 Pro? Is it worth the upgrade to the 9800 Pro?

Thanks.
  1.

    "We Live for the One we Die for the One" <Mr fred@yahoo.com.au> wrote in
    message news:mo00b05dpfopceijfnleuc7rn7opoeiubi@4ax.com...

    " Radeon 9800 Pro, Xp 2400,266mhz,1 gig pc2100 ram, this a good setup ?
    Will the 9800 pro use all its power, or should i just stick to the 9600 Pro
    worth the upgrade to the 9800 Pro ? "


    It all depends what games you play. Also, depending on the rest of your
    system, upgrades may be better utilised elsewhere. What motherboard do you
    have?
  2.

    "Cuzman" wrote
    What motherboard do you have?


    I think he has an ageing ASUS A7A266.

    We Live for the One we Die for the One: The price difference between the
    9600XT and the 9800Pro is now very small.
    --
    Wayne ][
  3.

    Eventually we'll reach the point where the rest of the computer is thrown
    in as a freebie when one buys a display adapter!

    --
    Phil Weldon, pweldonatmindjumpdotcom
    For communication,
    replace "at" with the 'at sign'
    replace "mindjump" with "mindspring."
    replace "dot" with "."


    "Wayne Youngman" <waynes.spamtrap@tiscali.co.uk> wrote in message
    news:40b13664$1_1@mk-nntp-2.news.uk.tiscali.com...
    >
    > "Cuzman" wrote
    > What motherboard do you have?
    >
    >
    > I think he has an ageing ASUS A7A266.
    >
    > We Live for the One we Die for the One: The price difference between the
    > 9600XT and the 9800Pro is now very small.
    > --
    > Wayne ][
    >
    >
  4.

    "Phil Weldon" wrote
    > Eventually we'll reach the point the rest of a computer is thrown in as a
    > freebie when one buys a display adapter!

    Indeed!
    I feel a bit sorry for this kid; he has been *theorising* over an upgrade
    for about 9 months, but you can never get ahead of the *Joneses*, as new
    things are always creeping out.

    If the O.P. has a 9600XT, XP2400+, 1GB of PC2100 and an ASUS A7A266, I
    think that's a nice enough system. I built something similar for my
    brother in February, except I used an ABIT AN7 + 512MB of Crucial PC3200,
    so instead of running the XP2400+ at 15x133 it runs at 11x200; of course
    the extra FSB/memory MHz really helps.
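
    (As a quick check of that arithmetic: the core clock is just FSB times
    multiplier. A minimal Python sketch; the clock figures are the ones quoted
    above, the script itself is purely illustrative:)

    # Athlon XP core clock = FSB clock x multiplier; DDR doubles the
    # effective FSB/memory rate.
    for fsb, mult in ((133, 15), (200, 11)):
        print(f"{mult}x{fsb} -> {fsb * mult} MHz core, DDR{2 * fsb} memory")
    # 15x133 -> 1995 MHz core, DDR266 memory
    # 11x200 -> 2200 MHz core, DDR400 memory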

    From my limited recent game-playing experience, I find it hard to tell the
    difference between the 9800 and the 9600XT at, say, 1024x768. Of course I
    can see the difference in benchmarks!
    --
    Wayne ][
  5.

    Yeah, and who can actually SEE 500 frames per second, or even 200 frames
    per second for that matter? As if any monitor could DISPLAY 200 frames per
    second!

    --
    Phil Weldon


    "Wayne Youngman" <waynes.spamtrap@tiscali.co.uk> wrote in message
    news:40b1da87$1_1@mk-nntp-2.news.uk.tiscali.com...
    >
    > "Phil Weldon" wrote
    > > Eventually we'll reach the point the rest of a computer is thrown in as
    a
    > > freebie when one buys a display adapter!
    >
    > Indeed!
    > I feel a bit sorry for this kid, he has been *theorising* over an upgrade
    > for about 9 months, but you can never get ahead of the *joneses* as new
    > things are always creeping out.
    >
    > If the O.P has a 9600XT, XP2400+, 1GB of PC2100 and an ASUS A7A266 I think
    > that's a nice enough system, I built something similar for my Brother in
    > February except I used an ABIT AN7 + 512MB of Crucial PC3200, so instead
    of
    > running the XP2400+ @15x133 it is running at 11x200, of course the extra
    > FSB/Mem MHz really helps.
    >
    > From my limited recent game playing experience I find it hard to tell the
    > difference between the 9800 and the 9600XT while running say 1024x768 res.
    > Of course I can see the difference in benchmarks!
    > --
    > Wayne ][
    >
    >
  6.

    I've got an A7V266-E, so I'm limited to an XP 2600 266MHz CPU, but I can't
    get one in Australia, so I settled for the XP 2400 :(

    So you think the 9800 would be, let's say, 50% better than a Radeon 9600
    Pro?

    Worth the upgrade if I keep everything I have?

    And I could upgrade tomorrow, NEW everything, but I really want to cut
    that down to maybe a new PC every four years, FOR THE LOVE OF GOD :)

    PCs are SUCH a WASTE of money; as soon as you buy one it's worth 50%
    less :)

    Thanks.


  7.

    I am just indicating that rating performance by frames per second may allow
    comparisons, but is it a USEFUL comparison, especially since we all seem
    perfectly satisfied by movies at 24 frames per second and television
    displays at 30, 50, or 60 frames per second? (I'm sorry, but PAL and SECAM
    at 25 frames per second give me a headache.)

    As for my personal use, I like the price on display adapters two
    generations behind the bleeding edge. Paying $400 US or $500 US for
    performance that only helps in a handful of 3-D games is a very expensive
    performance boost for a very limited use. The money is better spent
    boosting the performance of your entire system across a wide range of uses.

    --
    Phil Weldon


    "We Live for the One we Die for the One" <Mr fred@yahoo.com.au> wrote in
    message news:jrv3b09bg8dqt63o110k3dfaqckuoaarnm@4ax.com...
    >
    >
    > Ive got an A7v266-e so iam limited to a Xp 2600 266mhz CPU but can't
    > get one in Australia so i settled for XP 2400 :(
    >
    > So you think 9800 would be lets say 50% bettter than a Readeon 9600
    > pro ?
    >
    > Worth the upgrade if i keep all i have ?
    >
    > And upgrading i can do tommorow NEW everthing, but i realy want to cut
    > that down to maybe a nre PC every four years FOR THE LOVE OF GOD :)
    >
    > Pcs are SUCH a WASTE of money, as soon as you buy one its worth 50%
    > less :)
    >
    > Thanks.
    >
    >
    >
    >
    > On Mon, 24 May 2004 12:21:21 GMT, "Phil Weldon"
    > <notdisclosed@example.com> wrote:
    >
    > >Yeah, and who can actually SEE 500 frames per second, or even 200 frames
    per
    > >second for that matter, and as if any monitor could DISPLAY 200 frames
    per
    > >second!
    >
  8.

    Phil Weldon wrote:

    > I am just indicating that rating performance by frames per second may allow
    > comparisons, but is it a USEFUL comparison, especially since we all seem
    > perfectly satisfied by movies at 24 frames per second and television
    > displays at 30, 50, or 60 frames per second? (I'm sorry, but PAL and SECAM
    > at 25 frames per second give me a headache.)

    Yeah. PAL and SECAM at 25 FPS (50Hz refresh) DOES flicker, their claims to
    the contrary notwithstanding. I notice it too.

    While I'm not sure I buy the whole theory, partly because it's not
    something I've spent a lot of time on, there IS research which shows a
    perceptible difference with frame rates 'too high to see', i.e. faster
    than the monitor refresh rate.

    The postulated reason is that, with movies and TV, you're taking a
    'snapshot' of real-life movement, not a 'frozen in time' stagnant image
    (things don't stop moving for your snapshot to take place), so there is
    'smearing' of it over the frame interval and, they think, this provides
    additional cues to the eye.

    With computer-generated frames, however, they ARE simply one stagnant image
    after another, each shifted the 'right amount', a full frame at a time, to
    simulate movement. The idea is that generating frames faster than the
    refresh rate creates a more lifelike 'moving picture' that the refresh is
    then taking its 'snapshot' of, rather like how 'real life' is moving all
    the time as the frame is taken.

    It seems to me that, if it were 'perceptible', it would appear more like
    tearing, since it isn't as if the entire image were moving; only 'part' of
    the frame would be in the 'new position'. But then I haven't run actual
    human tests, so I would be speculating, whereas others claim to have
    observed it. Also, by perceptible they don't mean consciously observable,
    just that the observers seem to feel that the 'too fast' frame rates are
    'more realistic'. Maybe the eye compensates for the 'partial' smear just
    as it does for flicker and in recreating full color from 3 primaries. That
    would make me think there is some minimum multiple (maybe an odd multiple,
    so it cycles through the image) before the effect would be effective,
    again, like a minimum rate needed to remove flicker.
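
    (A minimal Python sketch of that 'partial frame' idea, purely
    illustrative; the 200 fps render rate, 60 Hz refresh and 480 scanlines
    are assumed numbers, not anything from the thread:)

    # With vsync off, each scanline shows whatever frame the renderer finished
    # most recently, so one refresh gets stitched from several rendered frames.
    RENDER_FPS = 200.0   # assumed frame-generation rate
    REFRESH_HZ = 60.0    # assumed monitor refresh rate
    LINES = 480          # scanlines per refresh

    for refresh in range(3):
        start = refresh / REFRESH_HZ
        frames = set()
        for line in range(LINES):
            t = start + line / (LINES * REFRESH_HZ)  # time this line is scanned
            frames.add(int(t * RENDER_FPS))          # newest completed frame
        print(f"refresh {refresh} stitched from rendered frames {sorted(frames)}")
    # With these numbers, every 60 Hz refresh is stitched from 4 different
    # 200 fps frames, with the tear lines drifting from refresh to refresh.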


  9.

    When images on a strip of film are projected onto a screen, the images are
    discrete, perhaps separated by a small blank interval, depending on the
    projector (one type has continuous film advance rather than intermittent,
    and uses a rotating prism synchronized with the film movement to project a
    single frame until the next moves into place.)

    As for frame rate on a computer monitor, there is absolutely no way for
    information to reach the screen faster than the frame rate of the monitor.
    If frame sync is turned off and the frame generation rate is allowed to
    exceed the monitor frame rate, then GPU and CPU power is just being wasted,
    because the extra will never reach the screen (to display at twice the
    monitor frame rate would mean that half the information is never displayed,
    and the GPU and CPU processing power would be better spent on increased
    image quality.) And then there would be the displacement of moving objects,
    or of a panning motion, at some point in the displayed composite frame.

    The "frozen in time" effect is just as present in film as in CGI. After
    all, the exposure time can be varied in cinemaphotography to freeze any
    motion (the downside is that more sensitive film, a faster lens, increased
    scene illumination, and or "pushed" development must be used.) And what
    about CGI use in filmed movies ("Hell Boy", "Von Helsing", "Shreck II"?)

    Those who report seeing a difference with computer-displayed images when
    the computer frame rate is higher than the monitor display rate are either
    perceiving the "image tearing" you mention as meaningful screen action
    (indicating really short attention spans, really short) or reacting to
    frame rate hype for 3-D accelerated adapter cards and games. Or maybe it
    is the aura of an impending seizure.

    --
    Phil Weldon

    "David Maynard" <dNOTmayn@ev1.net> wrote in message
    news:10b54vdk7psai09@corp.supernews.com...
    > Phil Weldon wrote:
    >
    > > I am just indicating that rating performance by frames per second may
    allow
    > > comparisons, but is it a USEFUL comparison, especially since we all seem
    > > perfectly satisfied by movies at 24 frames per second and television
    > > displays at 30, 50, or 60 frames per second (I'm sorry, but PAL and
    SECAM at
    > > 25 frames per second gives me a headache.)
    >
    > Yeah. PAL and SECAM 25 FPS (50Hz refresh) DOES flicker, their claims to
    the
    > contrary notwithstanding. I notice it too.
    >
    > While I'm not sure I buy the whole theory, partly because it's not
    > something I've spent a lot of time on, there IS research which shows a
    > perceptible difference with frames rates 'too high to see'. I.E. faster
    > than the monitor refresh interval.
    >
    > The postulated reason is that, with movie and TV, you're taking a
    > 'snapshot' of real life movement, not a 'frozen in time' stagnant image
    > (things don't stop moving for your snapshot to take place), so there is
    > 'smearing' of it over the frame interval and, they think, this provides
    > additional cues to the eye.
    >
    > With computer generated frames, however, they ARE simply one stagnant
    image
    > after another, computer shifted the 'right amount', 'full frame' at a
    time,
    > to the next image to simulate movement. The idea is that faster than the
    > refresh rate frame generation creates a more lifelike 'moving picture'
    that
    > the refresh rate is then taking the 'snapshot' of, kind of like how 'real
    > life' is moving all the time as the frame is taken.
    >
    > Seems to me that, if it were 'perceptible', it would appear more like
    > tearing, since it isn't as if the entire image were moving, only 'part' of
    > the frame would be in the 'new position', but then I haven't run actual
    > human tests so I would be speculating whereas others claim to have
    observed
    > it. Also, by perceptible they don't mean consciously observable, just that
    > the observers seem to feel that the 'too fast' frame rates are 'more
    > realistic'. Maybe the eye compensates for the 'partial' smear just as it
    > does for flicker and in recreating full color from 3 primaries. That would
    > make me think there is some minimum multiple (maybe an odd multiple so it
    > cycles through the image) before the effect would be effective, again,
    like
    > a minimum rate to remove flicker.
    >
    >
    > > As for my personal use, I like the price on display adapters two
    generations
    > > behind the bleeding edge. Paying $400 US or $500 US for performance
    that is
    > > only helpful for a handful of 3-D game programs is a very expensive
    > > performance boost for a very limited use. Money spent on boosting the
    > > performance of your entire system for a wide range of uses is better
    spent.
    > >
    >
  10.

    Phil Weldon wrote:

    > When images on a strip of film are projected onto a screen, the images are
    > discrete, perhaps separated by a small blank interval, depending on the
    > projector (one type has continuous film advance rather than intermittent,
    > and uses a rotating prism synchronized with the film movement to project a
    > single frame until the next moves into place.)

    I am aware of how a movie projector works. You missed the point.

    > As for frame rate on a computer monitor, there is absolutely no way for
    > information to reach the screen faster than the frame rate of the monitor.

    No one said it could.

    > If frame sync is turned off and the frame generation rate is allowed to
    > exceed the monitor frame rate, then GPU and CPU power is just being wasted,
    > because the extra will never reach the screen (to display at twice the
    > monitor frame rate would mean that half the information is never displayed,
    > and the GPU and CPU processing power would be better spent on increased
    > image quality.) And then there would be the displacement of moving objects,
    > or of a panning motion, at some point in the displayed composite frame.
    >
    > The "frozen in time" effect is just as present in film as in CGI. After
    > all, the exposure time can be varied in cinemaphotography to freeze any
    > motion (the downside is that more sensitive film, a faster lens, increased
    > scene illumination, and or "pushed" development must be used.) And what
    > about CGI use in filmed movies ("Hell Boy", "Von Helsing", "Shreck II"?)

    You mean they weren't real?

    On the other hand, I doubt they were generated in real time on a PC.

    > Those who report seeing a difference with computer-displayed images when
    > the computer frame rate is higher than the monitor display rate are either
    > perceiving the "image tearing" you mention as meaningful screen action
    > (indicating really short attention spans, really short)

    It indicates no such thing. Just as perceiving 'purple' from three
    phosphors, none of which are 'purple', doesn't 'indicate' that you're damn
    fast at Fourier calculations.

    > or reacting to frame
    > rate hype for 3-D accelerated adapter cards and games. Or maybe it is the
    > aura of an impending seizure.

    I told you I had doubts about it but for you to just whimsically dismiss
    it, unless you have done the appropriate experiments, is a bit cavalier.

    You are looking solely at the 'mechanics' of the 'device' and, using that
    kind of analysis, it's also obvious that color television can't work
    because there isn't enough bandwidth for the color information, by an order
    of magnitude, and you simply can't reproduce the visible spectrum with 3
    fixed wavelength phosphors. But it does work due to the peculiarities of
    the human eye and human perception.

    But I'm not going to 'argue' it with you because it isn't my theory and I'm
    not an expert on it. I simply note that there ARE people who say it makes a
    difference, based on experiments they've done, and they have a theory as to
    why.
  11.

    I didn't explain to you how movie projectors work; I just responded to your
    description of the 'theory' and the errors in that 'theory' as you describe
    it. Evidently the expounders of that theory don't understand how movie
    projectors work, nor how the cameras work either.

    How the CGI composite images were generated has nothing to do with how they
    are currently displayed in cinemas. The point is that they are quite
    satisfying at 24 frames per second.

    And of course color television isn't impossible, and WHAT bandwidth? USA
    broadcast channels? Video amplifier bandwidth in television receivers?
    Red bandwidth? Green bandwidth? Blue bandwidth? Video bandwidth? In NTSC
    encoding, GREEN bandwidth is more than three times that of BLUE, something
    like 1.7 MHz to 0.5 MHz, with RED bandwidth falling somewhere in between.

    I don't whimsically dismiss the theory; it is bogus and I make fun of it,
    as it deserves. NTSC television, on the other hand, depends on valid
    theories that are confirmed, and depends on information that reaches the
    screen. The video game frame rate "theory" evidently depends on
    information that does not reach the screen. THAT is why I make fun of it.
    --
    Phil Weldon

    "David Maynard" <dNOTmayn@ev1.net> wrote in message
    news:10b5gq2oeirgb05@corp.supernews.com...
    > Phil Weldon wrote:
    >
    > > When images on a strip of film are projected onto a screen, the images
    are
    > > discrete, perhaps separated by a small blank interval, depending on the
    > > projector (one type has continuous film advance rather than
    intermittent,
    > > and uses a rotating prism syncronized with the film movement to project
    a
    > > single frame until the next moves into place.)
    >
    > I am aware of how a movie projector works. You missed the point.
    >
    > > As for frame rate on a comuter monitor, there is absolutely no way for
    > > information to reach the screen faster than the frame rate of the
    monitor.
    >
    > No one said it could.
    >
    > > If frame synch is turned off, and the frame generation rate allowed to
    > > exceed the monitor frame rate then GPU and CPU power is just being
    wasted
    > > because the extra will never reach the screen (to display at twice the
    > > monitor frame rate would mean that half the information is never
    displayed,
    > > and the GPU and CPU processing power would be better spent on increased
    > > image quality.) And then there would be the displacement with moving
    objects
    > > or a panning motion at some point in the displayed composite frame.
    > >
    > > The "frozen in time" effect is just as present in film as in CGI.
    After
    > > all, the exposure time can be varied in cinemaphotography to freeze any
    > > motion (the downside is that more sensitive film, a faster lens,
    increased
    > > scene illumination, and or "pushed" development must be used.) And what
    > > about CGI use in filmed movies ("Hell Boy", "Von Helsing", "Shreck II"?)
    >
    > You mean they weren't real?
    >
    > On the other hand I doubt they were generated real time on a PC.
    >
    > > Those who report seeing a difference with computer display images when
    the
    > > computer frame rate is higher than the monitor display rate are either
    > > perceiving the "image tearing" you mention as meaningful screen action
    > > (indicating really short attention spans, really short)
    >
    > It indicates no such thing. Just as perceiving 'purple' from three
    > phosphors, none of which are 'purple', doesn't 'indicate' you're damn fast
    > at fourier calculations.
    >
    > > or reacting to frame
    > > rate hype for 3-D accelerated adapter cards and games. Or maybe it is
    the
    > > aura of an impending seizure.
    >
    > I told you I had doubts about it but for you to just whimsically dismiss
    > it, unless you have done the appropriate experiments, is a bit cavalier.
    >
    > You are looking solely at the 'mechanics' of the 'device' and, using that
    > kind of analysis, it's also obvious that color television can't work
    > because there isn't enough bandwidth for the color information, by an
    order
    > of magnitude, and you simply can't reproduce the visible spectrum with 3
    > fixed wavelength phosphors. But it does work due to the peculiarities of
    > the human eye and human perception.
    >
    > But I'm not going to 'argue' it with you because it isn't my theory and
    I'm
    > not an expert on it. I simply note that there ARE people who say it makes
    a
    > difference, based on experiments they've done, and they have a theory as
    to
    > why.
    >
  12.

    Phil Weldon wrote:

    > I didn't explain to you how movie projectors work,


    I didn't say you did. I said I know how they work, which includes your
    "rotating prism" explanation and the rest.

    > I just responded to your
    > description of the 'theory' and the errors in that 'theory' as you describe
    > it. Evidently the expounders of that theory don't understand how movie
    > projectors work, nor how the cameras work either.

    They understand it just fine.

    > How the CGI composite images were generated has nothing to do with how
    > they are currently displayed in cinemas.

    More appropriately, "how they are currently displayed" has nothing to do
    with how they are generated, which is the POINT of their theory: what
    happens BEFORE it's sent to display.

    > The point is that they are quite satisfying
    > at 24 frames per second.

    Which is irrelevant to the idea they proposed.

    > And of course color television isn't impossible,

    Yes, of course. And I gave the reason why.

    > and WHAT bandwidth?

    The color information, as I said. Makes no difference 'where' in the whole
    schlemiel we look, there is a LIMIT to how much color information can
    POSSIBLY be there because of how it's encoded.

    > USA
    > broadcast channels? Video amplifier bandwidth in television receivers?
    > Red bandwidth? Green bandwidth? Blue bandwidth? Video bandwidth? In NTSC
    > encoding, GREEN bandwidth is more than three times that of BLUE, something
    > like 1.7 MHz to 0.5 MHz, with RED bandwidth falling somewhere in between.

    NTSC. The entire video bandwidth for luminance is 4.2 MHz. All of that is
    available for 'B&W'. For color, the chroma subcarrier is modulated on top
    of it at about 3.58 MHz and is comprised of two color information signals,
    I and Q (since we can use those with the luminance to recreate 3 primary
    color signals). The I signal is bandwidth-limited to about 1.5 MHz, with
    the Q limited to about 0.6 MHz. That's the 'best' you could get without
    the attendant phase and amplitude distortions resulting from broadcast
    transmission.

    (If you care, the I and Q are derived as follows:

    I = 0.74 (R'-Y) - 0.27 (B'-Y) = 0.60 R' - 0.28 G' - 0.32 B'
    Q = 0.48 (R'-Y) + 0.41 (B'-Y) = 0.21 R' - 0.52 G' + 0.31 B' )
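
    (Those coefficient identities are easy to check numerically, assuming the
    standard Y = 0.299 R' + 0.587 G' + 0.114 B' luminance weights; a throwaway
    Python verification:)

    # Expand I and Q from the (R'-Y)/(B'-Y) color-difference forms and compare
    # with the R'G'B' coefficient forms quoted above.
    def iq(r, g, b):
        y = 0.299 * r + 0.587 * g + 0.114 * b
        i = 0.74 * (r - y) - 0.27 * (b - y)
        q = 0.48 * (r - y) + 0.41 * (b - y)
        return round(i, 2), round(q, 2)

    print(iq(1, 0, 0))  # (0.6, 0.21)    -> the R' coefficients
    print(iq(0, 1, 0))  # (-0.28, -0.52) -> the G' coefficients
    print(iq(0, 0, 1))  # (-0.32, 0.31)  -> the B' coefficients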

    Now, subtract a 0.6 MHz bandwidth signal from a 4.2 MHz bandwidth signal
    and the useful resulting signal is not going to contain any more resolution
    than the lower of the two bandwidths: 0.6 MHz (plus uncorrected high
    frequency luminance components, unless they're filtered out.)

    The result is that NTSC color resolution STINKS. Which is one reason why
    they make incredibly lousy PC monitors.

    But, as it turns out, the human eye is more sensitive to luminance
    information than it is to color so your mind's eye just doesn't give much
    of a tinker's dam about how positively dismal the color resolution is when
    viewing 'natural scenes' (as opposed to graphics/text) on a TV.
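
    (To put rough numbers on "stinks": horizontal detail is bounded by
    bandwidth times active line time. A back-of-envelope Python sketch,
    assuming the usual ~52.6 microsecond NTSC active line:)

    # Nyquist: a channel of bandwidth B carries at most ~2*B samples/second,
    # so picture elements per line ~ 2 * bandwidth * active line time.
    ACTIVE_LINE_US = 52.6  # approx. NTSC active picture time per scanline
    for name, bw_mhz in (("luma Y", 4.2), ("chroma I", 1.5), ("chroma Q", 0.6)):
        pels = 2 * bw_mhz * ACTIVE_LINE_US  # MHz * microseconds cancel out
        print(f"{name}: ~{pels:.0f} picture elements per line")
    # luma ~442 per line vs. chroma Q ~63: the color detail really is dismal.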


    > I don't whimsically dismiss the theory; it is bogus and I make fun of it,
    > as it deserves.

    That's exactly what they said about Goddard's stupid notion that rockets
    would work in the vacuum of space.

    > NTSC television, on the other hand, depends on valid theories
    > that are confirmed, and depends on information that reaches the screen.
    > The video game frame rate "theory" evidently depends on information that
    > does not reach the screen. THAT is why I make fun of it.

    Your 'humor' of it is based on a false premise then.
  13.

    Working from your description of this "theory" of the effect of frame rates
    higher than the monitor refresh rate, I repeat my criticism. It depends on
    information that does not reach the display. Color television is a
    completely different matter. It depends on our perception of what DOES
    reach the display, rather than what DOES NOT reach the display, an
    important difference, and an example of what separates science from
    mysticism. I will look through my back issues of the SMPTE journal,
    however, for mention of something that bears on this "theory."

    Anyway, the limitation of NTSC compared to PAL/SECAM is not so much
    resolution, but control of color distortion in the broadcast path. The
    vertical resolution increase that PAL/SECAM gains over NTSC is at the
    expense of lower temporal resolution. The horizontal resolution increase
    is at the expense (for broadcast) of increased spectrum cost.
    Compare the number of television broadcast stations in the PAL/SECAM world
    with the number of television broadcast stations in the NTSC world. Now
    that decoding of HDTV-type broadcasts is possible in television receivers,
    everything changes, and, like Richard Nixon, we won't have Never The Same
    Color to kick around any more (well, he did make a comeback... hopefully
    we'll be luckier with NTSC.)

    --
    Phil Weldon

    "David Maynard" <dNOTmayn@ev1.net> wrote in message
    news:10b65tl1k1k3c5d@corp.supernews.com...
    > Phil Weldon wrote:
    >
    > > I didn't explain to you how movie projectors work,
    >
    >
    > I didn't say you did. I said I know how they work, which includes your
    > "rotating prism" explanation and the rest.
    >
    > > I just responded to your
    > > description of the 'theory' and the errors in that 'theory' as you
    describe
    > > it. Evidently the expounders of that theory don't understand how movie
    > > projectors work, nor how the cameras work either.
    >
    > They understand it just fine.
    >
    > > How the CGI composite images were generated has nothing with how they
    are
    > > currently displayed in cinemas.
    >
    > More appropriately, "how they are currently displayed" has nothing to do
    > with how they are generated, which is the POINT of their theory: what
    > happens BEFORE it's sent to display.
    >
    > > The point is that they are quite satisfying
    > > at 24 frames per seconds.
    >
    > Which is irrelevant to the idea they proposed.
    >
    > > And of course color television isn't impossible,
    >
    > Yes, of course. And I gave the reason why.
    >
    > > and WHAT bandwidth?
    >
    > The color information, as I said. Makes no difference 'where' in the whole
    > schlemiel we look, there is a LIMIT to how much color information can
    > POSSIBLY be there because of how it's encoded.
    >
    > > USA
    > > broadcast channels? Video amplifier bandwidth in television receivers?
    > > Red bandwidth? Green bandwidth? Blue bandwidth? Video bandwidth? In
    NTSC
    > > encoding, GREEN bandwidth is more than three times that of BLUE,
    something
    > > like 1.7 MHz to .5 MHz, with RED bandwidth falling somewhere in
    between.
    >
    > NTSC. The entire video bandwidth for luminance is 4.2 MHz. All of that is
    > available for 'B&W'. For color, the chroma subcarrier is modulated on top
    > of it at about 3.58Mhz and is comprised of two color information signals,
    I
    > and Q, (since we can use those with the luminance to recreate 3 primary
    > color signals). The I signal is bandwidth limited to about 1.5 MHz with
    the
    > Q limited to about .6 Mhz. That's the 'best' you could get without the
    > attendant phase and amplitude distortions resulting from broadcast
    > transmission.
    >
    > (If you care, the I and Q are derived as follows"
    >
    > I = 0.74 (R'-Y) - 0.27 (B'-Y) = 0.60 R' - 0.28 G' - 0.32 B'
    > Q = 0.48 (R'-Y) + 0.41 (B'-Y) = 0.21 R' - 0.52 G' + 0.31 B'
    >
    > )
    >
    > Now, subtract a .6Mhz bandwidth signal from a 4.2 MHz bandwidth signal and
    > the useful resulting signal is not going to contain any more resolution
    > than the lower of the two bandwidths: .6 MHz (plus uncorrected high
    > frequency luminance components, unless they're filtered out.)
    >
    > The result is that NTSC color resolution STINKS. Which is one reason why
    > they make incredibly lousy PC monitors.
    >
    > But, as it turns out, the human eye is more sensitive to luminance
    > information than it is to color so your mind's eye just doesn't give much
    > of a tinker's dam about how positively dismal the color resolution is when
    > viewing 'natural scenes' (as opposed to graphics/text) on a TV.
    >
    >
    > > I don't whimiscally dismiss theory, it is bogus and I make fun of it, as
    it
    > > deserves.
    >
    > That's exactly what they said about Goddard's stupid notion that rockets
    > would work in the vacuum of space.
    >
    > > NTSC television, on the other hand, depends on valid theories
    > > that are confirmed, and depends on information that reaches the screen.
    The
    > > video game rate "theory" evidently depends on information that does not
    > > reach the screen. THAT is why I make fun of it.
    >
    > Your 'humor' of it is based on a false premise then.
    >
  14.

    Phil Weldon wrote:

    > Working from your description of this "theory" of the effect of frame rates
    > higher than the monitor refresh rate, I repeat my criticism. It depends on
    > information that does not reach the display.

    No, it doesn't.

    > Color television is a completely
    > different matter. It depends on our perception of what DOES reach the
    > display, rather than what DOES NOT reach the display, an important
    > difference, and an example of what separates science from mysticism. I
    > will look through my back issues of the SMPTE journal, however, for
    > mention of something that bears on this "theory."

    I would agree if your assumption were correct, but it's not.

    > Anyway, the limitation of NTSC compared to PAL/SECAM is not so much
    > resolution, but control of color distortion in the broadcast path.

    If the point had been a comparison of NTSC vs PAL/SECAM you'd be correct
    but the point wasn't a comparison. The point was the inherent poor
    resolution of the color information.

    PAL/SECAM also transmit low bandwidth color information but the modulation
    techniques compensate for broadcast anomalies better, at the expense of
    more complicated circuitry and more expensive TV sets. However, they depend
    on the same 'tricks of the eye' to work.

    > The
    > vertical resolution increase that PAL/SECAM gains over NTSC is at the
    > expense of lower temporal resolution. The horizontal resolution increase
    > is at the expense (for broadcast) of increased spectrum cost.
    > Compare the number of television broadcast stations in the PAL/SECAM world
    > with the number of television broadcast stations in the NTSC world. Now
    > that decoding of HDTV-type broadcasts is possible in television receivers,
    > everything changes, and, like Richard Nixon, we won't have Never The Same
    > Color to kick around any more (well, he did make a comeback... hopefully
    > we'll be luckier with NTSC.)

    The point had nothing to do with comparing various TV formats. The point
    was that the human eye 'compensates' for the sketchy pictorial information
    and 'interprets' it into a 'reasonable' representation. That pictorial
    information does not have to be 'technically correct', or even 'good',
    because the human eye does peculiar things of its own in turning it into
    "what you (think you) see."
  15.

    What, then, do any of your points have to do with sending more frames to
    the display device than can be displayed, which, I thought, was the
    contention of this "theory"? I agree, NTSC, PAL, and SECAM have nothing to
    do with the contention EXCEPT for the fact that broadcast television depends
    on perception of information that DOES reach the screen, NOT on information
    that DOES NOT reach the screen, as your explanation of the "theory" of
    excess frame rate indicates. Or maybe I have misinterpreted your
    explanation, in which case more discussion is fruitless unless you can
    clarify.

    I hope the "theory" is not just that the CAPABILITY of frame rates far
    beyond the display presentation frame rate indicate excess capacity to
    handle peak graphics processing requirements well above the average graphics
    processing requirements. If that is the contention, then I don't think
    ANYONE would differ.

    --
    Phil Weldon


    "David Maynard" <dNOTmayn@ev1.net> wrote in message
    news:10b7id0ruee6p54@corp.supernews.com...
    > Phil Weldon wrote:
    >
    > > Working from your description of this "theory" of the effect of frame
    rates
    > > higher than the monitor refresh rate, I repeat my criticism. It depends
    on
    > > information that does not reach the display.
    >
    > No, it doesn't.
    >
    > > Color television is completely
    > > different matter. It depends on our perception of what DOES reach the
    > > display, rather than what DOES NOT reach the display, an important
    > > difference, and an example of what separates science from mysticism. I
    > > will though my back issues of the SMPT magazine, however, for mention of
    > > something that bears on this "theory."
    >
    > I would agree if your assumption were correct, but it's not.
    >
    > > Anyway, the limitation of NTSC compared to PAL/SCAM is not so much
    > > resolution, but control of color distortion in the broadcast path.
    >
    > If the point had been a comparison of NTSC vs PAL/SECAM you'd be correct
    > but the point wasn't a comparison. The point was the inherent poor
    > resolution of the color information.
    >
    > PAL/SECAM also transmit low bandwidth color information but the modulation
    > techniques compensate for broadcast anomalies better, at the expense of
    > more complicated circuitry and more expensive TV sets. However, they
    depend
    > on the same 'tricks of the eye' to work.
    >
    > > The
    > > vertical resolution increase that PAL/SECAM gains over NTSC is at the
    > > expense of lower temporal resolution. The horizontal resolution
    increase
    > > is at the is at the expense (for broadcast) of increased spectrum
    cost.
    > > Compare the number of television broadcast stations in the PAL/SECAM
    world
    > > with the number of television broadcast stations in the NTSC world. Now
    > > that decoding of HDTV type broadcasts is possible in television
    receivers,
    > > everything changes, and, like Richard Nixon, we won't have Never The
    Same
    > > Color to kick around any more (well, he did make a comeback... hopefully
    > > we'll be luckier with NTSC.)
    >
    > The point had nothing to do with comparing various TV formats. The point
    > was that the human eye 'compensates' for the sketchy pictorial information
    > and 'interprets' it into a 'reasonable' representation. That pictorial
    > information does not have to be 'technically correct', or even 'good',
    > because the human eye does peculiar things of it's own in turning it into
    > "what you (think you) see."
    >
    >
  16.

    Phil Weldon wrote:

    > What, then, do any of your points have to do with sending more frames to
    > the display device than can be displayed, which, I thought, was the
    > contention of this "theory"?

    No, and it's obvious to even the most casual observer that you can't 'send
    more frames than can be displayed', but you're so obsessed with insisting
    that is 'the theory' that you won't give it 2 seconds of thought.

    One idea we had talked about, before you jumped on this 'things that are
    never seen' bandwagon, was a frame consisting partly of one image and
    partly of the next, caused by the generated frame rate being faster than
    the display frame rate. And while you seem to be absolutely convinced the
    observer is demented, I can imagine the eye 'integrating' the effect just
    as it does other 'fragmented' information in conventional TV images.

    Whether it does or not, I don't know, as I've never done any experiments
    with it.


    > I agree, NTSC, PAL, and SECAM have nothing to
    > do with the contention EXCEPT for the fact that broadcast television depends
    > on perception of information that DOES reach the screen, NOT on information
    > that DOES NOT reach the screen, as your explanation of the "theory" of
    > excess frame rate indicates.

    No, it doesn't, regardless of how many times you repeat it and I repeat
    that it doesn't.

  17.

    Well, I am neither obsessed nor on a bandwagon. Why don't you restate the
    "theory" (which you made clear is not yours) to clarify the discussion?
    We've been talking past each other. Perception of video images, both
    analog and digital, has been exhaustively studied; there really isn't
    anything "new" about video games that hasn't been studied in developing
    digital compression and display of moving images.

    --
    Phil Weldon


    "David Maynard" <dNOTmayn@ev1.net> wrote in message
    news:10b7qni2lns599f@corp.supernews.com...
    > Phil Weldon wrote:
    >
    > > What, then, do any of your points have to do with sending more frames
    to
    > > the display device than can be displayed, which, I thought, was the
    > > contention of this "theory".
    >
    > No, and it's obvious to even the most casual observer that you can't 'send
    > more frames than can be displayed' but you're so obsessed with insisting
    > that is 'the theory' that you won't give it 2 seconds of thought.
    >
    > One idea we had talked about before you jumped on this 'things that are
    > never seen' bandwagon was a frame consisting partly of one and partly of
    > the next, caused by the generated frame rate being faster than the display
    > frame rate. And while you seem to be absolutely convinced the observer is
    > demented I can imagine the eye 'integrating' the effect just as it does
    > other 'fragmented' information in conventional TV images.
    >
    > Whether it does, or not, I don't know as I've never done any experiments
    > with it.
    >
    >
    > > I agree, NTSC, PAL, and SECAM have nothing to
    > > do with the contention EXCEPT for the fact that broadcast television
    depends
    > > on perception of information that DOES reach the screen, NOT on
    information
    > > that DOES NOT reach the screen, as your explanation of the "theory" of
    > > excess frame rate indicates.
    >
    > No, it doesn't, regardless of how many times you repeat it and I repeat
    > that it doesn't.
    >
    > > Or maybe I have misinterpreted your
    > > explantation, in which case more discussion is fruitless unless you can
    > > clarify.
    > >
    > > I hope the "theory" is not just that the CAPABILITY of frame rates far
    > > beyond the display presentation frame rate indicate excess capacity to
    > > handle peak graphics processing requirements well above the average
    graphics
    > > processing requirements. If that is the contention, then I don't think
    > > ANYONE would differ.
    > >
    >