TFT monitor - which one should I choose?

Archived from groups: comp.sys.ibm.pc.hardware.video

Hello,

I am currently thinking of buying a TFT monitor. What properties should
I take into account when deciding which one to buy? Obviously, the
physical size. I am torn between 17" and 19". But I've heard a 17"
TFT monitor displays more than a 17" CRT monitor. Is this correct?

What is the difference between TFT and LCD?

At [0], different monitors are offered. Prices for 17" 1280x1024 25ms range
from €429 to €650. The difference I see is that the latter has "700:1,
270cd dvi, zilver". When do I notice this?

I use my computer to watch television, and read and write texts. What
should I choose?

yours,
Gerrit.

[0] http://www.utwente.nl/itshop/prijslijsten/Prijslijst%20accessoires/index.html#monitoren_flatpanel

--
Experiences with Asperger's Syndrome:
http://topjaklont.student.utwente.nl
Socialistische Partij (Socialist Party):
http://www.sp.nl/
  1. Archived from groups: comp.sys.ibm.pc.hardware.video

    "Gerrit Holl" <Gerrit@nl.linux.org> wrote in message
    news:slrnc7slnk.sr7.Gerrit@topjaklont.student.utwente.nl...
    > Hello,
    >
    > I am currently thinking of buying a TFT monitor. What properties should
    > I take into account when deciding which one to buy? Obviously, the
    > physical size. I am torn between 17" and 19". But I've heard a 17"
    > TFT monitor displays more than a 17" CRT monitor. Is this correct?
    >
    > What is the difference between TFT and LCD?

    "TFT" means "thin-film transistor," which is the technology used
    to create the active-matrix array on modern monitor LCDs. Virtually
    all current monitors are of the active-matrix TFT type, so within this
    market there is no difference.

    A 17" LCD monitor DOES provide a larger active area than a 17"
    CRT, as it has been traditional in the LCD industry to base the diagonal
    size measurement on the active area, while CRT diagonals give the
    overall CRT size. A "17 inch CRT" will typically have an active area
    that's about 15.5" or so in diagonal size.

    >
    > At [0], different monitors are offered. Prices for 17" 1280x1024 25ms range
    > from €429 to €650. The difference I see is that the latter has "700:1,
    > 270cd dvi, zilver". When do I notice this?

    700:1 is the contrast ratio; within reason, higher is better, although
    in recent years these specs have become almost meaningless due to
    the tendency to quote only "dark ambient" numbers. Anything over
    300:1 in actual delivered contrast would be outstanding, but very,
    very few products actually provide this level of performance in normal
    office or home lighting conditions (contrast in these situations is
    dominated by the reflection of ambient light from the screen, not
    from the inherent white/black contrast of the display device itself).
    270 cd/m^2 (read "candelas per square meter", also sometimes referred
    to by the older term, "nits") is the measure of the luminance or
    "brightness" of the display (i.e., how bright white areas are at
    maximum). This is actually a fairly average level; higher specs may
    be found, but again take any published specs with a grain of salt - it's
    far better to actually SEE the display, and go with what YOU like in terms
    of the overall appearance. "DVI" is the current digital interface standard;
    it may be useful for you, but only if you have a video source providing a
    DVI output. I don't know what "zilver" means, unless this is a typo and
    it meant that the case color was "silver."
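
    [A back-of-the-envelope way to put numbers on that ambient-light
    point - a minimal sketch in Python, where the 300 lux office
    lighting and the 5% screen reflectance are illustrative assumptions,
    not figures from any spec sheet:

        import math

        def effective_contrast(white_cdm2, contrast_ratio,
                               ambient_lux, reflectance):
            # Reflected ambient light adds to both the white and the
            # black levels; treat the screen as a diffuse reflector.
            black = white_cdm2 / contrast_ratio          # dark-room black
            glare = ambient_lux * reflectance / math.pi  # reflected cd/m^2
            return (white_cdm2 + glare) / (black + glare)

        # The 700:1, 270 cd/m^2 panel from the price list, viewed in a
        # 300 lux office:
        print(round(effective_contrast(270, 700, 300, 0.05)))  # -> 53

    Even a generous dark-room spec collapses to double digits in normal
    room light, which is exactly the point made above.]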

    >
    > I use my computer to watch television, and read and write texts. What
    > should I choose?

    The best advice is to try out the monitor with the sort of images you
    commonly use, and choose the one that YOU think looks the best.
    For TV use, you'll want a fairly fast response time - 25 ms or better should
    be the absolute minimum, and you'll see a distinct improvement if you can
    get to 16 ms or less. TV viewing also generally calls for high brightness
    (over 300 cd/m^2 would be good), and good color performance - if you
    want accurate color in your TV images, the monitor should be able to
    be set to a white point of 6500K (it will be listed in this form as "color
    temperature"), and the larger the color gamut (expressed as a percentage
    of the standard NTSC or EBU gamuts) the better.
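
    [As a quick check on those numbers: at the 60 Hz refresh rate
    typical of LCD monitors, the frame period is

        t_{\text{frame}} = \frac{1}{f_{\text{refresh}}}
                         = \frac{1}{60\,\text{Hz}} \approx 16.7\,\text{ms},

    so a 16 ms panel settles within roughly one frame, while a 25 ms
    panel is still mid-transition when the next frame arrives.]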

    Bob M.
  2. Archived from groups: comp.sys.ibm.pc.hardware.video

    "Bob Myers" <nospamplease@address.invalid> wrote in message news:<SZyfc.3186$c94.723@news.cpqcorp.net>...
    > "Gerrit Holl" <Gerrit@nl.linux.org> wrote in message
    > news:slrnc7slnk.sr7.Gerrit@topjaklont.student.utwente.nl...
    > > Hello,
    > >
    > > I am currently thinking of buying a TFT monitor. What properties should
    > > I take into account when deciding which one to buy? Obviously, the
    > > physical size. I am torn between 17" and 19". But I've heard a 17"
    > > TFT monitor displays more than a 17" CRT monitor. Is this correct?
    > >
    > > What is the difference between TFT and LCD?
    >
    > "TFT" means "thin-film transistor," which is the technology used
    > to create the active-matrix array on modern monitor LCDs. Virtually
    > all current monitors are of the active-matrix TFT type, so within this
    > market there is no difference.
    >
    > A 17" LCD monitor DOES provide a larger active area than a 17"
    > CRT, as it has been traditional in the LCD industry to base the diagonal
    > size measurement on the active area, while CRT diagonals give the
    > overall CRT size. A "17 inch CRT" will typically have an active area
    > that's about 15.5" or so in diagonal size.
    >
    > >
    > > At [0], different monitors are offered. Prices for 17" 1280x1024 25ms range
    > > from €429 to €650. The difference I see is that the latter has "700:1,
    > > 270cd dvi, zilver". When do I notice this?
    >
    > 700:1 is the contrast ratio; within reason, higher is better, although
    > in recent years these specs have become almost meaningless due to
    > the tendency to quote only "dark ambient" numbers. Anything over
    > 300:1 in actual delivered contrast would be outstanding, but very,
    > very few products actually provide this level of performance in normal
    > office or home lighting conditions (contrast in these situations is
    > dominated by the reflection of ambient light from the screen, not
    > from the inherent white/black contrast of the display device itself).
    > 270 cd/m^2 (read "candelas per square meter", also sometimes referred
    > to by the older term, "nits") is the measure of the luminance or
    > "brightness" of the display (i.e., how bright white areas are at
    > maximum). This is actually a fairly average level; higher specs may
    > be found, but again take any published specs with a grain of salt - it's
    > far better to actually SEE the display, and go with what YOU like in terms
    > of the overall appearance. "DVI" is the current digital interface standard;
    > it may be useful for you, but only if you have a video source providing a
    > DVI output. I don't know what "zilver" means, unless this is a typo and
    > it meant that the case color was "silver."
    >
    > >
    > > I use my computer to watch television, and read and write texts. What
    > > should I choose?
    >
    > The best advice is to try out the monitor with the sort of images you
    > commonly use, and choose the one that YOU think looks the best.
    > For TV use, you'll want a fairly fast response time - 25 ms or better should
    > be the absolute minimum, and you'll see a distinct improvement if you can
    > get to 16 ms or less. TV viewing also generally calls for high brightness
    > (over 300 cd/m^2 would be good), and good color performance - if you
    > want accurate color in your TV images, the monitor should be able to
    > be set to a white point of 6500K (it will be listed in this form as "color
    > temperature"), and the larger the color gamut (expressed as a percentage
    > of the standard NTSC or EBU gamuts) the better.
    >
    > Bob M.

    I can't add much to Bob M's excellent response, but here goes. You
    might want to consider a TFT with a dual-input option. That's a TFT
    with both analogue and digital inputs, so you can connect and display
    2 computers on 1 monitor. Also, a pivoting screen may be useful. This
    means you can rotate the monitor to portrait orientation to display
    more information without using scrollbars. Be careful of dead pixels.
    These are pixels that stay on or off all the time. Sometimes you need
    at least 5 dead pixels to get a refund/replacement. I guess the
    cheaper monitors have a higher chance of dead pixels. You might also
    want to consider pixel pitch and resolution. Most TFTs look sharp
    only at their native resolution.
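
    [Dead pixels are easy to check for yourself: fill the screen with
    solid colours and look for pixels that refuse to change. A minimal
    sketch of such a tester in Python, using the standard tkinter
    toolkit - the colour list and key binding are arbitrary choices, and
    utilities like the LCDTest program mentioned below do the same job:

        import tkinter as tk

        # Stuck-off (dead) pixels show against white and the primaries;
        # stuck-on pixels show against black.
        COLOURS = ["black", "white", "red", "green", "blue"]

        root = tk.Tk()
        root.attributes("-fullscreen", True)
        root.configure(bg=COLOURS[0])
        index = 0

        def next_colour(event):
            global index
            index += 1
            if index < len(COLOURS):
                root.configure(bg=COLOURS[index])
            else:
                root.destroy()   # exit after the last colour

        root.bind("<Key>", next_colour)  # any key advances the colour
        root.mainloop()

    Inspect the whole panel at each colour before pressing a key.]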
  3. Archived from groups: comp.sys.ibm.pc.hardware.video

    Gerrit Holl <Gerrit@nl.linux.org> wrote in message news:<slrnc7slnk.sr7.Gerrit@topjaklont.student.utwente.nl>...
    > Hello,
    >
    > I am currently thinking of buying a TFT monitor. What properties should
    > I take into account when deciding which one to buy? Obviously, the
    > physical size. I am torn between 17" and 19". But I've heard a 17"
    > TFT monitor displays more than a 17" CRT monitor. Is this correct?
    >
    > What is the difference between TFT and LCD?
    >
    > At [0], different monitors are offered. Prices for 17" 1280x1024 25ms range
    > from €429 to €650. The difference I see is that the latter has "700:1,
    > 270cd dvi, zilver". When do I notice this?
    >
    > I use my computer to watch television, and read and write texts. What
    > should I choose?
    >
    > yours,
    > Gerrit.
    >
    > [0] http://www.utwente.nl/itshop/prijslijsten/Prijslijst%20accessoires/index.html#monitoren_flatpanel

    I've just purchased my first TFT today. I went through the headache
    of visiting numerous computer stores to examine monitors and also read
    dozens of reviews. In the end, I bought a 19" LG L1920P.

    It has 2 USB ports, called upstream and downstream. I'm not quite sure
    what the possibilities are with these ports, as the manual is very
    brief. If anyone can inform me, please do.

    It has 2 video inputs, analogue and digital. I have it connected to
    the digital input as my graphics card has a digital output. Viewing
    high-resolution images on this monitor is a real pleasure. Browsing
    text or surfing the net is also a pleasure due to the sharp text. I
    tried a dvd on the monitor when I was in the store and it looked
    really impressive. My only disappointment so far is playing 3D
    snooker. The monitor ghosts a lot when the snooker balls are rolling
    across the table. I don't know if switching to analogue, which runs
    at 75 Hz, would improve this. Digital runs at 60 Hz.

    I had a look at some other monitors before buying the LG L1920P. For
    example, I looked at a couple of LG 17" monitors, the L1710S and the
    L1720B, but the text on those looked really small and difficult to
    read. They were priced at £299 and £349. When compared to the
    L1920P, they look terrible, so it makes sense to dump the extra coin
    and get the one with the nicest display. Another thing, I checked the
    L1920P for dead pixels using LCDTest and not a single dead pixel was
    found. I've seen dead pixels on cheaper models.
  4. Archived from groups: comp.sys.ibm.pc.hardware.video

    I just ordered the SDM-P232W/B!!! I can't wait. I saw it on display at
    Frys, played a few games with it(shooters) and fell in love with it.

    I saw no ghosting whatsoever... very bright/sharp pictures...

    Redbrick...who loves his CLK

    In article <30ad41a9.0405061434.ac35dc3@posting.google.com>,
    spam_eliminator@lycos.com says...
    >
    >Gerrit Holl <Gerrit@nl.linux.org> wrote in message
    >news:<slrnc7slnk.sr7.Gerrit@topjaklont.student.utwente.nl>...
    >> Hello,
    >>
    >> I am currently thinking of buying a TFT monitor. What properties should
    >> I take into account when deciding which one to buy? Obviously, the
    >> physical size. I am torn between 17" and 19". But I've heard a 17"
    >> TFT monitor displays more than a 17" CRT monitor. Is this correct?
    >>
    >> What is the difference between TFT and LCD?
    >>
    >> At [0], different monitors are offered. Prices for 17" 1280x1024 25ms range
    >> from €429 to €650. The difference I see is that the latter has "700:1,
    >> 270cd dvi, zilver". When do I notice this?
    >>
    >> I use my computer to watch television, and read and write texts. What
    >> should I choose?
    >>
    >> yours,
    >> Gerrit.
    >>
    >> [0] http://www.utwente.nl/itshop/prijslijsten/Prijslijst%20accessoires/index.html#monitoren_flatpanel
    >
    >I've just purchased my first TFT today. I went through the headache
    >of visiting numerous computer stores to examine monitors and also read
    >dozens of reviews. In the end, I bought a 19" LG L1920P.
    >
    >It has 2 USB ports, called upstream and downstream. I'm not quite sure
    >what the possibilities are with these ports, as the manual is very
    >brief. If anyone can inform me, please do.
    >
    >It has 2 video inputs, analogue and digital. I have it connected to
    >the digital input as my graphics card has a digital output. Viewing
    >high-resolution images on this monitor is a real pleasure. Browsing
    >text or surfing the net is also a pleasure due to the sharp text. I
    >tried a dvd on the monitor when I was in the store and it looked
    >really impressive. My only disappointment so far is playing 3D
    >snooker. The monitor ghosts a lot when the snooker balls are rolling
    >across the table. I don't know if switching to analogue, which runs
    >at 75 Hz, would improve this. Digital runs at 60 Hz.
    >
    >I had a look at some other monitors before buying the LG L1920P. For
    >example, I looked at a couple of LG 17" monitors, the L1710S and the
    >L1720B, but the text on those looked really small and difficult to
    >read. They were priced at £299 and £349. When compared to the
    >L1920P, they look terrible, so it makes sense to dump the extra coin
    >and get the one with the nicest display. Another thing, I checked the
    >L1920P for dead pixels using LCDTest and not a single dead pixel was
    >found. I've seen dead pixels on cheaper models.
  5. Archived from groups: comp.sys.ibm.pc.hardware.video

    redbrick@fastermail.com (Redbrick) wrote in message news:<76iqc.858$Wm.666@newssvr23.news.prodigy.com>...
    > I just ordered the SDM-P232W/B!!! I can't wait. I saw it on display at
    > Frys, played a few games with it(shooters) and fell in love with it.
    >
    > I saw no ghosting whatsoever... very bright/sharp pictures...
    >
    > Redbrick...who loves his CLK
    >
    So, how is the SDM-P232W/B?
  6. Archived from groups: comp.sys.ibm.pc.hardware.video

    In article <30ad41a9.0406061047.78fd6249@posting.google.com>,
    spam_eliminator@lycos.com says...
    >
    >redbrick@fastermail.com (Redbrick) wrote in message
    >news:<76iqc.858$Wm.666@newssvr23.news.prodigy.com>...
    >> I just ordered the SDM-P232W/B!!! I can't wait. I saw it on display at
    >> Frys, played a few games with it(shooters) and fell in love with it.
    >>
    >> I saw no ghosting whatsoever... very bright/sharp pictures...
    >>
    >> Redbrick...who loves his CLK
    >>
    >So, how is the SDM-P232W/B?

    It's an incredible monitor. I've been playing Far Cry, Homeworld, FS2004..
    incredible experience. The size of the image really draws you into the
    environment...if that even makes sense....

    Redbrick...who Loves his CLK
  7. Archived from groups: comp.sys.ibm.pc.hardware.video

    redbrick@fastermail.com (Redbrick) wrote in message news:<ymxyc.197$rl2.107@newssvr23.news.prodigy.com>...
    > In article <30ad41a9.0406061047.78fd6249@posting.google.com>,
    > spam_eliminator@lycos.com says...
    > >
    > >redbrick@fastermail.com (Redbrick) wrote in message
    > >news:<76iqc.858$Wm.666@newssvr23.news.prodigy.com>...
    > >> I just ordered the SDM-P232W/B!!! I can't wait. I saw it on display at
    > >> Frys, played a few games with it(shooters) and fell in love with it.
    > >>
    > >> I saw no ghosting whatsoever... very bright/sharp pictures...
    > >>
    > >> Redbrick...who loves his CLK
    > >>
    > >So, how is the SDM-P232W/B?
    >
    > It's an incredible monitor. I've been playing Far Cry, Homeworld, FS2004..
    > incredible experience. The size of the image really draws you into the
    > environment...if that even makes sense....
    >
    > Redbrick...who Loves his CLK

    Yes, it makes a lot of sense. Unlike a CRT, an LCD doesn't have a
    thick piece of glass between you and the graphics, and that makes a
    big difference.
  8. Archived from groups: comp.sys.ibm.pc.hardware.video

    spam_eliminator@lycos.com (.) wrote:

    >Unlike a CRT, an LCD doesn't have a thick
    >piece of glass between you and the graphics, and that makes a big
    >difference.

    Oh. Glass. How horrible.

    Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    the graphics, and that makes a big difference.
  9. Archived from groups: comp.sys.ibm.pc.hardware.video

    chrisv <chrisv@nospam.invalid> wrote in message news:<09jld01dkf102nftnbsu4b1tc8pia4tvi6@4ax.com>...
    > spam_eliminator@lycos.com (.) wrote:
    >
    > >Unlike a CRT, an LCD doesn't have a thick
    > >piece of glass between you and the graphics, and that makes a big
    > >difference.
    >
    > Oh. Glass. How horrible.
    >
    > Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    > the graphics, and that makes a big difference.

    A die hard CRT-er.

    Number of LCD monitors at my local electronics store: 20
    Number of CRT monitors at my local electronics store: 1

    Go figure.
  10. Archived from groups: comp.sys.ibm.pc.hardware.video

    In article <09jld01dkf102nftnbsu4b1tc8pia4tvi6@4ax.com>,
    chrisv <chrisv@nospam.invalid> wrote:
    > spam_eliminator@lycos.com (.) wrote:
    >
    > >Unlike a CRT, an LCD doesn't have a thick
    > >piece of glass between you and the graphics, and that makes a big
    > >difference.
    >
    > Oh. Glass. How horrible.
    >
    > Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    > the graphics, and that makes a big difference.

    Many CRTs have polarizing filters? I have an add-on one, and AFAIK there's
    no warning about not using it with certain monitors (they might all be
    aligned the same way).

    --
    -eben ebQenW1@EtaRmpTabYayU.rIr.OcoPm home.tampabay.rr.com/hactar
    PISCES: Try to avoid any Virgos or Leos with the Ebola virus.
    You are the Lord of the Dance, no matter what those idiots at
    work say. -- Weird Al, _Your Horoscope for Today_
  11. Archived from groups: comp.sys.ibm.pc.hardware.video

    ebenONE@tampabay.ARE-ARE.com.unmunge (Hactar) wrote:

    >In article <09jld01dkf102nftnbsu4b1tc8pia4tvi6@4ax.com>,
    >chrisv <chrisv@nospam.invalid> wrote:
    >> spam_eliminator@lycos.com (.) wrote:
    >>
    >> >Unlike a CRT, an LCD doesn't have a thick
    >> >piece of glass between you and the graphics, and that makes a big
    >> >difference.
    >>
    >> Oh. Glass. How horrible.
    >>
    >> Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    >> the graphics, and that makes a big difference.
    >
    >Many CRTs have polarizing filters?

    Of course not.

    >I have an add-on one, and AFAIK there's
    >no warning about not using it with certain monitors (they might all be
    >aligned the same way).

    It wouldn't make much sense to place an add-on one in front of an LCD
    monitor. If you do, you'll know right away if it's not aligned with
    the monitor's filter, as you wouldn't be able to see anything.
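
    [The underlying relation is Malus's law: light of intensity I_0
    passing through two ideal linear polarizers whose axes are offset by
    an angle \theta emerges with intensity

        I = I_0 \cos^2 \theta ,

    so at \theta = 90^\circ essentially nothing gets through - which is
    why a misaligned add-on filter would black out an LCD's
    already-polarized output.]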
  12. Archived from groups: comp.sys.ibm.pc.hardware.video

    In article <6k4md09psfn0o4dd5lhhsiileqpqchvghe@4ax.com>,
    chrisv <chrisv@nospam.invalid> wrote:
    > ebenONE@tampabay.ARE-ARE.com.unmunge (Hactar) wrote:
    >
    > >In article <09jld01dkf102nftnbsu4b1tc8pia4tvi6@4ax.com>,
    > >chrisv <chrisv@nospam.invalid> wrote:
    > >> spam_eliminator@lycos.com (.) wrote:
    > >>
    > >> >Unlike a CRT, an LCD doesn't have a thick
    > >> >piece of glass between you and the graphics, and that makes a big
    > >> >difference.
    > >>
    > >> Oh. Glass. How horrible.
    > >>
    > >> Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    > >> the graphics, and that makes a big difference.
    > >
    > >Many CRTs have polarizing filters?
    >
    > Of course not.

    Ah, I misread that the other way. Carry on.

    > >I have an add-on one, and AFAIK there's
    > >no warning about not using it with certain monitors (they might all be
    > >aligned the same way).
    >
    > It wouldn't make much sense to place an add-on one in front of an LCD
    > monitor. If you do, you'll know right away if it's not aligned with
    > the monitor's filter, as you wouldn't be able to see anything.

    It's older than LCD monitors; that's why there's no warning against using
    it with one.

    You can get some interesting effects with that and a digital watch, though.

    --
    -eben ebQenW1@EtaRmpTabYayU.rIr.OcoPm home.tampabay.rr.com/hactar

    Hanlon's Razor: "Never attribute to malice that which can be
    adequately explained by stupidity." Derived from Robert Heinlein
  13. Archived from groups: comp.sys.ibm.pc.hardware.video

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:09jld01dkf102nftnbsu4b1tc8pia4tvi6@4ax.com...
    > Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    > the graphics, and that makes a big difference.

    Well, let's see - this means that you won't be able to
    use an LCD monitor with certain polarizing sunglasses.
    Outside of that, what "big difference" did you have in
    mind?

    Bob M.
  14. Archived from groups: comp.sys.ibm.pc.hardware.video

    "Bob Myers" <nospamplease@address.invalid> wrote:

    (Dishonestly-snipped context restored)

    >"chrisv" <chrisv@nospam.invalid> wrote:
    >>
    >>spam_eliminator@lycos.com wrote:
    >>>
    >>>Unlike a CRT, an LCD doesn't have a thick
    >>>piece of glass between you and the graphics, and that
    >>> makes a big difference.
    >>
    >> Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    >> the graphics, and that makes a big difference.
    >
    >Well, let's see - this means that you won't be able to
    >use an LCD monitor with certain polarizing sunglasses.
    >Outside of that, what "big difference" did you have in
    >mind?

    I'll tell you, Bob, right after you tell me what "big difference" the
    thick glass on the front of a CRT makes.

    Doesn't surprise me at all, Bob, that you'd intentionally ignore the
    point I was making (that spam_eliminator's statement was ridiculous),
    and hypocritically attack it, with no mention of what spam_eliminator
    said.
  15. Archived from groups: comp.sys.ibm.pc.hardware.video

    spam_eliminator@lycos.com (.) wrote:

    >chrisv <chrisv@nospam.invalid> wrote:
    >>
    >> spam_eliminator@lycos.com (.) wrote:
    >>
    >> >Unlike a CRT, an LCD doesn't have a thick
    >> >piece of glass between you and the graphics, and that makes a big
    >> >difference.
    >>
    >> Oh. Glass. How horrible.
    >>
    >> Unlike an LCD, a CRT doesn't have a polarizing filter between you and
    >> the graphics, and that makes a big difference.
    >
    >A die hard CRT-er.

    A typical LCD snob.

    I notice you ignored my point, i.e. your statement was stupid.

    >Number of LCD monitors at my local electronics store: 20
    >Number of CRT monitors at my local electronics store: 1

    Well, that's proof that they're better right there! Not.
  16. Archived from groups: comp.sys.ibm.pc.hardware.video

    spam_eliminator@lycos.com (.) wrote:

    I should have given this a better answer. 8)

    >Number of LCD monitors at my local electronics store: 20
    >Number of CRT monitors at my local electronics store: 1

    Oh, I know they're pushing them. More revenue for them, you know.
    Why sell a $100 monitor when you can sell a $300 monitor? Open up a
    Dell flyer, and there's no hint that such a thing as a CRT monitor
    exists!

    >Go figure.

    That's how I feel about a lot of human behavior. Have you ever
    noticed that most people will actually accelerate toward a red light,
    up until the moment they have to apply the brakes? 8)
  17. Archived from groups: comp.sys.ibm.pc.hardware.video

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:oefod053s0rmnegnu4gcuvuh42fr8660uk@4ax.com...
    > I'll tell you, Bob, right after you tell me what "big difference" the
    > thick glass on the front of a CRT makes.

    That one's pretty easy - the thick glass (required by the fact
    that the CRT faceplate has to resist a considerable amount of
    air pressure, and made thicker when you go to "flat-face"
    CRTs) has a couple of very objectionable optical effects -
    especially troublesome in desktop monitor use, since the
    typical viewing distance is on the same order as the screen
    diagonal. The first is the very obvious impact the glass has
    on the visual uniformity of the image - since you ARE using
    the thing with your eye relatively close to the glass, you're
    looking through a good deal more glass when you view the
    sides and corners of the image than you are in the center.
    Monitor CRT glass is pretty much always tinted - with a
    transmission generally in the range of 50-90% - as a
    contrast-enhancement technique. So, you wind up with the
    outside of the image looking quite a bit dimmer than the inside.
    (And since the tint is never perfectly neutral, there are similar
    impacts on the color uniformity.)
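
    [For what it's worth, the reason a tint helps contrast: the image
    crosses the faceplate once, but ambient light reflecting off the
    phosphors crosses it twice, in and back out. With faceplate
    transmission T, a standard back-of-the-envelope argument (not a
    figure from any particular tube) gives

        L_{\text{image}} \propto T , \qquad
        L_{\text{glare}} \propto T^{2} , \qquad
        \frac{L_{\text{image}}}{L_{\text{glare}}} \propto \frac{1}{T} ,

    so a 50% tint roughly doubles the image-to-glare ratio at the cost
    of half the brightness.]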

    The second effect is refractive; since you are looking at the
    outer extremes of the image through the glass at an angle, the
    light from this portion is refracted differently (as you see it)
    than the center. This leads to a number of distortion effects -
    most notably, the appearance of a concave ("bowed inward")
    image when truly flat faceplate glass (which was very thick) was
    first tried a number of years ago (as in the old Zenith "FTM" tube
    design).


    > Doesn't surprise me at all, Bob, that you'd intentionally ignore the
    > point I was making (that spam_eliminator's statement was ridiculous),
    > and hypocritically attack it, with no mention of what spam_eliminator
    > said.

    I didn't ignore the point you were making - it was simply wrong,
    and spam_eliminator was right. Thick faceplate glass is NOT
    desirable, for the reasons I gave above. And I would have thought
    that a CRT expert such as yourself would have already been quite
    aware of such concerns, so they didn't need to be repeated here.
    Guess I was wrong about that, huh?

    Bob M.
  18. Archived from groups: comp.sys.ibm.pc.hardware.video

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:q1kod0dsnjghg223f64g6bd3ukruiq7d7h@4ax.com...
    > Oh, I know they're pushing them. More revenue for them, you know.
    > Why sell a $100 monitor when you can sell a $300 monitor? Open up a
    > Dell flyer, and there's no hint that such a thing as a CRT monitor
    > exists!
    >


    Chris' standard response - CRTs are losing the market just
    because the big bad monitor and system makers pushed a
    clearly inferior product on an ignorant and easily-duped
    public.

    Ho-hum....

    Bob M.
  19. Archived from groups: comp.sys.ibm.pc.hardware.video

    On Sat, 26 Jun 2004 19:23:35 +0000, Bob Myers wrote:

    > "chrisv" <chrisv@nospam.invalid> wrote in message
    > news:oefod053s0rmnegnu4gcuvuh42fr8660uk@4ax.com...
    >> I'll tell you, Bob, right after you tell me what "big difference" the
    >> thick glass on the front of a CRT makes.
    >
    > That one's pretty easy - the thick glass (required by the fact that the
    > CRT faceplate has to resist a considerable amount of air pressure, and
    > made thicker when you go to "flat-face" CRTs) has a couple of very
    > objectionable optical effects - especially troublesome in desktop
    > monitor use, since the typical viewing distance is on the same order as
    > the screen diagonal. The first is the very obvious impact the glass has
    > on the visual uniformity of the image - since you ARE using the thing
    > with your eye relatively close to the glass, you're looking through a
    > good deal more glass when you view the sides and corners of the image
    > than you are in the center.

    I calculate the difference due to this angle at 12% more glass at the
    corners than in the middle, assuming viewing distance = diagonal. This
    small difference in absolute brightness is compressed by the logarithmic
    response of human vision. It's generally not noticeable. No "big
    difference", Bob.

    > Monitor CRT glass is pretty much always tinted - with a transmission
    > generally in the range of 50-90% - as a contrast-enhancement technique.

    LCD's have tint too, Bob.

    > So, you wind up with the
    > outside of the image looking quite a bit dimmer than the inside.

    Bull. "Quite a bit dimmer" indeed.

    > (And since the tint is never perfectly neutral, there are similar
    > impacts on the color uniformity.)

    Yet CRT's are still better than LCD's in this regard. From
    http://website.lineone.net/~del.palmer/lacie.html

    <quote>
    It is also true that CRT monitors are at least 2 to 3 times more accurate
    when it comes to displaying color than LCD screens even when both are
    displaying 24-bit color and both are measured and calibrated with a
    colorimeter. CRT is able to maintain color uniformity across the screen 2
    to 3 times better than an LCD as well. </quote>

    No "big difference" in favor of the LCD, Bob.

    > The second effect is refractive; since you are looking at the outer
    > extremes of the image through the glass at an angle, the light from this
    > portion is refracted differently (as you see it) than the center. This
    > leads to a number of distortion effects - most notably, the appearance
    > of a concave ("bowed inward") image when truly flat faceplate glass
    > (which was very thick) was first tried a number of years ago (as in the
    > old Zenith "FTM" tube design).

    Using 15-year-old CRT designs to support your case, Bob? On my (modern)
    CRT's, there is NOT any refractive distortion that any reasonable person
    would describe as a "big difference". I can't notice any at all. The
    geometric distortions that I CAN notice are not caused by the "thick
    glass".

    >> Doesn't surprise me at all, Bob, that you'd intentionally ignore the
    >> point I was making (that spam_eliminator's statement was ridiculous),
    >> and hypocritically attack it, with no mention of what spam_eliminator
    >> said.
    >
    > I didn't ignore the point you were making - it was simply wrong, and
    > spam_eliminator was right.

    His point is technically correct in that an "ideal" monitor would have
    nothing in-between the graphics and your eyes. However, I maintain that
    his claim that the glass makes a "big" (presumably negative) difference in
    the image quality vs. a LCD is ridiculous, since LCD's are also non-ideal
    in this regard. Sorry, Bob, but my point is quite valid.

    > Thick faceplate glass is NOT
    > desirable, for the reasons I gave above.

    Will you now be fair and admit that LCD's are also imperfect in their
    transmission of the graphics to your eyes, Bob? I'd like to see you
    demonstrate your impartiality by posting a similar critique of the LCD's
    light-transmission compromises.

    > And I would have thought that a CRT expert such as yourself would have
    > already been quite aware of such concerns, so they didn't need to be
    > repeated here.

    I know enough to have recognized spam_eliminator's biased, unfair remark,
    Bob.

    > Guess I was wrong about that, huh?

    Once again, you (intentionally, I think) missed the point, Bob.
  20. Archived from groups: comp.sys.ibm.pc.hardware.video

    On Sat, 26 Jun 2004 19:25:18 +0000, Bob Myers wrote:

    > "chrisv" <chrisv@nospam.invalid> wrote in message
    > news:q1kod0dsnjghg223f64g6bd3ukruiq7d7h@4ax.com...
    >>
    >> Oh, I know they're pushing them. More revenue for them, you know. Why
    >> sell a $100 monitor when you can sell a $300 monitor? Open up a Dell
    >> flyer, and there's no hint that such a thing as a CRT monitor exists!
    >
    > Chris' standard response - CRTs are losing the market just because the
    > big bad monitor and system makers pushed

    I've never claimed CRT's are losing market share "just because" LCD's are
    being pushed. You're being dishonest, Bob.

    > a clearly inferior product on an ignorant and easily-duped public.

    More dishonesty. I don't claim LCD monitors are "clearly inferior". In
    many applications, they are the best choice. If they were the same price
    as CRT monitors, they would be, IMO, the best choice in the majority of
    applications.

    As for my response being "standard", when someone illogically implies that
    "LCD's are what the stores are promoting, therefore they must be better",
    I will counter with what I believe is the real reason for the heavy
    promotion of LCD monitors.

    > Ho-hum....

    Indeed.
  21. Archived from groups: comp.sys.ibm.pc.hardware.video

    dizzy <dizzy@nospam.invalid> wrote in message news:<pan.2004.06.27.22.01.08.351824@nospam.invalid>...
    > On Sat, 26 Jun 2004 19:25:18 +0000, Bob Myers wrote:
    >
    > > "chrisv" <chrisv@nospam.invalid> wrote in message
    > > news:q1kod0dsnjghg223f64g6bd3ukruiq7d7h@4ax.com...
    > >>
    > >> Oh, I know they're pushing them. More revenue for them, you know. Why
    > >> sell a $100 monitor when you can sell a $300 monitor? Open up a Dell
    > >> flyer, and there's no hint that such a thing as a CRT monitor exists!
    > >
    > > Chris' standard response - CRTs are losing the market just because the
    > > big bad monitor and system makers pushed
    >
    > I've never claimed CRT's are losing market share "just because" LCD's are
    > being pushed. You're being dishonest, Bob.
    >
    > > a clearly inferior product on an ignorant and easily-duped public.
    >
    > More dishonesty. I don't claim LCD monitors are "clearly inferior". In
    > many applications, they are the best choice. If they were the same price
    > as CRT monitors, they would be, IMO, the best choice in the majority of
    > applications.
    >
    > As for my response being "standard", when someone illogically implies that
    > "LCD's are what the stores are promoting, therefore they must be better",
    > I will counter with what I believe is the real reason for the heavy
    > promotion of LCD monitors.
    >
    > > Ho-hum....
    >
    > Indeed.


    In my opinion the best thing to do is to buy an LCD TV - this way you
    get the best of both worlds! You need only buy one unit! I myself
    have just bought a RELISYS 17" LCD TV - it's great! Seeing as my
    bedroom is only small, it means I don't need to have both a monitor
    (for the PC) and a TV - fantastic - happy days!
    Check out the selection where I bought it:

    http://www.epinx.com/Audio_Visual/Plasma_and_LCD/LCD_TVs/
  22. Archived from groups: comp.sys.ibm.pc.hardware.video

    dizzy <dizzy@nospam.invalid> wrote in message news:<pan.2004.06.27.22.01.08.351824@nospam.invalid>...
    > On Sat, 26 Jun 2004 19:25:18 +0000, Bob Myers wrote:
    >
    > > "chrisv" <chrisv@nospam.invalid> wrote in message
    > > news:q1kod0dsnjghg223f64g6bd3ukruiq7d7h@4ax.com...
    > >>
    > >> Oh, I know they're pushing them. More revenue for them, you know. Why
    > >> sell a $100 monitor when you can sell a $300 monitor? Open up a Dell
    > >> flyer, and there's no hint that such a thing as a CRT monitor exists!
    > >
    > > Chris' standard response - CRTs are losing the market just because the
    > > big bad monitor and system makers pushed
    >
    > I've never claimed CRT's are losing market share "just because" LCD's are
    > being pushed. You're being dishonest, Bob.
    >
    > > a clearly inferior product on an ignorant and easily-duped public.
    >
    > More dishonesty. I don't claim LCD monitors are "clearly inferior". In
    > many applications, they are the best choice. If they were the same price
    > as CRT monitors, they would be, IMO, the best choice in the majority of
    > applications.
    >
    > As for my response being "standard", when someone illogically implies that
    > "LCD's are what the stores are promoting, therefore they must be better",
    > I will counter with what I believe is the real reason for the heavy
    > promotion of LCD monitors.
    >
    > > Ho-hum....
    >
    > Indeed.


    ...but why lie - CRTs are a thing of the past! They are big
    & ugly! Get a phat LCD like mine!

    http://www.epinx.com/Personal_Computing/Monitors/21inch_plus_TFT_and_LCD/NMD60000810.html
  23. Archived from groups: comp.sys.ibm.pc.hardware.video

    > I've never claimed CRT's are losing market share "just because" LCD's are
    > being pushed. You're being dishonest, Bob.

    Chris, that is a very honest assessment of what your opinion
    APPEARS to be here, based on what you've said. There's no
    "dishonesty" involved - I do not believe one thing and state another
    - and I will thank you to choose your words more carefully in the
    future.

    There appear to be only two broad possibilities, here - either
    CRTs are losing marketing share because the LCD IS "being
    pushed," or they're losing market share because the LCD is the
    superior product. It clearly cannot be that CRTs are losing
    share because they are more costly than the alternative, because
    they aren't. Now, you have already discounted (repeatedly!) the
    possibility that the CRT market share loss is due to the superiority
    of the LCD - so what ELSE could we conclude, other than that
    you believe the LCD is "being pushed"? Please offer another
    alternative, if you have one.

    > More dishonesty. I don't claim LCD monitors are "clearly inferior". In
    > many applications, they are the best choice. If they were the same price
    > as CRT monitors, they would be, IMO, the best choice in the majority of
    > applications.

    More naivete on your part, then - it seems pretty clear from the
    market response that the cost advantages of the CRT do not
    outweigh its shortcomings in the minds of the buying public.
    The CRT, admittedly, will remain the display of choice in the most
    cost-conscious markets and applications - but outside of there, it
    seems pretty clear that its time is just about past.

    Or are you thinking that LCD monitors, for some reason, "should"
    be the same price as their CRT equivalents?


    > As for my response being "standard", when someone illogically implies that
    > "LCD's are what the stores are promoting, therefore they must be better",
    > I will counter with what I believe is the real reason for the heavy
    > promotion of LCD monitors.

    And if only you would be clear that this IS just your belief,
    that would be one thing. It would be very nice, though, if
    that expression of belief were backed up with some actual
    evidence or experience.

    Speaking of "dishonesty," by the way - I would certainly
    hope that you are not thinking of me as the "someone"
    making the above illogical implication...

    Bob M.
  24. Archived from groups: comp.sys.ibm.pc.hardware.video

    "Bob Myers" <nospamplease@address.invalid> wrote:

    (context restored)

    >chrisv wrote:
    >>
    >>Bob Myers wrote:
    >>>
    >>>Chris' standard response - CRTs are losing the market just
    >>>because the big bad monitor and system makers pushed a
    >>>clearly inferior product on an ignorant and easily-duped
    >>>public.
    >>
    >> I've never claimed CRT's are losing market share "just because" LCD's are
    >> being pushed. You're being dishonest, Bob.
    >
    >Chris, that is a very honest assessment of what your opinion
    >APPEARS to be here, based on what you've said. There's no
    >"dishonesty" involved - I do not believe one thing and state another
    >- and I will thank you to choose your words more carefully in the
    >future.

    Sorry, Bob, but there's no way that anyone could have logically
    concluded that my position is what you claimed it is. This is not the
    first time I've had this problem with you. I will thank you to use
    your words more carefully in the future.

    >There appear to be only two broad possibilities, here - either
    >CRTs are losing marketing share because the LCD IS "being
    >pushed," or they're losing market share because the LCD is the
    >superior product.

    There are other factors, Bob, and none of them, including the two
    above, are mutually exclusive.

    >It clearly cannot be that CRTs are losing
    >share because they are more costly than the alternative, because
    >they aren't.

    Correct.

    >Now, you have already discounted (repeatedly!) the
    >possibility that the CRT market share loss is due to the superiority
    >of the LCD

    You pretend to not understand that this is not as simple as one being
    "superior" to the other, Bob. Why?

    >- so what ELSE could we conclude, other than that
    >you believe the LCD is "being pushed"? Please offer another
    >alternative, if you have one.

    As we've already discussed, Bob, the "coolness" factor, and the
    "newer/flatter must be better" factor, are huge. The above factors
    are, in general, NOT tempered by consumer knowledge of the performance
    trade-offs involved.

    I've never claimed that the CRT's are losing market share "just
    because" LCD's are being pushed, Bob. Period.

    >> More dishonesty. I don't claim LCD monitors are "clearly inferior". In
    >> many applications, they are the best choice. If they were the same price
    >> as CRT monitors, they would be, IMO, the best choice in the majority of
    >> applications.
    >
    >More naivete on your part, then

    Incorrect, and another completely unsupported charge from you, Bob.
    What I wrote is entirely reasonable and hardly evidence of any alleged
    "naivete" on my part.

    Before you start with the insults, I think you should at least
    point out what was incorrect or naive in what I said, because your
    response below does NOT dispute what I wrote above.

    > - it seems pretty clear from the
    >market response that the cost advantages of the CRT do not
    >outweigh its shortcomings in the minds of the buying public.

    More illogic from you. What's more popular is not a measure of
    goodness. GM sells more cars than Honda, for example. FWD cars
    outsell RWD cars by orders of magnitude - am I "naive" when I claim
    the RWD is better?

    >The CRT, admittedly, will remain the display of choice in the most
    >cost-conscious markets and applications - but outside of there, it
    >seems pretty clear that its time is just about past.

    For various reasons, some of which are valid, others of which are
    related to extracting more money from our wallets.

    I'll try to make it simple for you, Bob:

    # of reasons > 1

    >Or are you thinking that LCD monitors, for some reason, "should"
    >be the same price as their CRT equivalents?

    My mind reels from the illogic, and the veiled insult that I could be
    foolish enough to think any such thing. Do you think I'm an idiot,
    Bob?

    >> As for my response being "standard", when someone illogically implies that
    >> "LCD's are what the stores are promoting, therefore they must be better",
    >> I will counter with what I believe is the real reason for the heavy
    >> promotion of LCD monitors.
    >
    >And if only you would be clear that this IS just your belief,
    >that would be one thing. It would be very nice, though, if
    >that expression of belief were backed up with some actual
    >evidence or experience.

    Look at a Dell flyer, Bob. This has already been discussed at length.

    >Speaking of "dishonesty," by the way - I would certainly
    >hope that you are not thinking of me as the "someone"
    >making the above illogical implication...

    Having difficulties following the thread, Bob? Spam_eliminator (.) is
    the person who made that implication, to which I then objected.
    However, I find it rather ironic that you would object, even if I was
    thinking of you, since you've made similar errors of logic.
  25. Archived from groups: comp.sys.ibm.pc.hardware.video

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:vcqle0hl430pc0p9b5idh5vsqu8j7tqsjm@4ax.com...
    [...]
    > >- so what ELSE could we conclude, other than that
    > >you believe the LCD is "being pushed"? Please offer another
    > >alternative, if you have one.
    >
    > As we've already discussed, Bob, the "coolness" factor, and the
    > "newer/flatter must be better" factor, are huge. The above factors
    > are, in general, NOT tempered by consumer knowledge of the performance
    > trade-offs involved.

    And what, in YOUR view, is the source of the "coolness" factor,
    or the notion that "newer/flatter" must be better - notions that you
    again apparently disagree with, or at least do not consider to be
    valid reasons for making this choice - if not "the LCD being pushed"?
    The above makes it perfectly clear that YOU feel that the consumer
    is ignorant of the "performance trade-offs involved" (could you
    possibly be a little more specific?), and therefore is making a buying
    decision based on what YOU consider invalid or irrelevant factors.


    > >More naivete on your part, then
    >
    > Incorrect, and another completely unsupported charge from you, Bob.
    > What I wrote is entirely reasonable and hardly evidence of any alleged
    > "naivete" on my part.

    On the contrary - what you have written appears to be ample
    evidence of a lack of familiarity with the monitor market on your
    part.


    > > - it seems pretty clear from the
    > >market response that the cost advantages of the CRT do not
    > >outweigh its shortcomings in the minds of the buying public.
    >
    > More illogic from you. What's more popular is not a measure of
    > goodness. GM sells more cars than Honda, for example. FWD cars
    > outsell RWD cars by orders of magnitude - am I "naive" when I claim
    > the RWD is better?

    At the very least, you're being overly simplistic. If you make the
    flat, unqualified claim that "RWD is better," then you force us to
    ask the question, "better for WHAT?" How YOU define "better"
    very likely does not apply to all usages or applications, and so such
    blanket claims wind up being nonsensical.

    Similarly, to make a blanket statement that "CRTs are better" would
    be equally nonsensical - such things need to at the very least be
    qualified with a statement regarding the specific application in question,
    or noting that such a statement simply constitutes your personal
    preference (which clearly is not subject to objective, quantitative
    analysis).


    > For various reasons, some of which are valid, others of which are
    > related to extracting more money from our wallets.

    There you go again - that unsupported assertion, that flies in the
    face of everything we know about how this market evolved over the
    past 15 years.


    > My mind reels from the illogic, and the veiled insult that I could be
    > foolish-enough to think any such thing. Do you think I'm an idiot,
    > Bob?

    Well, I certainly don't see anything to be gained by giving you an
    honest answer to THAT one...you'll just get angry again.


    Bob M.
  26. Archived from groups: comp.sys.ibm.pc.hardware.video

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:h5ule0hr56jnqnbfgf8p73q9m9oa6islhj@4ax.com...
    > I calculate the difference due to this angle at 12% more glass at the
    > corners than in the middle, assuming viewing distance = diagonal.

    And also assuming, apparently, that the glass IS simply a flat sheet
    rather than the somewhat more complex shape of the typical faceplate.
    Further, you assume that the light emission from the phosphor screen
    is spatially uniform rather than (at best) Lambertian.

    Your figures would be a lot more impressive if they were measured
    rather than simplistically calculated.


    > > Monitor CRT glass is pretty much always tinted - with a transmission
    > > generally in the range of 50-90% - as a contrast-enhancement technique.
    >
    > LCD's have tint too, Bob.

    Gee, Chris, do you think I don't know that? However, please
    check in with the glass makers and see what the transmission of the front
    substrate glass itself is in the typical LCD panel. See what the
    OVERALL transmission is of the entire stack past the LC material
    itself, and compare that with the CRT faceplate.

    > > So, you wind up with the
    > > outside of the image looking quite a bit dimmer than the inside.
    >
    > Bull. "Quite a bit dimmer" indeed.

    Well, since you're clearly possessed of considerable experience in
    this area - what do you believe is the typical luminance uniformity
    profile of a CRT? How does it compare with the typical LCD?


    > Yet CRT's are still better than LCD's in this regard. From
    > http://website.lineone.net/~del.palmer/lacie.html

    I looked at the site; suffice it to say that anyone can post a web
    page, and some know more than others what they're talking
    about.


    > It is also true that CRT monitors are at least 2 to 3 times more
    > accurate when it comes to displaying color than LCD screens even when
    > both are displaying 24-bit color and both are measured and calibrated
    > with a colorimeter. CRT is able to maintain color uniformity across
    > the screen 2 to 3 times better than an LCD as well.

    Which is a rather easy statement to write in the absence of data.
    Want to see some, from a real test lab?


    > Using 15-year-old CRT designs to support your case, Bob? On my
    > (modern) CRT's, there is NOT any refractive distortion that any
    > reasonable person would describe as a "big difference". I can't
    > notice any at all. The geometric distortions that I CAN notice are
    > not caused by the "thick glass".

    Would you care to quantify that? If not, again, would you like to
    see some numbers? (I'm somewhat surprised that you're willing to
    admit to any geometric distortions at ALL, but....)


    > His point is technically correct in that an "ideal" monitor would have
    > nothing in-between the graphics and your eyes. However, I maintain
    > that his claim that the glass makes a "big" (presumably negative)
    > difference in the image quality vs. a LCD is ridiculous, since LCD's
    > are also non-ideal in this regard. Sorry, Bob, but my point is quite
    > valid.

    Let's see - we're agreed that glass makes a difference, and yet you
    don't think there might be a "big difference" between a display
    technology in which that glass is many millimeters thick and one
    in which the comparable glass is under a millimeter thick. Wanna
    try that one again?


    > Will you now be fair and admit that LCD's are also imperfect in their
    > transmission of the graphics to your eyes, Bob? I'd like to see you
    > demonstrate your impartiality by posting a similar critique of the
    > LCD's light-transmission compromises.

    Of course; to use YOUR favorite debate tactic, it is "dishonest"
    of you to imply that I have ever said that the LCD represents
    anything remotely like a "perfect" display technology. However,
    I am very certain that I have a considerably better understanding of
    the sources - and likelihood of correction - of the imperfections
    in the two technologies than you do.

    Bob M.
  28. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "Bob Myers" <nospamplease@address.invalid> wrote:

    >> > So, you wind up with the
    >> > outside of the image looking quite a bit dimmer than the inside.
    >>
    >> Bull. "Quite a bit dimmer" indeed.
    >
    >Well, since you're clearly possessed of considerable experience in
    >this area - what do you believe is the typical luminance uniformity
    >profile of a CRT? How does it compare with the typical LCD?

    See below, Bob.

    >> Yet CRT's are still better than LCD's in this regard. From
    >> http://website.lineone.net/~del.palmer/lacie.html
    >
    >I looked at the site; suffice it to say that anyone can post a web
    >page, and some authors know what they're talking about better than
    >others.

    The text I quoted from the link above is attributed to Lacie, a
    respected manufacturer of professional display monitors. Hardly just
    "anyone", Bob.

    Here's something else - a quote from NEC/Mitsubishi, found at:
    http://www.necmitsubishi.com/support/css/monitortechguide/index11.htm

    <quote>
    Based on the current core technologies, CRT monitors are able to
    display a wider color space than LCD monitors and deliver more
    consistent brightness uniformity throughout the screen.
    </quote>

    Gee, Bob, that seems to blow your entire argument, that the CRT's
    thick glass causes it to have inferior luminance uniformity, right out
    of the water.
  29. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "Bob Myers" <nospamplease@address.invalid> wrote:

    >"chrisv" <chrisv@nospam.invalid> wrote:
    >>
    >> >More naivete on your part, then
    >>
    >> Incorrect, and another completely unsupported charge from you, Bob.
    >> What I wrote is entirely reasonable and hardly evidence of any alleged
    >> "naivete" on my part.
    >
    >On the contrary - what you have written appears to be ample
    >evidence of a lack of familiarity with the monitor market on your
    >part.

    Repeating your illogical, antagonistic conclusions does not make them
    any more true, Bob. I note that, once again, despite my direct
    request, you failed to actually dispute anything that I said in the
    paragraph in question. Instead you again chose creative editing as
    your "solution" of choice.

    In any case, Bob, as much as you'd like to change the subject, the
    issue of this little sub-thread is the allegations that you made
    regarding my position, when you wrote "Chris' standard response - CRTs
    are losing the market just because the big bad monitor and system
    makers pushed a clearly inferior product on an ignorant and
    easily-duped public."

    The rest is mostly your flopping-around trying to justify these
    illogical, unsubstantiated, and untrue allegations.

    >> > - it seems pretty clear from the
    >> >market response that the cost advantages of the CRT do not
    >> >outweigh its shortcomings in the minds of the buying public.
    >>
    >> More illogic from you. What's more popular is not a measure of
    >> goodness. GM sells more cars than Honda, for example. FWD cars
    >> outsell RWD cars by orders of magnitude - am I "naive" when I claim
    >> the RWD is better?
    >
    >At the very least, you're being overly simplistic.

    That's rather ironic coming from you, Bob, who has been attempting to
    over-simplify the monitor market, such as when you wrote "There appear
    to be only two broad possibilities, here - either CRTs are losing
    marketing share because the LCD IS "being pushed," or they're losing
    market share because the LCD is the superior product."

    >If you make the
    >flat, unqualified claim that "RWD is better," then you force us to
    >ask the question, "better for WHAT?" How YOU define "better"
    >very likely does not apply to all usages or applications, and so such
    >blanket claims wind up being nonsensical.
    >Similarly, to make a blanket statement that "CRTs are better" would
    >be equally nonsensical - such things need to at the very least be
    >qualified with a statement regarding the specific application in question,
    >or noting that such a statement simply constitutes your personal
    >preference (which clearly is not subject to objective, quantitative
    >analysis).

    No kidding, Bob, this is the same point I've repeatedly tried to make
    about the monitor market, but you just come back with more of your
    self-serving, illogical oversimplifications. Go back and read my
    last post (message ID: <vcqle0hl430pc0p9b5idh5vsqu8j7tqsjm@4ax.com>)
    to see abundant evidence of what I am talking about.

    I hope, now that you appear to understand these concepts, that we
    won't see your erroneous oversimplifications again.

    Maybe we'll also see an end to your misrepresenting my position, as
    you did when you said that I think that LCD's are a "clearly inferior
    product", and that I think that LCD's are gaining market share "just
    because" they are being pushed.

    Face it, Bob, no matter how much convoluted illogic you throw at it,
    I've never said anything which would lead anyone to logically conclude
    that I thought either of those things. The situation is obviously too
    complex for "clearly inferior" and "just because".

    >> For various reasons, some of which are valid, others of which are
    >> related to extracting more money from our wallets.
    >
    >There you go again - that unsupported assertion, that flies in the
    >face of everything we know about how this market evolved over the
    >past 15 years.

    LOL. Sure, Bob, it's got nothing to do with money. They have no
    motivation at all to sell you a $300 monitor instead of a $100
    monitor. It's simply not a factor. </sarcasm>

    Sheesh. Talk about naive...

    >> My mind reels from the illogic, and the veiled insult that I could be
    >> foolish enough to think any such thing. Do you think I'm an idiot,
    >> Bob?
    >
    >Well, I certainly don't see anything to be gained by giving you an
    >honest answer to THAT one...you'll just get angry again.

    Angry if you arrived at yet another illogical, unsupported conclusion,
    Bob? No, but I do think your subtle ad hominem attacks are worth
    noting...
  30. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "Bob Myers" <nospamplease@address.invalid> wrote:

    >chrisv wrote:
    >>
    >> Will you now be fair and admit that LCD's are also imperfect in their
    >> transmission of the graphics to your eyes, Bob? I'd like to see you
    >> demonstrate your impartiality by posting a similar critique of the
    >> LCD's light-transmission compromises.
    >
    >Of course; to use YOUR favorite debate tactic, it is "dishonest"
    >of you to imply that I have ever said that the LCD represents
    >anything remotely like a "perfect" display technology.

    I never implied any such thing, Bob. I simply think it's less than
    fair of you to nit-pick the CRT's flaws, while denying us the benefit
    of your expert critique of the LCD's imperfections "in their
    transmission of the graphics to your eyes".

    The above is NOT encompassing of LCD display technology, much less a
    claim that your omissions imply that you think LCD's are "perfect".

    Really, any talk of something being "perfect" is absurd on its face.
    Obviously, I would never imply something so ridiculous, and that makes
    your above claim ridiculous.

    No, Bob, unlike many of the frustrated people that I face-off with, I
    put forth a clean, honest, logical argument, which of course is what
    frustrates the other guy, seemingly forcing him, in his eagerness to
    "beat" me, into illogical arguments and sometimes worse.

    >However,
    >I am very certain that I have a considerably better understanding of
    >the sources - and likelihood of correction - of the imperfections
    >in the two technologies than you do.

    No doubt. But anyone with a pair of eyes and some common sense can
    reasonably disagree with spam_eliminator's "big difference" statement.
  31. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:2b2re0tv9f4kbbi83s6kpkgckvrg4jv3l7@4ax.com...
    > >Well, since you're clearly possessed of considerable experience in
    > >this area - what do you believe is the typical luminance uniformity
    > >profile of a CRT? How does it compare with the typical LCD?
    >
    > See below, Bob.

    Y'know, I looked and I looked, and I still didn't see anything
    on what YOU think is the typical luminance uniformity profile
    of a CRT or an LCD, or where you got that impression. You've
    found a fair number of subjective comments on various web sites
    - some of which appear to have been written by people who
    know what they're talking about - but still not a shred of objective
    data. So where is it?


    > The text I quoted from the link above is attributed to Lacie, a
    > respected manufacturer of professional display monitors. Hardly just
    > "anyone", Bob.

    Y'know, Chris, I generally hate to play this card, but I myself
    am "hardly just anyone" either. What is it you think I DO for
    a living, and for that matter have done for the past 20 years?

    By the way, LaCie (not "Lacie," please) is NOT a manufacturer of
    monitors. They are a reseller, similar in that business model to a
    great many other companies. (What they manufacture is storage peripherals,
    with displays as a sideline business.) The text you quoted is still
    out of date in many regards, is a great oversimplification in others,
    and in some places is downright erroneous - I don't care WHO
    wrote it. Believe it or not, not everything you read on the web is
    correct! Just for one example - what does "maintaining color uniformity
    across the screen 2 to 3 times better than an LCD" MEAN, exactly?
    As measured how? In what color space? Under what ambient
    conditions? With what equipment, image content, display settings,
    etc.? The statement as given is meaningless at best. The site you
    referenced is full of such simplistic, nonsensical, and unsupported
    pronouncements. Here's another one: "CRT monitors reproduce
    color temperature much more accurately than LCD monitors. This
    is especially important to those working in the 5000K range when
    working with pre-press and color."

    This statement is nonsense on the face of it - "color temperature"
    refers to the setting of the white point of the display - the color
    you see when the RGB inputs are all set to their maximum value.
    CRT monitors may offer multiple white point settings, but the typical
    default white of a CRT is 9300K - for the simple reason that this
    gives a "brighter"-appearing white, by virtue of having excess blue.
    It is a noticeably bluish white, and not one which corresponds well
    to any industry color standard. TV, for example, is defined around
    a 6500K point (specifically, the CIE "D65" illuminant, which is not
    quite on the black-body temperature curve), while many document
    or pre-press apps will use the 5000 or 5500K points as these are
    more of a "paper white." But the notion that there is something
    inherent in LCD technology that would prevent the use of these
    points is just silly - the native white point of such displays, unlike the
    CRT, is set by the white point of the backlight and the characteristics
    of the color filters. So they CAN be designed to match pretty much
    any point you need. LCD panels for the mainstream monitor industry
    have generally tried to match the mainstream CRT defaults, and so
    have been built to the 9300K or 6500K points - but there is certainly
    nothing stopping someone from making 5000K or 5500K LCDs,
    and - surprise, surprise! - such devices aimed specifically at the
    pre-press and similar markets have done just that. In fact, as of
    this year, solid-state backlighting is starting to make the LCD the
    display of choice for color-critical document applications, as the gamut
    of such displays will far exceed that which is possible with CRT
    phosphors - out to well in excess of the NTSC standard gamut, and
    a very close match to the Adobe RGB space.
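
    (Since a "color temperature" here is just a chromaticity target, a
    quick Python sketch of the published CIE daylight-locus approximation
    makes the point - the white points being argued about are simply (x, y)
    coordinates that a backlight-plus-filter design can be aimed at:)

        def daylight_xy(cct):
            """Approximate CIE daylight-locus chromaticity for a CCT in kelvin."""
            t = 1000.0 / cct
            if 4000 <= cct <= 7000:
                x = 0.244063 + 0.09911 * t + 2.9678 * t**2 - 4.6070 * t**3
            elif 7000 < cct <= 25000:
                x = 0.237040 + 0.24748 * t + 1.9018 * t**2 - 2.0064 * t**3
            else:
                raise ValueError("CIE daylight formula covers 4000-25000 K")
            y = -3.000 * x * x + 2.870 * x - 0.275
            return x, y

        for cct in (5000, 5500, 6500, 9300):   # the white points discussed above
            x, y = daylight_xy(cct)
            print(f"{cct:5d} K -> x = {x:.4f}, y = {y:.4f}")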


    > Based on the current core technologies, CRT monitors are able to
    > display a wider color space than LCD monitors and deliver more
    > consistent brightness uniformity throughout the screen.

    Out of date information. You REALLY should check the actual
    numbers on current products before you believe everything you
    read.

    > Gee, Bob, that seems to blow your entire argument, that the CRT's
    > thick glass causes it to have inferior luminance uniformity, right out
    > of the water.

    And gee, NOW who's being - to again use your favorite word -
    "dishonest" here. I never said that the CRT glass was the cause of
    inferior luminance uniformity, only that this IS a valid concern that
    comes with thicker faceplates. These DO contribute to perceived
    uniformity problems - but they're not the only factor. If you understood
    how a CRT operates better than you apparently do, you would
    realize that there is an unavoidable loss of luminance from center to
    edges, with the overall non-uniformity often as bad as 60-70%
    (edges of the image as compared to the center luminance), and
    THAT'S just in luminance alone. Next, if you like, we can discuss
    a little more sophisticated measurement of perceived uniformity, such
    as looking at the delta-E* measurements.

    Bob M.
  32. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:49bre019iaa9svp6lvu2p6prp8aj8vf3g0@4ax.com...
    > I never implied any such thing, Bob. I simply think it's less than
    > fair of you to nit-pick the CRT's flaws, while denying us the benefit
    > of your expert critique of the LCD's imperfections "in their
    > transmission of the graphics to your eyes".

    Ask and ye shall receive...

    If we want to compare the two technologies in terms of their
    performance re commonly-used metrics of image quality, I can
    certainly do that.

    1. Brightness: The CRT suffers from an inherent brightness/spot
    size tradeoff; in other words, it is extremely difficult to produce a
    tube which has both high light output and the spot size needed for
    high-resolution images. CRTs suitable for high-resolution desktop
    monitor use are thus generally limited to somewhere in the 100-200
    cd/m^2 range, while it is not at all uncommon for LCD products
    in this same market to at least double this. Further, increasing
    LCD brightness is, within reason, simply a matter of putting "more
    backlight" into the thing, and does not impact resolution, etc.. So
    the advantage here is to the LCD.

    2. Contrast: In current displays, this is an area basically dominated
    by the ability of the technology in question to produce a good black.
    LCDs, being "light valves," appear at first glance to be disadvantaged
    here, as their performance would depend on completely shutting off
    the backlight from transmission - whereas the CRT can actually be
    set so as to emit no light. However, setting up a CRT so that the
    "black level" of the input signal corresponds to a tube state below
    cutoff is problematic, as it makes the overall response very non-linear.
    To properly set up a CRT for best overall image reproduction in all
    but the simplest black/white applications (such as straight text display)
    requires that the "black" level actually be set slightly above cutoff.
    The bottom line is that a CRT will rarely achieve a delivered contrast
    ratio much above about 100:1 in typical applications, whereas an
    LCD these days will have little problem doubling or tripling this.
    In typical ambient conditions, the APPEARANCE of the two displays
    may be very similar in this regard, despite the quantitative advantage
    of the LCD. So - slight advantage for the LCD.
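
    (To put rough numbers on "delivered" contrast: a small Python sketch.
    The luminances, reflectance, and room illuminance below are illustrative
    assumptions, not measurements - the point is that reflected room light
    adds to both the white and the black the viewer actually sees:)

        import math

        def delivered_contrast(l_white, l_black, ambient_lux, reflectance):
            """Contrast as seen, with diffusely reflected room light added in.

            l_white, l_black in cd/m^2; ambient_lux is the light falling on
            the screen; reflectance is the fraction scattered back at the viewer.
            """
            l_refl = ambient_lux * reflectance / math.pi   # Lambertian reflection
            return (l_white + l_refl) / (l_black + l_refl)

        # Illustrative numbers only: a CRT with black set slightly above cutoff,
        # and an LCD with a 500:1 dark-room panel spec.
        print(f"{delivered_contrast(150, 1.5, 0, 0.05):.0f}:1")    # CRT, dark room: 100:1
        print(f"{delivered_contrast(150, 1.5, 300, 0.05):.0f}:1")  # CRT, 300-lux office: ~25:1
        print(f"{delivered_contrast(250, 0.5, 300, 0.02):.0f}:1")  # LCD, same office: ~105:1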

    3. Geometry, Linearity, Focus: LCDs are fixed-format devices (they
    actually have discrete, addressable physical pixels), whereas the CRT
    is not. Therefore, the geometry, linearity, and "focus" of an LCD is
    unchangeable and essentially "perfect." (Pixel size and location
    errors are on the order of microns, and invisible to the viewer.)
    Clearly, this is not and cannot be the case with the CRT. Advantage:
    LCD.

    4. Viewing angle: State-of-the-art LCDs offer viewing angles of
    160 degrees or better, generally defined as the angle at which the
    contrast ratio decreases below a certain threshold (10:1 is common).
    Further increases are possible, and will appear in the market soon -
    however, clearly the LCD will never quite match the viewing angle
    of a display which actually produces light at the screen surface, such
    as the CRT. (Wait for OLEDs for the real viewing-angle champion!)
    On the other hand, not many people use their displays at extreme
    angles anyway. So a slight advantage here to the CRT.

    5. Resolution: This is among the most misunderstood areas in this
    discussion, and also one which does not lend itself to an apples-to-apples
    comparison between the two technologies. It gets back to the LCD
    being a fixed-format type, where the CRT is not. As a fixed-format
    display, the "resolution" (in the proper sense of the term - how much
    detail can be visibly resolved per unit distance) of the LCD is also
    fixed; you get a certain number of pixels per inch, and that's that.
    A CRT, which does NOT have physical pixels at all (the color
    screen triads are NOT "pixels") does not have such a limitation;
    its ability to resolve detail is limited primarily by the spot size, and,
    in color types, by the pitch of the shadow mask and phosphor
    screen (not QUITE the same thing). That the "look" of the two is
    different, regardless of "resolution," is also indisputable, due to the
    "soft" edges of the CRT spot vs. the sharply-defined pixels of the
    LCD screen - but that is, to a very large degree, a matter of personal
    taste. In terms of the absolute limits in resolution for the two
    technologies: color LCDs have already been demonstrated with
    resolution well in excess of 300 pixels per inch, whereas color
    CRTs are definitely
    struggling to get much above 150 ppi. (Very specialized monochrome
    tubes have been made claiming about 300 ppi, but even those were
    unable to properly resolve single-pixel details at that level.) Due to
    the complexity of this issue, I won't state a clear winner here - but
    there's the information as I have it. Draw your own conclusions.
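
    (For the fixed-format case, "pixels per inch" is simple arithmetic on
    the native format and the diagonal. A short Python helper - the panels
    below are illustrative examples, not specific products:)

        import math

        def ppi(h_pixels, v_pixels, diagonal_inches):
            """Pixel density of a fixed-format display from its native format."""
            return math.hypot(h_pixels, v_pixels) / diagonal_inches

        print(f"{ppi(1280, 1024, 17.0):.0f} ppi")   # 17" SXGA panel:   ~96 ppi
        print(f"{ppi(1600, 1200, 20.1):.0f} ppi")   # 20.1" UXGA panel: ~100 ppi
        print(f"{ppi(3840, 2400, 22.2):.0f} ppi")   # 22.2" 3840x2400:  ~204 ppi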

    6. Color convergence/purity: Short and sweet - the CRT is subject
    to problems in these areas (and these are MAJOR areas of concern
    in terms of the technology's susceptibility to external fields, see
    below), and the LCD simply can't have these problems due to a
    completely different operating principle. Advantage: LCD.

    7. Color accuracy/gamut: Until fairly recently, the CRT was the
    clear winner here, at least in terms of "accuracy" (which primarily
    had to do with the overall luminance response curve of the technology).
    However, this was not due to an inherent theoretical limit in the
    LCD, and as of this year, LCD displays are being produced which
    equal the CRT in response accuracy and greatly exceed it in
    gamut. Advantage: For the moment, in terms of what's out in the
    installed base and mainstream market, the CRT - but this won't
    last.

    8. Brightness/color uniformity: The CRT suffers from an inherent
    loss of luminance as one moves from the center of the screen
    to the edges (there's also a loss of "focus," which can be partially
    compensated for), and is also more subject to convergence/purity
    issues in the extremes as well. (All for the same reason - the
    changing geometry of the electron beam with respect to the screen.)
    In the LCD, color and luminance uniformity issues arise from
    non-uniformities in the light profile coming through the backlight &
    diffuser assembly, possible non-uniformities in the drivers, and to
    some slight degree non-uniformities in the color filter response.
    However, these do NOT inherently have the "good in the center,
    worse at the edges" characteristic of the CRT, and so are often
    less visually objectionable - and again, all can be addressed in the
    design (if the customer is willing to pay the price), as opposed to
    the CRT being inherently constrained by its basic operation.
    Advantage: Unclear - it's easier to make this "look good" in a
    relatively inexpensive CRT vs. a similarly bargain-basement
    LCD - but the best performance at ANY cost will likely be from a
    properly designed LCD product.

    There are a number of other areas of performance not directly
    related to image quality which generally factor into the buying
    decision. These include:

    Cost: Clear advantage for the CRT.

    Weight: Clear advantage for the LCD.

    Physical size (depth): Ditto

    Power: Clear advantage for the LCD.

    Susceptibility to external fields: Clear advantage for the LCD
    (and a MAJOR stumbling block for the CRT in many applications;
    there are numerous cases where the CRT simply is NOT an
    option for this reason alone).

    Emissions: In terms of straight RFI, both CAN be noisy -
    although I have generally found LCD designs easier to
    "clean up" (your mileage may vary). However, in terms of
    X-radiation and low-frequency electric and magnetic fields,
    the LCD is the obvious winner (esp. in areas where these
    are controlled by government regulation or the equivalent.)

    The bottom line is that the LCD already has the advantage in
    the majority of areas which are of concern to most customers, and
    is the only possible choice in some. This has, over the past few
    years, resulted in a split in the CRT market - the CRT is basically
    vanishing as a "mainstream" computer display, and is being
    pushed to the high and low ends of the market. At the high end,
    it is being chosen (in specialized forms) for very high-resolution
    applications where the LCD types are not yet cost-effective, and
    (for a while yet) in color-critical applications (but again, these aren't
    representative of mainstream CRT performance). And the
    CRT remains the display of choice for very, very cost-conscious
    markets (think China and India, for example), since it is still the
    absolute cheapest display possible for PCs, and will be for some
    time. Over the next few years, the LCD will begin to eat away at
    those remaining high-end applications as well, leaving the CRT
    as viable only in those markets where its cost advantages
    outweigh all other considerations.

    Bob M.
  33. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "Bob Myers" <nospamplease@address.invalid> wrote in message
    news:J9kHc.5579$3B3.4419@news.cpqcorp.net...
    > Ask and ye shall receive...

    Ooops - forgot one, and I really shouldn't've, since it's been
    one of the major areas of concern re the LCD over the past
    N years -

    Response time: Again, the two technologies have very, very
    different operating modes. In the case of the CRT, the image
    is "drawn" essentially a pixel at a time (if the thing HAD pixels,
    that is) by a scanned electron beam - whereas in the LCD, the
    pixels are set to their desired light transmittance state a row at
    a time, and pretty much left there until the next frame time.
    Also, over most of their history, LCD materials have had
    relatively slow response times compared to CRT phosphors
    (at least low tens of milliseconds for the LCD, compared to
    nanosecond "on" times followed by a few hundred microseconds'
    persistence for the phosphor, at least for most color CRT
    blends). All of this comes under
    the broader heading of what a display technologist refers to as
    the temporal response of the display.

    Since the CRT phosphor does NOT stay illuminated for long,
    it has to be driven very, very hard - the actual illuminated spot
    (when the beam is striking it) is a good deal brighter than you
    think, and then the light output falls off dramatically. Were it
    not for this high initial spike making for a reasonable average
    (over time) light output, and the persistence of vision, we would
    see CRT images as just a flying spot of bright light, coupled
    with a very low-contrast image. (Which some cameras DO
    see when pointed at a CRT.) On the other hand, the LCD has
    relatively flat light output over time, but until recently suffered from
    much longer on/off transients.

    The bottom line here is that the CRT, until recently, was a
    better match to fast-moving imagery (frame rates on the
    order of tens per second) vs. the LCD, which was best
    suited to the display of relatively static images (which make
    up a good deal of typical PC usage). The flip side of this
    is that the CRT suffers from a severe flicker problem, which
    gets worse with increased brightness (another reason that
    PC CRT displays with the brightness of the current LCD crop
    were never really a good idea) - whereas the LCD provides
    a much more stable image, with essentially no flicker at all.

    Over the last couple of years, however, LCD response times
    (which confusingly are often given as the SUM of the on and
    off times!) have fallen below the frame or field time of video
    needed for good motion rendition (somewhere in the 20-60
    FPS range), and are continuing to decline. Response times
    in the 12-16 ms range are common now, and faster displays
    (down to a few ms response time) have already been
    demonstrated. At this point, the LCD is a very acceptable
    display for video-rate imagery, as evidenced (and driven, for
    that matter) by its entry into the television market.

    Bottom line - advantage: until recently, the CRT, unless you
    couldn't live with the flicker. From here on out, this
    advantage will be basically gone in all but extremely
    specialized applications.
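
    (The arithmetic behind "fallen below the frame or field time" is worth
    spelling out - a trivial Python sketch, using the frame rates and the
    response-time figures quoted above:)

        for fps in (20, 30, 60):
            frame_ms = 1000.0 / fps
            print(f"{fps:2d} fps -> {frame_ms:5.1f} ms per frame")
        # 20 fps -> 50.0 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms.
        # A panel quoted at 12-16 ms (on + off summed) thus completes its
        # transition within a single 60 Hz frame; a 25 ms panel straddles two.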

    To sum up, then, we have the following scorecard:

    Brightness: LCD
    Contrast: LCD
    Geom., etc.: LCD
    View. angl.: CRT (slight)
    Resolution: LCD (slight in mainstream apps)
    Color conv./purity: LCD
    Color acc./gamut: CRT at present; LCD within a year or two
    Lum./color uniformity: Was CRT, now LCD in best case; essentially even now
    for mainstream designs
    Cost: CRT (and will be the winner here for a long time)
    Weight: LCD
    Size (depth): LCD
    Power: LCD
    Field susc.: LCD
    Emissions: LCD
    Resp. time: Even in majority of current applications.


    Yeah, it's pretty hard to see why the market is switching over to the
    LCD....:-)


    Bob M.
  34. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    For what it's worth, the reason why I am working on a TFT/LCD is the DVI-I /
    DVI-D connector; it seems next to impossible to get a decent 1600x1200
    resolution from videocards unless you try half a dozen, and I upgrade now
    and then, and in every single instance in the past 4-5 years 1024x768 is the
    best these devices can do short of a "smoothed" signal, i.e. a blurry /
    poorly defined image at higher resolutions. I know there are people with good
    experiences with DB15 and CRT's, good for you! I had such an experience
    myself with a Matrox Millennium and 486/DX4-100 back in the 1990's.. since
    then the power of graphics cards has increased and analogic image quality
    decreased.. feel free to disagree because I know you will be right, that is
    not in question.

    13w -> bnc gives a sharp, well defined image at 1600x1200, but I don't have
    an IA32-compatible PC with 13w -> bnc and good DACs like those found in SGI
    or SUN workstations, which do the job great. Since the work is on PC (Linux
    and Windows) for me, this means DVI is the way to go to get a good, sharp
    image inexpensively. It's just the price/performance that wins in that regard
    with DVI. I could buy a $2000+ "workstation class" display card, but it would
    cost more than the rest of the workstation (A64 3000+, 1GB, ..)

    Summary: good image quality at an affordable price, at the resolution I like
    to work with (more would be even better, but 1600x1200 has enough real-estate
    to get the job done w/o stacking workspaces). I'm currently using dual
    1600x1200...

    It is inevitable that some smart ass steps in and says: "I got X brand CRT
    and Y brand gfx card and 1600x1200 is sharp as a razor, you're an idiot
    because you don't know this" -- that sort of stuff happens nowadays in
    Usenet ALL THE TIME. The point is that with DVI, I got one thing less to
    worry about when choosing a graphics solution for the next upgrade cycle. I
    know the image will be crisp and sharp, and can concentrate on features,
    performance, robustness and price of the solution. It's not like there is an
    abundance of choice if we are realistic. ATI and NV, that's pretty much it.
    Matrox could fix their drivers, they corrupt simple GDI rendering so that's
    not a very good thing, they're off the game, zap.

    3DLabs might be worth consideration, and Quadro's from NV, but I don't really
    need the features I would be paying for. DVI = inexpensive, good quality. It
    seems to be somehow difficult to make decent analogic high-resolution,
    high-refresh output work with decent image quality. If it is so damn hard, DVI
    with a digital connection is an inexpensive and efficient solution. Why don't
    they make CRT's with DVI connectors? Then the "last mile" in image quality
    would be in the monitor and you could judge the image quality purely on the
    monitor/display device, not:

    - graphics card (dac mainly)
    - cable (poor quality cables exist, ya' know?)
    - display device (=monitor)

    Too many things that can go wrong. With DVI it's only the "last mile" that
    can go wrong (explanation: digital means something either works or doesn't
    work at all-- ok shitty joke, flame me..)

    I'm so worried about color being "the wrong shade", when I am not doing
    printing or other color-sensitive work... if red is not blue or green, I'm
    fine. <- flame me for that as well, very stupid thing to say.. now list
    (statistically 2.89 reasons are mentioned to defeat a silly argument in
    Usenet) 3 reasons why I am a retard for not having the brains to understand
    that poor color definition is a Bad Thing for me. :)

    Hehheh.
  35. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "assaarpa" <redterminator@fap.net> wrote in message
    news:ccu0ll$stn$1@phys-news1.kolumbus.fi...
    > For what it's worth, the reason why I am working on a TFT/LCD is the DVI-I /
    > DVI-D connector; it seems next to impossible to get a decent 1600x1200
    > resolution from videocards unless you try half a dozen, and I upgrade now
    > and then, and in every single instance in the past 4-5 years 1024x768 is the
    > best these devices can do short of a "smoothed" signal, i.e. a blurry /
    > poorly defined image at higher resolutions. I know there are people with good
    > experiences with DB15 and CRT's, good for you!

    No, you point out another important distinction between the
    two technologies. This again gets back to the fact that the LCD
    is a fixed-format technology (which doesn't mean that it is somehow
    "inherently digital" - it's not, and in fact LCDs are analog devices
    at the pixel level), which requires very precise timing, at the pixel
    clock level, in order to properly sample the video data stream
    (whether it's sent in analog OR digital form). The classic "VGA"
    analog interface simply doesn't provide this level of timing information,
    whereas the "digital" interfaces DO (they have to - as you noted,
    they wouldn't work at all without it). There IS a VESA standard
    in the works which would address this shortcoming of the VGA
    interface, but it's uncertain if it will be adopted by the industry or
    if we'll all just wind up making the painful transition to digital.


    > Why don't
    > they make CRT's with DVI connectors?

    Well, quite simply because there would be no advantage to it.
    "Digital" CRT monitor designs have been proposed, but the CRT
    by its nature simply doesn't need that level of timing accuracy,
    since it could not possibly care less about where the "pixels" are
    in the analog video stream. And, believe it or not, it really IS
    the timing that is responsible for most of the problems you see
    with analog interfaces on LCDs, not the "bandwidth" or other
    possible image quality factors. (On the other hand, God knows
    there is certainly no end of the truly lousy VGA cable
    assemblies that are out there, and they do NOT do you any
    good in this regard at all.)


    Bob M.
  36. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    > workstations are solid engineering unlike most consumer graphics cards..
    > and you notice it on the pricetag!)

    And that, my friends, is the crux of the matter at hand. DVI is an
    inexpensive way to eliminate the errors from the graphics card and cabling.
    Only the display device now has to accurately present the signal. TFT does
    this pretty well, due to its nature. CRT's would, agreed, get less benefit
    from this arrangement.. but.. consider this:

    If an sgi->13w->bnc cable can do 1600x1200 without breaking a sweat and a
    geforce6800u->db15->db15 has trouble coming up with adequate 1280x960, using
    the _same_ display (1200nf) in both cases, what's wrong with this picture?

    I tell you what's wrong: it seems to be prohibitively (literally) expensive
    or difficult to make a good digital-to-analogic converter; this is why an
    $8000 workstation does the job properly and a $650 graphics card doesn't.
    This is logical. Now look at DVI: an inexpensive way to do what takes an
    $8K workstation with analogic signaling, and by definition not something a
    $650 graphics card can be expected to do by default. Not for years.

    If you claim that it is just as infeasible to put a good digital-to-analogic
    converter into a CRT as it is to put one in a graphics card, what makes you
    think the CRT can then handle the analogic signal any better than the
    graphics card, considering there can be a shoddy cable between the two to
    add to the confusion? Wouldn't it make much more sense to avoid unnecessary
    signal loss as long as possible, especially when transferring the signal
    digitally is now cheap?

    Let me guess. It would require a shift the market is not 'prepared' for,
    in other words, it is uncertain that the market would follow. Why do you
    think Intel is still selling products based on the x86 architecture?
    Because it is good enough and the market is used to buying it; in short,
    "that's the way things are" -- it is VERY difficult to change "the way
    things are".. human minds resist change, they really do.

    "Technological progress", is not change, it is just rehashing of the old
    ideas for most part. Graphics cards are still based on the same principles
    they were 10 and 20 years ago. This is not surprising, "so why don't you
    think of something better?", indeed that would be interesting question. The
    current graphics architechture 3D cards implement is fundamentally flawed
    when it comes to handling of translucent primitive, for example. Raytracing
    for instance solves this elegantly, however, it is architechture which is
    not as well suited for on-chip implementation as it involves hierarchical
    database with *random*, arbitrary dependent queries! GPU's are executing the
    same instructions for all pixels, this is very deterministic and reasonably
    easy to do in silicon and in parallel.

    Remember the last time NV tried curved surfaces? Way before GeForce's or
    TNT's? What did the market and developers say? They said nothing, or at best
    they said, quote, "DUH?" -- me included. It didn't even fundamentally change
    or break the existing pipeline! This was just an example of the ways the
    human mind, or at least communities, have a natural resistance to new ideas
    (both good and bad, what is needed is luck and good presentation :)

    DVI + CRT? DUH? What a stupid idea, it's been proven that it doesn't offer
    anything over DB15. <- ring a bell? I say it would when done properly. But
    you see the market doesn't care so the vendors don't care. The market has
    TFT's and they sell well. The market wants to re-sell the same thing to us
    over and over again so that profit can be made and people can be employed
    and the economy stays in good shape and everyone will have a good time.

    How many VCR's did you guys buy? What's wrong with the previous ones? They
    did break down, didn't they? Or some new fancy feature you just had to have
    was introduced. You know what I mean. How many of you are still using a VCR?
    How many even have one still in the house, or praytell, plugged in? How many
    found themselves buying movies again on DVD that they had as VHS already,
    because of better image, better sound (AC3 baby! =), extras on the DVD,
    whatever? How many think they won't be buying some of these movies again
    when HIGH DEFINITION DVD (or whatever the next Big Thing in Video will be)
    arrives? Oh, that many? I see a lot of people good at deceiving themselves.

    The point is that there is no incentive to introduce a line of products with
    CRT+DVI, because there isn't a market for it. Few exceptions like me don't
    make a market, unless the prices are orbital. Now, you think I would pay an
    orbital price for something I can get for peanuts? Not very many intelligent
    people would! And _that_ is what I think about this; thanks for trying to
    "prove" me wrong, but you see this is just an opinion, and opinions don't
    necessarily have to be 'correct' to be valid.. I ask you guys one question,
    please answer; I have been supplying a LOT of extra information to back up
    WHAT PRECISELY my opinion is BASED on. Now, this extra information is taken
    apart with clinical accuracy, which I have no objection to. However, it is
    ironic that the original point is completely and utterly avoided and/or not
    even understood (I would be biased towards 'ignored', though).

    :)
  37. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "Bob Myers" <nospamplease@address.invalid> wrote:

    >Ask and ye shall receive...

    Well, actually I asked you to nit-pick the LCD the way you did the
    CRT, but instead I got a summary devoid of extreme nit-picking, which
    I suppose suits me fine, anyway.

    >(snip summary)

    I note that there was nothing in the summary regarding disadvantages
    of the CRT due to its thick glass front. Apparently the effects are
    so small as to not be worthy of your reasonably complete summary.
    This, of course, bolsters my case that the effects of the thick glass
    do indeed NOT cause a "big difference" relative to the LCD.

    In other words, I've been right all along, and you knew it, Bob.
  38. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "assaarpa" <redterminator@fap.net> wrote in message
    news:ccudag$hmb$1@phys-news1.kolumbus.fi...
    > > > Why don't
    > > > they make CRT's with DVI connectors?
    > >
    > > Well, quite simply because there would be no advantage to it.
    >
    > Why not? I get crystal sharp 1600x1200 with 13w -> bnc with 22" SyncMaster
    > 1200NF, same monitor, dozen different DB15 -> BNC or DB15->DB15 cables with
    > over $400 PC graphics cards: image is blurry, out-of-focus, etc. It seems
    > that the analogic is great if the analogic sources are great.

    Or if the analog cabling is "great" - a major problem is that a
    LOT of VGA (HD15) cabling isn't, and in fact isn't even really
    fit to carry baseband TV video. Noting that you can get "crystal
    sharp" images with 13W3 or BNC connectors should be
    taken as an indication that it's not necessarily the analog signal
    sources or the analog interface standard itself that is the limiting
    factor here.

    Again, a VERY major factor in the perceived benefit of the
    "digital" interfaces with LCDs and similar fixed-format displays
    is the presence of much better pixel-level timing information, which
    such displays absolutely require for optimum performance. The
    CRT has no such requirement, and so does not clearly benefit
    from these interfaces. (It makes little difference that the video
    information itself is transmitted in "analog" or "digital" form in
    either display technology; 8-bit accuracy, which is all that's provided
    on the current digital standards, isn't that hard to achieve in
    analog video. And note that both the CRT and LCD are
    fundamentally analog-controlled devices - it's just that in the case
    of the LCD, a digital-to-analog conversion typically takes place
    within the display panel itself.)

    This is not to say that there aren't poor analog video sources -
    there clearly are - however, it really doesn't take all that much
    to clean these up, IF the designer knows what he or she is
    doing.


    Bob M.
  39. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "assaarpa" <redterminator@fap.net> wrote in message
    news:ccuvdf$66d$1@phys-news1.kolumbus.fi...


    > I tell you what's wrong: it seems to be prohibitively (literally) expensive
    > or difficult to make a good digital-to-analogic converter; this is why an
    > $8000 workstation does the job properly and a $650 graphics card doesn't.
    > This is logical. Now look at DVI: an inexpensive way to do what takes an
    > $8K workstation with analogic signaling, and by definition not something a
    > $650 graphics card can be expected to do by default. Not for years.

    The term, first of all, is usually "analog" - I don't believe I've
    ever encountered the construct "analogic" before.

    Second - again, for the CRT (or for that matter, the LCD), the
    digital information must be turned into an analog voltage at some
    point. The difference between the two technologies in this area is
    that for the CRT, this conversion MUST occur at the pixel rate
    (whereas in the LCD, it occurs at essentially the line rate). It is
    clearly not less expensive to include the D/A function in the CRT
    than in the graphics card - and in fact, it will generally be MORE
    expensive, all else being equal, since in the monitor this function
    would have to be done via a discrete component rather than being
    integrated into existing silicon. Further, the present digital interfaces
    have very limited capacity compared to what can be achieved with
    even moderately-good analog video implementations; single-link
    DVI tops out at 24 bits per pixel at a 165 MHz pixel rate - enough
    for 1600 x 1200 at 60 Hz, but not much beyond that. It's not
    hard at all to get well over 200 MHz pixel rates via analog video
    (to 2048 x 1536 and beyond), and further, better-than-8-bit/color
    grayscale performance is achievable with careful design and
    implementation. It's generally better, then, in the CRT case to permit the
    CRT to be as flexible as possible, and place the burden of decent
    output design on the graphics card designer. But those makers will
    deliver only what the market requests, you know.
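
    (A feel for where that 165 MHz single-link ceiling bites, as a Python
    sketch. The blanking totals below are the conventional VESA figures for
    these formats - treat the second one as approximate:)

        def pixel_clock_mhz(h_total, v_total, refresh_hz):
            """Pixel rate = total pixels per frame (incl. blanking) x refresh."""
            return h_total * v_total * refresh_hz / 1e6

        # 1600x1200@60 with its usual VESA totals of 2160 x 1250:
        print(pixel_clock_mhz(2160, 1250, 60))   # 162.0 MHz - just under the link limit
        # 2048x1536@60 with roughly 2800 x 1589 totals:
        print(pixel_clock_mhz(2800, 1589, 60))   # ~267 MHz - far beyond a single link
        # And the rate at which an LCD's column drivers must settle is only
        # the LINE rate: 1250 lines x 60 Hz = 75 kHz, not 162 MHz.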

    As to the "$8000 workstation vs. $650 graphics card" example -
    where do you think the workstation manufacturers get their graphics?

    Bob M.
  40. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    > No, I'm not "being vague". Both VNC and Terminal Services let you use a
    > Unix box as a display for a Windows PC.

    Sounds like quite a bit more expensive a solution for the general user to buy
    an SGI workstation than just getting on with the program and eliminating
    shitty DAC's in PC consumer gfx cards completely out of the picture with DVI
    and whatever you can plug into the other end. Inexpensive and simple solution.
    Even simpler would be if the DAC's and cables were good to begin with, but
    practice has shown that is not the case in the real world we live in: if you
    want high quality you pay. DVI means high quality is affordable, thanks to
    the technology that is being used to solve the problem.

    Since graphics cards now come with DVI as pretty much standard, why would it
    be 'unreasonable' to get rid of the DB15 completely, especially considering
    that TFT's are slowly getting rid of them as we speak? If I connect a PC
    to a Plasma Display, which input port do I choose, do you think, the DVI or
    DB15? Definitely not the component or scart, s-video or, good riddance,
    composite. HDMI is also beginning to show its head in home electronics; some
    DLP projection screens have it and some plasmas have it. In a few years from
    now, who knows, we might begin to see it in PC graphics cards as an option
    as well.

    Your position of defending DB15 is a bit silly; I understand it from the
    backward compatibility point of view, but let's face it, the market is moving
    away from it slowly but steadily. DVI-I still includes DVI-A, so DVI-I
    compliant devices are still able to give out DB15 for years to come. This
    means extra cost, so I don't think the vendors will put too much effort into
    making it crips, and the primary interface will be DVI-D anyway for most
    users, especially in the coming years.

    This is why I am "whining" about it.. ( I myself prefer to call it
    "observing a trend" ).. the reason we are moving in this direction is cost
    efficiency and high quality. The trouble with DVI is the BANDWIDTH. For
    3800x2400 you need two DVI channels; if you want something like that you are
    going to need two separate cables, or one non-standard cable and connector.
    But for mass market uses, DVI is adequate. At the upper limits of the
    bandwidth it beats analogic signaling hands down, and it only needs to be
    precise to the required tolerances; beyond that it's fair game to
    malfunction: that's what digital is all about, thresholds and tolerances.
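
    (That bandwidth point is easy to check. A Python sketch, ignoring
    blanking overhead entirely, so these are optimistic upper bounds; the
    3800x2400 figure is the one used above:)

        LINK_MHZ = 165.0        # single-link DVI pixel-clock ceiling
        pixels = 3800 * 2400    # one frame, blanking ignored (optimistic)

        for links in (1, 2):
            max_fps = links * LINK_MHZ * 1e6 / pixels
            print(f"{links} link(s): at most {max_fps:.0f} frames/s")  # ~18, ~36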

    An analogic display device getting a digital signal would indeed introduce
    inaccuracy of its own: you cannot change a voltage in an instant from one
    value to another; the voltage will go through all values between the two,
    and how sharply the signal settles at the ends is up to the quality of the
    components. Making that precise square waveform is expensive, "or so it
    seems", if I may quote my earlier posts.

    Next thing we know, I will get another angry response about "stop trolling"
    and "stop whining", oh well.. :)


    > > Is this so hard to accept? Give me more than one good reason to switch
    > > back to CRT with PC.
    >
    > Why would I care if you "switch back to CRT with PC"? You're the one
    > whining about being unable to do so.

    I am not whining, pay attention: I am rejoicing that it is inexpensive and
    easy to get a crystal-sharp 1600x1200 image with PC gfx cards. You seem to
    have a problem that there is a solution that works out for me - why so
    critical?

    It's LOTTO to get a decent 1600x1200 with a CRT and anything that is
    associated with DB15. That's a beautiful point: no need for VNC, Terminal
    Services, etc.. buy cheap off-the-shelf hardware and get on with the
    program. Simple and cheap. The way it should be.
  41. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    assaarpa wrote:

    >> No, I'm not "being vague". Both VNC and Terminal Services let you use a
    >> Unix box as a display for a Windows PC.
    >
    > Sounds like quite a bit more expensive a solution for the general user

    I wasn't talking about the "general user", I was talking about _you_.

    > to buy an SGI
    > workstation than just getting on with the program and eliminating shitty
    > DAC's in PC consumer gfx cards completely out of the picture with DVI and
    > whatever you can plug into the other end. Inexpensive and simple solution.

    Again you fail to demonstrate that what can't be done in an "inexpensive"
    video board can be done in an "inexpensive" monitor.

    > Even simpler would be if the DAC's and cables were good to begin with,
    > but practice has shown that is not the case in the real world we live in:
    > if you want high quality you pay. DVI means high quality is affordable,
    > thanks to the technology that is being used to solve the problem.

    What "technology" do you believe "is being used to solve the problem"? DVI
    with an LCD display works aroudn the problem, it doesn't solve it.

    > Since graphics cards now come with DVI as pretty much standard, why would
    > it be 'unreasonable' to get rid of the DB15 completely,

    How does changing the connector improve the situation?

    > especially
    > considering that TFT's are slowly getting rid of them as we speak?

    If you believe that you might want to check the stock at CompUSA and the
    like.

    > If I
    > connect a PC to a Plasma Display, which input port do I choose, do you
    > think, the DVI or DB15?

    What of it? A plasma display can process a digital signal directly, with
    the only analog conversion being the intensity at each pixel.

    > Definitely not the component or scart, s-video or, good
    > riddance, composite.

    So? The only data-grade monitors I know of which have component, s-video,
    composite, or scart inputs are projectors.

    > HDMI is also beginning to show its head in home
    > electronics; some DLP projection screens have it and some plasmas have it.
    > In a few years from now, who knows, we might begin to see it in PC
    > graphics cards as an option as well.

    Why would one need an audio interface on a PC graphics card?

    > Your position of defending DB15 is a bit silly

    What position is that? I am not "defending DB-15". I am stating the _fact_
    that putting the DAC in the CRT and putting a DVI interface on it has not,
    in the real world, proven to be a satisfactory solution. If you believe
    that it is possible to implement such a monitor satisfactorily and that
    there is a market for such monitors, then I suggest you start taking
    advantage of your superior insight into the market and form a company to
    produce them, thus making yourself fabulously wealthy.

    > I understand it from the
    > backward compatibility point of view, but let's face it, the market is
    > moving away from it slowly but steadily. DVI-I still includes DVI-A, so
    > DVI-I compliant devices are still able to give out DB15 for years to come.
    > This means extra cost, so I don't think the vendors will put too much
    > effort into making it crips

    "crips"? What does a street gang have to do with anything?

    > and the primary interface will be DVI-D anyway
    > for most users, especially in the coming years.

    By which time LCDs will be so highly perfected and so cheap that nobody will
    want a CRT anyway.

    > This is why I am "whining" about it.. ( I myself prefer to call it
    > "observing a trend" )..

    Observing at great length and in ad nauseam detail. Fine, there is a trend
    from analog interfaces on non-CRT monitors to digital interfaces. So what?

    When someone tells you that a few years ago there was a similar trend with
    CRT monitors with the results being universally dismal and with a resulting
    reversal in that trend you accuse them of "defending DB-15".

    If you don't like the message there is little point in shooting the
    messenger.

    > the reason we are moving in this direction is cost
    > efficiency and high quality.

    It's only "cost efficient" if you can eliminate digital-to-analog
    conversion. With a CRT, unless you redesign the thing from the ground up
    along different principles from those that have become established in the
    industry over the past 70 years or so, you cannot eliminate that
    conversion.

    > The trouble with DVI is the BANDWIDTH. For
    > 3800x2400 you need two DVI channels; if you want something like that you
    > are going to need two separate cables, or one non-standard cable and
    > connector.

    That's also the trouble with analog. An analog cable that can carry
    3800x2400 without ghosting is not cheap. What of it?

    > But for mass market uses, DVI is adequate.

    Who has claimed otherwise?

    > At the upper limits of the
    > bandwidth it beats analogic signaling hands down, and it only needs to be
    > precise to the required tolerances; beyond that it's fair game to
    > malfunction: that's what digital is all about, thresholds and tolerances.

    Are you Amish?

    > An analogic display device getting a digital signal would indeed introduce
    > inaccuracy of its own: you cannot change a voltage in an instant from one
    > value to another; the voltage will go through all values between the two,
    > and how sharply the signal settles at the ends is up to the quality of the
    > components. Making that precise square waveform is expensive, "or so it
    > seems", if I may quote my earlier posts.

    So what?

    > Next thing we know, I will get another angry response about "stop
    > trolling" and "stop whining", oh well.. :)

    Since you won't accept any suggestions on how to get the result you want
    with the hardware you have, you clearly have no interest in doing anything
    but complaining idly. It may come as a shock to you, but nobody who has
    the power to implement the changes you want in the monitor industry is
    reading this newsgroup with bated breath waiting to gain the benefits of
    your superior insights.

    >> > Is this so hard to accept? Give me more than one good reason to switch
    >> > back to CRT with PC.
    >>
    >> Why would I care if you "switch back to CRT with PC"? You're the one
    >> whining about being unable to do so.
    >
    > I am not whining, pay attention: I am rejoicing that it is inexpensive and
    > easy to get a crystal-sharp 1600x1200 image with PC gfx cards. You seem to
    > have a problem that there is a solution that works out for me - why so
    > critical?

    If there is a solution that works for you then why are you creating these
    vast posts in which you express your dissatisfaction with the hardware that
    you can buy?

    > It's LOTTO to get a decent 1600x1200 with a CRT and anything that is
    > associated with DB15.

    Hardly "lotto". Just not cheap if you want SGI quality. Most people don't
    want SGI quality, and to tell you the truth if I liked to read tiny little
    type I would be quite happy with the 1600x1200 that I get out of my $50 ATI
    board and CRT monitor.

    > That's a beautiful point: no need for VNC, Terminal
    > Services, etc.. buy cheap off-the-shelf hardware and get on with the
    > program. Simple and cheap. The way it should be.

    So figure out a way to make "cheap off the shelf hardware" that does what
    you want and get on with it.


    --
    --John
    Reply to jclarke at ae tee tee global dot net
    (was jclarke at eye bee em dot net)
  42. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    > program. Simple and cheap. The way it should be.

    Oh, and the point: if CRT vendors *1) are interested in my money, they
    should think about how I could transmit a crystal-clear image from the
    latest ATI and NV graphics products *2) to the display, 1600x1200 minimum.
    Until that is adequately solved, don't bother.

    Then a comment about VNC. How do you propose I transfer the framebuffer
    generated by the GPU to the SGI workstation? You DO know that reading from
    GPU local memory to system memory is very, very slow, right? Then there's
    transferring the data over the LAN; you don't think there would be any
    latency? (I have a Gigabit switch and Ethernet adapter in my Windows box,
    but the Octane2 only has Fast Ethernet.) All in all, I wouldn't expect
    performance on par with working locally. I could, of course, jump back and
    forth between the two workstations from one room to the other.
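
    (For the curious, the arithmetic is easy to sketch, assuming raw
    uncompressed 24-bit frames and ideal wire speed with no protocol
    overhead, which is already generous:)

        FRAME_BYTES = 1600 * 1200 * 3          # one 24bpp frame, ~5.5 MiB

        def max_fps(link_mbit):
            """Ceiling on raw frames per second over an ideal link."""
            return (link_mbit * 1e6 / 8) / FRAME_BYTES

        print("Fast Ethernet: %.1f fps" % max_fps(100))    # ~2 fps
        print("Gigabit:       %.1f fps" % max_fps(1000))   # ~22 fps

    Two frames per second over Fast Ethernet, before even counting the slow
    GPU readback.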

    But why bother, when I get crisp output with DVI-D? I am still at a loss
    as to what you are suggesting I do when I have everything I need sorted
    out neatly and cheaply! It doesn't make any sense why you are having such
    a difficult time with this.


    1) As if we could say there are dedicated "CRT" and "TFT" vendors; maybe
    there are one or two, or maybe you could call vendors who dropped CRTs from
    their product lineup "TFT" vendors -- I wouldn't. "CRT vendor" in the
    context of the above sentence merely means any vendor who is selling CRTs
    and wants to compete for my money against a DVI-based solution such as a
    TFT display. I don't claim to be a relevant individual, but I wouldn't say
    that my point of view is unique; that would be flattering myself.

    2) As we all know, most ATI- and NV-based graphics products in the < $1000
    price range have inferior image quality on their analog outputs. I don't
    say I couldn't go with other vendors, but looking at their offerings they
    are either very expensive or don't meet the other criteria these two
    vendors do; too bad about their DACs, though.
  43. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "assaarpa" <redterminator@fap.net> wrote in message
    news:cd11l6$d16$1@phys-news1.kolumbus.fi...
    > Oh, and the point: if CRT vendors *1) are interested in my money, they
    > should think about how I could transmit a crystal-clear image from the
    > latest ATI and NV graphics products *2) to the display, 1600x1200
    > minimum. Until that is adequately solved, don't bother.

    However, this brings up another point which was addressed much
    earlier in this thread - there is VERY little new development going
    on these days in CRT displays, since they are rapidly being displaced
    in all segments of the market (with the exception of the very low-cost
    end) by other technologies. And clearly, those low-end displays are
    NOT going to be an area where you see the introduction of a lot
    of gee-whiz, expensive new features.

    Bob M.
  44. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    Please clarify something.

    Is everything in your life perfect, or is there something you want to
    change?

    If everything in your life is perfect then what are you on about?

    If there is something you want to change, please state clearly and
    succinctly, in one paragraph of fewer than five lines, what it is, so that
    someone of my meager intellect can comprehend the issue.

    assaarpa wrote:

    >> program. Simple and cheap. The way it should be.
    >
    > Oh, and the point: if CRT vendors *1) are interested in my money, they
    > should think about how I could transmit a crystal-clear image from the
    > latest ATI and NV graphics products *2) to the display, 1600x1200
    > minimum. Until that is adequately solved, don't bother.

    I suspect that the CRT vendors couldn't care less if you personally bought
    from them.

    > Then a comment about VNC. How do you propose I transfer the framebuffer
    > generated by the GPU to the SGI workstation? You DO know that reading
    > from GPU local memory to system memory is very, very slow, right? Then
    > there's transferring the data over the LAN; you don't think there would
    > be any latency? (I have a Gigabit switch and Ethernet adapter in my
    > Windows box, but the Octane2 only has Fast Ethernet.) All in all, I
    > wouldn't expect performance on par with working locally. I could, of
    > course, jump back and forth between the two workstations from one room
    > to the other.

    Have you tried it?

    > But why bother, when I get crisp output with DVI-D? I am still at a loss
    > as to what you are suggesting I do when I have everything I need sorted
    > out neatly and cheaply! It doesn't make any sense why you are having
    > such a difficult time with this.

    If you don't have a problem then why do you keep going on about your
    difficulties interfacing your CRT to a PC?

    > 1) As if we could say there are dedicated "CRT" and "TFT" vendors; maybe
    > there are one or two, or maybe you could call vendors who dropped CRTs
    > from their product lineup "TFT" vendors -- I wouldn't. "CRT vendor" in
    > the context of the above sentence merely means any vendor who is selling
    > CRTs and wants to compete for my money against a DVI-based solution such
    > as a TFT display.

    What leads you to believe that anybody but you wants to do this?

    > I don't claim to be a relevant individual, but I wouldn't say
    > that my point of view is unique; that would be flattering myself.

    Are there enough like you to pay for product development?

    > 2) As we all know, most ATI- and NV-based graphics products in the
    > < $1000 price range have inferior image quality on their analog outputs.
    > I don't say I couldn't go with other vendors, but looking at their
    > offerings they are either very expensive or don't meet the other criteria
    > these two vendors do; too bad about their DACs, though.

    If it is your intention to become a market analyst for the display industry,
    don't quit your day job.

    --
    --John
    Reply to jclarke at ae tee tee global dot net
    (was jclarke at eye bee em dot net)
  45. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    > fundamentally analog-controlled devices - it's just that in the case
    > of the LCD, a digital-to-analog conversion typically takes place
    > within the display panel itself.)

    J Clarke says that it is infeasible to have the D/A converter in the
    display; any comments on that?


    > This is not to say that there aren't poor analog video sources -
    > there clearly are - however, it really doesn't take all that much
    > to clean these up, IF the designer knows what he or she is
    > doing.

    A friend of mine "cleaned" his GeForce4 by removing resistors from his
    card's PCB a few years back, which led me to look into articles on the
    topic at the time. The general idea in the article was that the parts were
    there to damp the signal so that the cabling wouldn't emit too much
    interference; RF-emissions regulation related, anyway. The article might
    still be online somewhere.

    I'm not disputing what you say; you are easy to agree with.
  46. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    > As to the "$8000 workstation vs. $650 graphics card" example -

    The Octane2 uses SGI's own designs: V6, V8, V10 and V12.


    > where do you think the workstation manufacturers get their graphics?

    I think they get the graphics from 3DLabs, nVidia, ATI, E&S, IBM... One of
    the differences from consumer-priced products would be the quality of
    engineering for hardware and software (drivers), a different feature set
    enabled in the chip (either a different mask, or functionality externally
    disabled in the PCB, firmware or driver), and little details like that.

    Also, if you want to run a VERY high resolution display, you'd better be
    prepared to spend some cash. Help, mercy! 1600x1200 is practical for
    consumers now; that's why I am happy, not complaining as JC claims. :)
  47. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "chrisv" <chrisv@nospam.invalid> wrote in message
    news:koh8f0pg7tlbv8kj35npn8rlqllim7slnf@4ax.com...
    > Well, actually I asked you to nit-pick the LCD the way you did the
    > CRT, but instead I got a summary devoid of extreme nit-picking, which
    > I suppose suits me fine, anyway.

    If you can think of some relevant "extreme nit-picking" to
    add, please be my guest. If, on the other hand, you think
    anyone here is in any way required to write whatever you
    request, think again.

    > I note that there was nothing in the summary regarding disadvantages
    > of the CRT due to its thick glass front. Apparently the effects are
    > so small as to not be worthy of your reasonably complete summary.

    Obviously, getting into every detail of both technologies would
    require a book. I've done a book before, and it's not all that
    much fun - and it's certainly not something I'm going to be doing
    for the sheer hell of it here.

    > This, of course, bolsters my case that the effects of the thick glass
    > do indeed NOT cause a "big difference" relative to the LCD.

    Boy, when you get hung up on something, you really get hung
    up on it, don't you?

    Is it as "big" a difference as, say, the problems with color purity
    or convergence or susceptibility to external fields? No, not in the
    minds of most customers. Still, the optical effect of the thicker
    faceplate IS a very obvious and visible difference (and yes, it IS
    one that I have known to be a deciding factor in the purchasing
    decision in some cases). But, as with all things, it's going to depend
    a lot on what a particular customer considers to be "big."

    > I've been right all along, and you knew it, Bob.

    Hey, Chris, if it makes you feel good to keep score on this, and
    you think you "got one" here, by all means - knock yourself out.
    It makes very, very little difference to me one way or another.
    You're certainly "right" in that you do not see this as a significant
    difference. You are equally certainly wrong if you believe that
    everyone trying to decide between these types shares that view.
    All I can do is to try to explain what differences exist, where they
    come from, and what if anything can be done to compensate for
    them.

    Bob M.
  48. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "assaarpa" <redterminator@fap.net> wrote in message
    news:cd1sdf$muh$1@phys-news1.kolumbus.fi...
    > Excuse me, the point was that DVI is an inexpensive way to get a crisp
    > signal to the display device. All the better if the device is digital.

    Within its limits, yes. However, I would have to at this point note
    that many of the devices you may think are somehow "digital"
    in reality are not, and are not strongly tied to digital interfaces.
    (Would you be surprised, for example, to learn that a major
    manufacturer of LCD panels and monitors for years produced
    panels which kept the video signal in analog form all the way through
    to the pixel level?)


    > Two sources of signal degradation can be eliminated: the source itself
    > (which is the primary problem these days) and the cable (which is a minor
    > problem overall compared to the signal source being poor to begin with).

    That's not an altogether correct assessment. There are poor
    sources and good sources; there are poor cables and good cables.
    In my experience, there are far more extremely poor cable
    assemblies out there than there are analog-video-capable ICs
    which are beyond all hope.


    > Now the display could have a shot at doing the best possible job it can
    > of presenting the digitally accurate data that comes in.

    Digital does not automatically equate to "accurate." You
    have to look at the specifics of the given implementations
    being compared, both "analog" and "digital."


    > Making a square waveform at high frequency is very difficult, hence
    > expensive. An analog signal will be smoothed and imprecise because of
    > this.

    Why do you believe that a "square waveform" is desirable? Pixels
    are not, in terms of the data itself, to be considered as
    "little squares," but rather as point samples of the intended "original"
    image (even if that image is CG). All that is required of either
    transmission system is that the intended luminance value for each
    video channel can be unambiguously determined for the sample in
    question - which
    generally means simply that the signal be at that value at the defined
    sample time. (Once again, this shows the importance of pixel-level
    timing to a fixed-format display.)
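
    (If that sounds abstract, here is a toy model of it in Python; the
    waveform shape and the numbers are invented for illustration. The edges
    are nowhere near square, yet sampling at each pixel's defined sample
    time recovers the intended values exactly.)

        PIXELS = [0.1, 0.9, 0.5, 0.5, 0.0]   # intended per-pixel luminance
        STEPS = 100                          # waveform points per pixel period

        def slewed_waveform(levels, rise_fraction=0.4):
            """Analog-ish signal: ramps for part of each pixel period, then
            settles at the new level - no square edges anywhere."""
            wave, prev = [], levels[0]
            for level in levels:
                for i in range(STEPS):
                    t = i / float(STEPS)
                    if t < rise_fraction:     # still slewing toward the level
                        wave.append(prev + (level - prev) * (t / rise_fraction))
                    else:                     # settled for the rest of the period
                        wave.append(level)
                prev = level
            return wave

        wave = slewed_waveform(PIXELS)
        # Sample at the center of each pixel period, after the transition:
        samples = [wave[p * STEPS + STEPS // 2] for p in range(len(PIXELS))]
        print(samples == PIXELS)   # True: recovered despite the rounded edges

    The smoothing only becomes a problem once the signal can no longer settle
    within a pixel period.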

    > Digital
    > signaling has tolerances and thresholds which determine how the signal is
    > interpreted: high or low voltage. This is what digital means.

    Well, no, actually what you have described is a particular form
    of "digital" encoding. Most generically, "digital" simply means
    that the transmitted information is being provided in the form of
    numeric values, as opposed to levels analogous to the original
    information (which is what "analog" means). How those numeric
    values are encoded and transmitted varies depending on the
    specific implementation in question. (For instance, what a
    modem works on is not simply "high or low voltages"; nor is
    that a good description of what goes over the air in digital
    television systems. Both of these are undeniably "digital" in
    operation, though.)
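
    (To make that concrete, here is a sketch of the kind of mapping a modem
    might use. This is a generic 16-QAM toy, not any particular standard:
    each symbol carries four bits as one of sixteen amplitude/phase
    combinations, none of which is a plain high-or-low voltage.)

        LEVELS = [-3, -1, 1, 3]   # per-axis amplitude levels

        def qam16_symbol(bits):
            """Map four bits to an (I, Q) amplitude pair."""
            i = LEVELS[bits[0] * 2 + bits[1]]   # first two bits pick the I level
            q = LEVELS[bits[2] * 2 + bits[3]]   # last two bits pick the Q level
            return (i, q)

        print(qam16_symbol([1, 0, 0, 1]))   # (1, -1): digital, but not "high/low"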

    > Still with me? Designing and manufacturing a circuit which does this is
    > inexpensive with current manufacturing processes; the problem solved
    > itself when the processes got better and faster chips could be produced
    > at cheaper prices.

    Yes, but there's nothing particularly efficient about simple binary
    encoding, nor is it necessarily easier to make a chip which will
    transmit information in that form at the required rates (which is
    why, for instance, a single-link DVI connection still doesn't provide
    the data capacity of some analog RGB video connections).


    > How much do you think a Silicon Image DVI controller costs? How much do
    > you think an accurate digital-to-analog converter would cost?

    The total system costs of these two are actually not all that
    different. (Remember that a DVI link running at the spec maximum
    limit is carrying serial data on each of its three pairs at a raw rate
    of slightly over 1.6 Gbits/sec; that is NOT a simple trick to pull
    off...)
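
    (Easy to verify: TMDS sends one 10-bit symbol per pixel clock on each
    pair, so at the 165 MHz spec maximum:)

        pixel_clock_hz = 165e6   # single-link DVI maximum
        bits_per_symbol = 10     # each 8-bit value is TMDS-encoded to 10 bits

        print("%.2f Gbit/s per pair" % (pixel_clock_hz * bits_per_symbol / 1e9))
        # -> 1.65 Gbit/s per pair, the "slightly over 1.6" figure above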


    > > What of it? A plasma display can process a digital signal directly,
    > > with the only analog conversion being the intensity at each pixel.
    >
    > The what that DVI is gaining ground even on television sets.

    Ummmm...I'm not sure what you meant to say there, but if
    it was a supposedly rhetorical question as to why DVI is
    "gaining ground" in the television market, the main reason is
    the fact that it supports a fairly robust content-protection
    system. It is not because the video is better carried in digital
    form (especially for CRT-based televisions).


    > > So? The only data-grade monitors I know of which have component,
    > > S-Video, composite, or SCART inputs are projectors.
    >
    > I was talking more in the context of television use for plasma displays;
    > for instance, the DVI input has better image quality than the DB15. In
    > fact, the DB15 image quality is much worse than S-Video from a DVD
    > set-top box!

    Apples and oranges. An HD-15 (what you're calling a "DB15")
    connection - more commonly called just a "VGA output" - will
    hardly ever be seen carrying video at less than a 31 kHz line rate
    and about a 25 MHz pixel rate, and generally will be outputting
    something a good deal faster. However, an S-Video output is
    by definition not carrying anything beyond baseband "TV" video,
    which is well under 10 MHz total bandwidth (closer to 5 MHz,
    tops, in fact).
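
    (The numbers behind that, using the classic 640x480@60 VGA timing of
    800x525 total including blanking at a 25.175 MHz pixel clock:)

        pixel_clock = 25.175e6               # slowest common VGA pixel rate
        h_total, v_total = 800, 525          # totals including blanking

        line_rate = pixel_clock / h_total    # ~31.5 kHz
        frame_rate = line_rate / v_total     # ~59.9 Hz
        print("line %.1f kHz, frame %.1f Hz" % (line_rate / 1e3, frame_rate))

    Even this slowest-case VGA mode runs a 25 MHz pixel rate, roughly five
    times the total bandwidth of anything S-Video can carry.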


    > > Why would one need an audio interface on a PC graphics card?
    >
    > HDMI is a digital A/V connector, sorry to burst your bubble; you might
    > want to check out:

    Well, yes and no. The version of HDMI appearing on
    certain CE equipment carries audio. There's also a dual-link
    version that was intended for the PC market, which has to date
    never seen much usage and doesn't necessarily provide
    audio support. HDMI, when it does appear on PC products
    these days, is generally there as an interface to "TV" products,
    much as PCs have in the past provided S-Video outputs.


    > That may be true from the *technical* perspective, but I was steering
    > away from the DB15 not because of the theoretical specs of the connector
    > but the poor quality of real-world implementations. My claim is that DVI
    > doesn't leave any room for error in the signal-carrying media, such as
    > the connector or cabling.

    If the spec is followed, no - but then, that's true of any system.
    And Gawd knows there have been plenty of examples of DVI
    implementations that didn't play well with others.


    > The display would be one package: either it would be poor quality, or it
    > would be high quality. Now the burden is put on three distinct parts
    > (display, cable, graphics card), each of which has to use high-quality
    > components so that none of them is the weakest link. DVI allows
    > inexpensive components to be used due to the nature of the technology.

    Not really, with either technology - and it certainly does not
    reduce the total system cost for CRT displays, for reasons
    previously discussed.


    > > That's also the trouble with analog. An analog cable that can carry
    > > 3800x2400 without ghosting is not cheap. What of it?
    >
    > A DVI cable that can is inexpensive. QED.

    Except that you currently need TWO dual-link DVI
    cables (note that this is a total of eight discrete connector
    components, and two cable assemblies each including
    seven individually-shielded pairs) to support this; I can
    guarantee you that neither interface type is "cheap" when
    it comes to supporting these sorts of data rates. Neither
    interface type has an inherent edge over the other in this
    regard. Please note that DVI cable assemblies which
    meet the specification requirements are NOT as
    inexpensive as the cheapest VGA assemblies on the market;
    conversely, for what you spend for a DVI cable, you
    can get a VGA assembly of comparable performance.


    Bob M.
  49. Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

    "assaarpa" <redterminator@fap.net> wrote in message
    news:cd1sne$nc2$1@phys-news1.kolumbus.fi...
    > > fundamentally analog-controlled devices - it's just that in the case
    > > of the LCD, a digital-to-analog conversion typically takes place
    > > within the display panel itself.)
    >
    > J Clarke says that it is infeasible to have the D/A converter in the
    > display; any comments on that?

    Specifically, he said that it wasn't advantageous to have a
    DAC in a CRT monitor, and he is correct about that. In the
    case of an LCD panel, the "DAC" function happens in the
    column drivers - it's a whole different sort of animal, and these
    run much more slowly than the pixel rate due to the nature of
    how an LCD panel is driven.
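
    (Rough arithmetic, ignoring blanking, for a 1280x1024 panel refreshed
    at 60 Hz; the point is the three-orders-of-magnitude gap:)

        width, height, refresh = 1280, 1024, 60

        pixel_rate = width * height * refresh   # serial stream: ~79 MHz
        column_rate = height * refresh          # each column output updates
                                                # once per row time: ~61 kHz
        print("pixel %.0f MHz vs column %.0f kHz"
              % (pixel_rate / 1e6, column_rate / 1e3))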


    > A friend of mine "cleaned" his GeForce4 by removing resistors from his
    > card's PCB a few years back, which led me to look into articles on the
    > topic at the time. The general idea in the article was that the parts
    > were there to damp the signal so that the cabling wouldn't emit too much
    > interference; RF-emissions regulation related, anyway. The article might
    > still be online somewhere.

    Actually, what was removed was most likely small capacitors
    (which in surface mount form, are difficult to distinguish from
    SMT resistors). This can in some cases improve signal rise times,
    but on the other hand it's certainly possible to put EMI-reducing
    filter networks on video outputs that have no visible impact at all
    on the signal (I've done this in several designs). Removing
    RESISTORS from the outputs of video cards, if that's really what's
    being done, generally isn't a good idea, as it is changing the
    cable termination at the source end (which can lead to some
    severe "ghosting" problems in many cases).

    Bob M.