flat panel monitors

Anonymous
December 9, 2004 12:46:48 PM

Archived from groups: alt.comp.hardware,comp.sys.ibm.pc.hardware.video (More info?)

I've heard that flat panel monitors are easier on the eyes. Is this true?
If so, then what would be a good choice for a 17" for gaming?


Anonymous
December 10, 2004 12:57:02 AM

Archived from groups: alt.comp.hardware,comp.sys.ibm.pc.hardware.video (More info?)

On Thu, 9 Dec 2004 09:46:48 -0800, Adam Russell wrote:
> I've heard that flat panel monitors are easier on the eyes. Is this true?
> If so, then what would be a good choice for a 17" for gaming?

Make sure you get a fast response time. There are monitors with 12 ms
now, like the Samsung 710T. I think that's what I'm going to get... I
just haven't decided yet if having a DVI interface is worth another $100.

--
* John Oliver http://www.john-oliver.net/ *
* California gun owners - protect your rights and join the CRPA today! *
* http://www.crpa.org/ Free 3 month trial membership available *
* San Diego shooters come to http://groups.yahoo.com/group/sdshooting/ *
Anonymous
December 14, 2004 3:11:32 AM

Archived from groups: alt.comp.hardware,comp.sys.ibm.pc.hardware.video (More info?)

Adam Russell writes:

> I've heard that flat panel monitors are easier on the eyes.
> Is this true?

Any monitor with a clear, sharp, bright image and minimum flicker (which
depends on your personal sensitivity to flicker) is easy on the eyes.
It can be a CRT or a flat panel, as long as the image is of good
quality.

I try never to skimp on monitors. Eyes are worth taking care of.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 14, 2004 12:35:33 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"John Oliver" <joliver@john-oliver.net> wrote in message
news:slrncrhih9.nco.joliver@ns.sdsitehosting.net...
> just haven't decided yet if having a DVI interface is worth another $100

No question, I wouldn't buy one without it.

Michael
Anonymous
December 14, 2004 12:35:34 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Michael C writes:

> No question, I wouldn't buy one without it.

Is there really that much difference? I have only an analog input to
mine, but even under a loupe the individual pixels are resolved as
sharply as can be. How could a digital interface improve on that?

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 14, 2004 12:35:35 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> No question, I wouldn't buy one without it. [DVI-D]

> Is there really that much difference?

Yes. Several considerations:

There is noise resulting from taking the original
digital raster to analog and back to digital.
This might display, for example as horizontal
artifacts, unstable picture regions, etc.

Square waves? No chance. Think of a pattern of
alternating white/black 1-pix dots. In analog,
these need to exhibit sharp transitions and flat
tops to emulate what you get for free with DVI-D.
Bandwidth limits in the analog channels are apt
to smear this fine detail.

Group delay with analog introduces some risk that
the pixel data won't exactly precisely align with
the LCD triads upon reconstruction. Suppose the
analog signal has a little group delay (time shift)
from the DAC, or in the cable, or in the ADC (or
just one of the colors does). Our hypothetical white
and black dots might become a gray moire morass.

Even what black and white levels are, becomes
uncertain with analog.

I just compared the same screen in HD15 analog and
DVI-D digital on my LCD, and the analog image has
less "contrast", text characters are not as sharp,
and due to the grayscale tracking limits of this
monitor, black characters on white backgrounds have
a tiny annoying pink fringe in analog.

Go DVI-D. By the way, expect images and text to perhaps
be startlingly sharper until you get used to it.
The limitations of analog were providing some
full-screen anti-aliasing at no extra charge.
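
A minimal sketch of that smearing effect in a few lines of Python,
assuming nothing fancier than a moving-average filter standing in for
the bandwidth limit (real DAC/cable/ADC chains behave differently, but
the drift toward gray is the same idea):

    # Simulate a bandwidth-limited "analog" channel acting on a pattern
    # of alternating 1-pixel white (1.0) and black (0.0) dots.
    ideal = [1.0, 0.0] * 8

    def lowpass(samples, taps=3):
        """Crude moving-average FIR standing in for limited analog bandwidth."""
        half = taps // 2
        out = []
        for i in range(len(samples)):
            window = samples[max(0, i - half):i + half + 1]
            out.append(sum(window) / len(window))
        return out

    smeared = lowpass(ideal)
    print("ideal:  ", [round(v, 2) for v in ideal])
    print("smeared:", [round(v, 2) for v in smeared])
    # Every value drifts toward ~0.5: the flat tops and sharp transitions
    # needed to keep the dots distinct are gone.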

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 14, 2004 12:35:36 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> There is noise resulting from taking the original
> digital raster to analog and back to digital.
> This might display, for example as horizontal
> artifacts, unstable picture regions, etc.
>
> Square waves? No chance. Think of a pattern of
> alternating white/black 1-pix dots. In analog,
> these need to exhibit sharp transitions and flat
> tops to emulate what you get for free with DVI-D.
> Bandwidth limits in the analog channels are apt
> to smear this fine detail.

But the panel is just doing an analog to digital conversion, anyway, and
the connection is analog even when it's DVI-D, so doesn't it all just
wash?

The image on mine is really sharp, it seems, and contrast is excellent.
No artifacts even under a loupe. It makes me wonder how much better it
could get.

> Group delay with analog introduces some risk that
> the pixel data won't exactly precisely align with
> the LCD triads upon reconstruction.

Doesn't the panel correct the time base for incoming analog signals or
something, in order to avoid this? Like the TBC in some video
equipment?

> Suppose the
> analog signal has a little group delay (time shift)
> from the DAC, or in the cable, or in the ADC (or
> just one of the colors does). Our hypothetical white
> and black dots might become a gray moire morass.

But the panel could delay everything and resync it to a new time base
and eliminate any movement, like TBCs do for video.

> I just compared the same screen in HD15 analog and
> DVI-D digital on my LCD, and the analog image has
> less "contrast", text characters are not as sharp,
> and due to the grayscale tracking limits of this
> monitor, black characters on white backgrounds have
> a tiny annoying pink fringe in analog.
>
> Go DVI-D. By the way, expect images and text to perhaps
> be startlingly sharper until you get used to it.
> The limitations of analog were providing some
> full-screen anti-aliasing at no extra charge.

I don't know if my video card provides it. I have an NVidia GeForce2
something-or-other, but I didn't notice a DVI-D plug on the card.
There's one on the monitor, though.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 14, 2004 12:35:37 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> But the panel is just doing an analog to digital
> conversion, anyway, ...

No. With a DVI-D connection, the discrete pixel digital
values are preserved from creation in the frame buffer
by the graphics driver all the way out to the individual
pixel drivers for the LCD triads.

> ... and the connection is analog even when it's DVI-D,

TMDS as I recall. Transition-Minimized Digital Signalling.
Ones and zeros. Everything is either a 1, a 0, or ignored.

>> ... group delay ...

> Doesn't the panel correct the time base for incoming
> analog signals or something, in order to avoid this?

I'd like to think so, but I wouldn't assume it.
Clearly, when we feed the monitor a non-native res,
it cannot match pixels, because the rasters don't map.
Interpolation time. It might still perform TBC to assure
that active signal period start/end precisely align with
the ADC's sampling aperture.

>> Go DVI-D.

> I don't know if my video card provides it.

Most cards provide DVI-I ports, which have both one link
of DVI-D digital and RGB analog (sometimes called DVI-A,
plus a breakout-to-HD15 cable for analog use). By DVI-D,
I mean use the card's DVI port, and a DVI cable, and
assure yourself that if both signals are present, the
monitor is using the digital, and not the analog.

I'm using an aftermarket DVI-D cable which doesn't even
have the RGB wires - turned out to not be necessary to
be that drastic, but I didn't know that when I ordered
the cable.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 14, 2004 12:35:38 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> No. With a DVI-D connection, the discrete pixel digital
> values are preserved from creation in the frame buffer
> by the graphics driver all the way out to the individual
> pixel drivers for the LCD triads.

How many levels per pixel? An analog connection can have any number of
levels, depending only on the quality of the connection and hardware.
My card already generates 32-bit color, although my flat panel can't use
all that resolution.

> Most cards provide DVI-I ports, which have both one link
> of DVI-D digital and RGB analog (sometimes called DVI-A,
> plus a breakout-to-HD15 cable for analog use). By DVI-D,
> I mean use the card's DVI port, and a DVI cable, and
> assure yourself that if both signals are present, the
> monitor is using the digital, and not the analog.

I'll look again and see if there's a DVI-D plug, but I rather doubt it.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 14, 2004 12:35:39 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> With a DVI-D connection, the discrete pixel digital
>> values are preserved from creation in the frame buffer
>> by the graphics driver all the way out to the individual
>> pixel drivers for the LCD triads.

> How many levels per pixel?

I haven't looked at the DVI spec since 1.0, but at
that time, single-link DVI was limited to 24 bpp,
or 8-bits per R, G or B.

> My card already generates 32-bit color, although my
> flat panel can't use all that resolution.

It's not clear to me that contemporary LCD panels can
even deliver 24-bit color. They accept such signals,
but what they paint on screen is another matter.

> I'll look again and see if there's a DVI-D plug,
> but I rather doubt it.

The DVI(-I) connector is a D-sub, usually white body,
with an 8x3 grid of pins, plus a 2x2 grid with
cruciform ground planes for the RGB. If the
connector is DVI-D (digital only), it omits the
2x2 grid array and has only one of the ground blades.

If your card only has 15-pin Dsub(s) (usually blue),
then it only has analog video out. You cannot use a
pure digital connection, although you could invest in
a monitor with DVI, and use it in analog mode until
your next graphics card upgrade.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 14, 2004 12:35:40 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> I haven't looked at the DVI spec since 1.0, but at
> that time, single-link DVI was limited to 24 bpp,
> or 8-bits per R, G or B.

24-bit color isn't good enough for some applications. It doesn't
provide enough levels in some parts of the gamut, such as blue (the eye
is extremely sensitive to differences in intensity in the blue end of
the spectrum, so any kind of posterization is very easy to spot).

> It's not clear to me that contemporary LCD panels can
> even deliver 24-bit color. They accept such signals,
> but what they paint on screen is another matter.

True for many cheaper CRTs, too. But it worries me that the standard
apparently was designed in a way that permanently limits it to 24-bit
color.

> The DVI(-I) connector is a D-sub, usually white body,
> with an 8x3 grid of pins, plus a 2x2 grid with
> cruciform ground planes for the RGB. If the
> connector is DVI-D (digital only), it omits the
> 2x2 grid array and has only one of the ground blades.

That doesn't sound familiar at all. The Web references I've found seem
to claim that my card should have this, but I don't see it.

> If your card only has 15-pin Dsub(s) (usually blue),
> then it only has analog video out. You cannot use a
> pure digital connection, although you could invest in
> a monitor with DVI, and use it in analog mode until
> your next graphics card upgrade.

This is what I've done. The performance even with analog input is very
impressive, and ClearType works very well, also, even though it's
supposedly designed for pure digital input.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 14, 2004 12:35:41 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> But it worries me that the [DVI] standard
> apparently was designed in a way that
> permanently limits it to 24-bit color.

Not permanent - it just requires dual-link, and
the pin assignments are already present in the
existing connector.

But yes, DVI was short-sighted. Another obvious
limit was that a single link can only hit 1600x1200
with normal blanking. Reduced timing (common) is
hacked in to hit 1920x1200, but beyond that, dual-link
is again required.
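
Back-of-the-envelope numbers in Python (the total timings below are
approximate VESA figures quoted from memory, not gospel):

    # pixel clock = total pixels per line * total lines * refresh rate
    modes = {
        "1600x1200@60, normal blanking":  (2160, 1250, 60),
        "1920x1200@60, normal blanking":  (2592, 1245, 60),
        "1920x1200@60, reduced blanking": (2080, 1235, 60),
    }
    for name, (h_total, v_total, hz) in modes.items():
        mhz = h_total * v_total * hz / 1e6
        fits = "fits" if mhz <= 165 else "exceeds"
        print(f"{name}: ~{mhz:.0f} MHz ({fits} the 165 MHz single-link limit)")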

>> The DVI(-I) connector is a D-sub, usually ...

> That doesn't sound familiar at all.

Here's a lousy photo of a bulkhead with both
DVI and HD15:
<http://www.hardocp.com/image.html?image=MTA4OTIzNDA3NzZ...;

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 14, 2004 7:03:54 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsiyzfkcvft8z8r@news.individual.net...
> > Mxsmanic <mxsmanic@hotmail.com> wrote:
>
> >> No question, I wouldn't buy one without it. [DVI-D]
>
> > Is there really that much difference?
>
> Yes. Several considerations:

Bob, as much as I hate to disagree with you, I'm afraid
I'd have to vote "maybe" instead. For the most part, the
differences between an analog and a digital interface for
LCD monitors come down to questions of pixel timing,
which really have nothing at all to do with whether the
video information is in digital or analog form.

The main factor in determining how good the displayed
image is going to be with an analog interface is the generation
of the proper clock with which to sample the analog video,
which is a question of both getting the frequency right and
making sure the clock is properly aligned with the incoming
video such that the samples are taken where the "pixels" are
supposed to be. (Being a continuous signal, of course, there
is no information contained within the video itself which
identifies the pixels.) Usually, the clock frequency is obtained
by locking on to the horizontal sync pulses and multiplying THAT
rate up to the assumed pixel rate; getting the alignment correct
(the "phase" adjustment) is a matter of the interface circuitry
making some educated guesses. But if the clock generation can
be done properly, there is very little to be gained by simply having
the pixel information in "digital" form. (And please consider how
truly awful the digital interface would be if the pixel clock information
were removed from it - it would be totally unusable. Hence my
assertion that it is timing, not the encoding of the information, that
is the key difference between these two types of interface.)
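
To make that arithmetic concrete (the timing numbers are the standard
VESA 1280x1024@60 figures, quoted from memory):

    # The monitor can measure the line rate, but it has to ASSUME the
    # total number of pixels per line for the mode it thinks it sees.
    hsync_khz = 63.98        # measured horizontal sync rate
    h_total = 1688           # assumed pixels per line (active + blanking)
    pixel_clock_mhz = hsync_khz * h_total / 1000
    print(f"recovered pixel clock ~ {pixel_clock_mhz:.1f} MHz")   # ~108 MHz
    # Get h_total wrong by a pixel, or misjudge the phase, and every
    # sample lands slightly off its intended "pixel" - which is the
    # real issue, not the analog encoding itself.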

>
> There is noise resulting from taking the original
> digital raster to analog and back to digital.
> This might display, for example as horizontal
> artifacts, unstable picture regions, etc.

Nope; all of the above have to do with the timing of
the pixel sampling process, not with noise in the video.
(Oddly enough, the LCD is NOT inherently a "digital"
device as is often assumed - fundamentally, the control
of the pixel brightness in any LCD is an analog process.
Simply having a discrete pixel array does not somehow
make a display "digital," nor does it necessarily mean that
a "digital" interface would have to be better.

> Square waves? No chance. Think of a pattern of
> alternating white/black 1-pix dots. In analog,
> these need to exhibit sharp transitions and flat
> tops to emulate what you get for free with DVI-D.
> Bandwidth limits in the analog channels are apt
> to smear this fine detail.

If we were talking about a display that actually shows
those edges, you'd have a point - but the LCD doesn't
work that way. Remember, we are dealing with a
SAMPLED analog video stream in this case; if the sample
points happen at the right time (which again is a question
of how well the pixel clock is generated), the pixel values
are taken right "in the middle" of the pixel times - making
the transitions completely irrelevant.

Note that "digital" interfaces also have what is in effect a
"bandwidth" limit (the peak pixel rate which can be supported),
and it is in current interfaces often significantly less than what
can be achieved with an "analog" connection. The single-link
TMDS-based interfaces such as DVI (in its single channel
form) and HDMI are both strictly limited to a pixel rate of
165 MHz, while analog connections (even with the lowly
VGA connector) routinely run with pixel rates in excess of
200 MHz.

> Group delay with analog introduces some risk that
> the pixel data won't exactly precisely align with
> the LCD triads upon reconstruction. Suppose the
> analog signal has a little group delay (time shift)
> from the DAC, or in the cable, or in the ADC (or
> just one of the colors does). Our hypothetical white
> and black dots might become a gray moire morass.

Right - but again, a timing issue, which gets back to the
question of the generation of the sampling clock, not the
encoding of the data (which is really all that the terms "analog"
and "digital" refer to). Again, take the clock away from
a digital interface, and see what THAT gives you.

So the logical question at this point is why no one has ever
bothered to include better timing information on the analog
interfaces. The answer now is: someone has. VESA released
a new analog interface standard this past year which does just
that - it includes a sampling clock reference, additional information
which helps to properly locate the sampling clock with respect
to the video stream, and even a system which makes the
determination of the white and black levels much more accurate.
This is called, oddly enough, the New Analog Video Interface
standard, or simply NAVI. NAVI is supportable on a standard
VGA connector, but the standard also includes the definition of a
new, higher-performance analog connector (similar to the analog
section of a DVI) for higher bandwidth and other features. It's
not clear yet how well NAVI will be accepted in the industry, but
it IS available if anyone chooses to use it.

Bob M.
Anonymous
December 14, 2004 7:03:55 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Bob Myers <nospamplease@address.invalid> wrote:

>>>> No question, I wouldn't buy one without it. [DVI-D]
>>> Is there really that much difference?
>> Yes. Several considerations:

> Bob, as much as I hate to disagree with you, ...

Sorry, but if I'm mistaken, netnews rules require you
to disagree :-)

> For the most part, the differences between an analog
> and a digital interface for LCD monitors come down to
> questions of pixel timing, which really have nothing
> at all to do with whether the video information is in
> digital or analog form.

But there are opportunities for the signal to get
visibly degraded if it goes to analog before it gets
to the LCD panel lattice. In the entirely unscientific
test I just ran, where I saw exactly what I expected to
see, the analog happened to be running through two 2m
lengths of HD15 cable and a KVM switch. The LCD image
went from pixel-perfect to slightly fuzzy, and perhaps
also reduced "contrast".

> (And please consider how truly awful the digital
> interface would be if the pixel clock information
> were removed from it - it would be totally unusable.

Well, that's the classic promise and peril of digital.
It's either as perfect as it ever gets, or it's not
there at all, whereas analog may never be perfect
enough, and opportunities for degradation abound.

>> There is noise resulting from taking the original
>> digital raster to analog and back to digital.
>> This might display, for example as horizontal
>> artifacts, unstable picture regions, etc.
>
> Nope; all of the above have to do with the timing of
> the pixel sampling process, not with noise in the video.

Umm, if the bits in the frame buffer are going thru a
DAC (which can introduce noise and distortion), then
thru a cable (which <ditto>), even if the LCD is not using
an ADC, and is using the analog signal directly, that
extra noise and distortion may show up on screen.

> (Oddly enough, the LCD is NOT inherently a "digital"
> device as is often assumed - fundamentally, the control
> of the pixel brightness in any LCD is an analog process.

I sorta suspected that, but in the DVI-D model, the
signal remains digital until it hits the rows & columns, no?

Does the typical analog-only LCD have a DAC? Or does it
just sample the analog signal and route values to drivers?
My guess is that due to the interpolation required for
handling arbitrary resolutions, there is a local frame
buffer, and the analog video is [re]digitized before
hitting the pel drivers.

> If we were talking about a display that actually shows
> those edges, you'd have a point - but the LCD doesn't
> work that way. Remember, we are dealing with a
> SAMPLED analog video stream in this case; if the sample
> points happen at the right time (which again is a question
> of how well the pixel clock is generated), the pixel values
> are taken right "in the middle" of the pixel times - making
> the transitions completely irrelevant.

Even if the clocks align, there's also the matter of
whether or not the analog signal has completely slewed
to the value needed. If the DAC-cable-ADC path has
bandwidth-limited (softened) the transitions, or
introduced color-to-color skews, that will show up.
I see it, or something like it, doing analog on my LCD.

> ... the New Analog Video Interface standard, or simply
> NAVI. ... It's not clear yet how well NAVI will be
> accepted in the industry, but it IS available if
> anyone chooses to use it.

I suspect it's irrelevant at this point. Analog is
the "economy" graphics connect now, and what we have
is sufficient for the market.

I think it more likely that the analog economy model
will be replaced by a digital economy model, where PC
main RAM is used for frame buffer, and the graphics
"card" (if any) is just a TMDS driver chip with a
DVI-D connector on the bulkhead, something like the
"ADD2" cards I see at <www.molex.com&gt;.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 14, 2004 7:06:31 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:ip9sr0pah4uh0eido84mnj4vov20tadouk@4ax.com...
> But the panel is just doing an analog to digital conversion, anyway, and
> the connection is analog even when it's DVI-D, so doesn't it all just
> wash?

And actually, the panel is THEN doing a digital to analog
conversion; the LCD column drivers are basically just a series of
D/A converters in parallel. The basic drive for an LCD is an
analog voltage.

> The image on mine is really sharp, it seems, and contrast is excellent.
> No artifacts even under a loupe. It makes me wonder how much better it
> could get.

Clearly, in your case, not much at all. You have a monitor with
a well-implemented analog front end.


> Doesn't the panel correct the time base for incoming analog signals or
> something, in order to avoid this? Like the TBC in some video
> equipment?

If the analog front end is doing its job properly, yes. This
comes in the form of aligning the sampling clock with the
incoming video stream.


Bob M.
Anonymous
December 14, 2004 7:18:20 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsiy0uld6ft8z8r@news.individual.net...

> No. With a DVI-D connection, the discrete pixel digital
> values are preserved from creation in the frame buffer
> by the graphics driver all the way out to the individual
> pixel drivers for the LCD triads.

Well, they MIGHT be. In either an analog or digital
interfaced LCD monitor, there is typically a look-up table
in the monitor's front end which converts these values into
somewhat different ones, in order to correct for the rather
S-shaped (as opposed to a nice CRT-like "gamma" curve)
response of the typical LC panel. In any event, though,
whether or not having the information preserved in digital
form is an advantage in terms of accuracy depends solely on
whether or not the analog video signal is generated with, and
can be read with, similar accuracy. 8 bits/color accuracy in
a 0.7 V analog signal says that the value of the LSB is about
0.7/255 = 2.7 mV. At least with a VGA connection, it is
difficult to get that level of accuracy on an instantaneous sample,
but fortunately in a video situation what is actually perceived
is the average of many samples, so this sort of visual
performance is not out of the question. The best test, in
either case, would be to take a look at a "gray scale" test
pattern with the appropriate number of values, and see if you're
satisfied with the result.
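
Spelling that arithmetic out (same values as above):

    full_scale_v = 0.7            # nominal analog video swing
    steps = 2 ** 8 - 1            # 255 steps for 8 bits per color
    lsb_mv = full_scale_v / steps * 1000
    print(f"1 LSB ~ {lsb_mv:.1f} mV")                    # ~2.7 mV
    print(f"needed accuracy ~ +/- {lsb_mv / 2:.1f} mV")  # ~ +/- 1.4 mV per sample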


> TMDS as I recall. Transition-Minimized Digital Signalling.
> Ones and zeros. Everything is either a 1, a 0, or ignored.

Close - Transition Minimized Differential Signalling, referring
to both the encoding method used and the fact that the individual
data connections are current-differential pairs (sort of). But
the notion that everything in a "digital" connection is either a
1 or a 0 or ignored is somewhat misleading; nothing is ignored,
and it's a question of both the original transmitted data AND
noise on the line as to whether the received information will be
interpreted as a 1 or a 0 ("ignored" to the receiver is not possible;
it HAS to be interpreted as a 1 or a 0, as those are the only
possible outputs.) Digital connections are certainly not immune
to noise - they simply respond to it in a different manner. (Analog
degrades more gracefully in the presence of noise, as the LSBs
are effectively lost first; in "digital," everything stays great right
up to the point where the noise margin is exceeded, and then
everything is lost completely.)

>
> >> ... group delay ...
>

> I'd like to think so, but I wouldn't assume it.
> Clearly, when we feed the monitor a non-native res,
> it cannot match pixels, because the rasters don't map.

Again, this is not a distinction between analog and digital
interfaces. In both cases, the incoming video information
is sampled at its "native" mode (i.e., if you have an analog
interface carrying, say, a 1280 x 1024 image, then there will
be 1280 samples taken per active line, no matter what the
panel format is). Image scaling is done later in the pipe, in
both analog- and digital-input cases. (It would be far worse
in the digital case if this were not true.)


Bob M.
Anonymous
December 14, 2004 7:18:21 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Bob Myers <nospamplease@address.invalid> wrote:

> In either an analog or digital interfaced LCD monitor,
> there is typically a look-up table in the monitor's
> front end which converts these values into somewhat
> different ones, in order to correct for the rather
> S-shaped (as opposed to a nice CRT-like "gamma" curve)

The monitor knows that the incoming data will be
pre-compensated to a gamma (log curve) in the 1.8 ... 2.6
range, or maybe be linear (no re-comp).

Why doesn't the look-up more fully adjust out the
S-curve, so that color errors can be corrected
with the simple exponent adjustment of typical graphics
card gamma control menus?

My guess is that because LCD subpixels are just barely
8-bit, a full correction might minimize color errors at
the expense of introducing visible terracing in gradients.

And the solution relies on future 10-bit panels.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 14, 2004 7:22:58 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:p1csr01gsvonbu3om7cb7i9gdd64v9t1fq@4ax.com...
> How many levels per pixel? An analog connection can have any number of
> levels, depending only on the quality of the connection and hardware.
> My card already generates 32-bit color, although my flat panel can't use
> all that resolution.

Slight correction time here - "32-bit color" generally does NOT
imply more than 8 bits per primary. What is called "32 bit color"
in the PC world is really just a way to align 24 bits of information
(8 each of RGB) within a four-byte space, for ease of handling the
data (as opposed to having the video information coming in three-byte
packets).
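
A quick illustration of that packing (the byte order below is one common
convention, not necessarily what any given card uses):

    def pack_xrgb(r, g, b):
        """Pack 8-bit R, G, B into a 32-bit word; the top byte is unused padding."""
        return (r << 16) | (g << 8) | b

    def unpack_xrgb(word):
        return (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF

    w = pack_xrgb(200, 120, 30)
    print(hex(w), unpack_xrgb(w))   # still only 8 bits per primary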


> > Most cards provide DVI-I ports, which have both one link
> > of DVI-D digital and RGB analog (sometimes called DVI-A,
> > plus a breakout-to-HD15 cable for analog use). By DVI-D,
> > I mean use the card's DVI port, and a DVI cable, and
> > assure yourself that if both signals are present, the
> > monitor is using the digital, and not the analog.
>
> I'll look again and see if there's a DVI-D plug, but I rather doubt it.

To clarify the DVI terminology here:

DVI-I is a DVI implementation in which RGB analog signals
are provided along with one OR two TMDS links; either interface
may be used, although they might not have identical capabilities.
(Often, two different EDID files will be provided in the monitor,
each of which describes the monitor's capabilities on one of the
two interfaces.)

DVI-D is a variant of DVI which does not carry analog video,
and so does not provide pins in that part of the connector. It
too may provide either one or two TMDS links, AKA "channels."
Each channel carries three data pairs, and has a capacity of
up to 165 Mpixels/second, 24 bits/pixel.
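
Rough capacity arithmetic for one such link (TMDS encodes each 8-bit
value as a 10-bit symbol; figures approximate):

    pixel_clock_hz = 165e6      # max single-link pixel rate
    pairs = 3                   # one data pair each for R, G and B
    bits_per_symbol = 10        # 8 data bits -> 10-bit TMDS symbol
    per_pair = pixel_clock_hz * bits_per_symbol / 1e9
    print(f"per pair: {per_pair:.2f} Gbit/s, link total: {per_pair * pairs:.2f} Gbit/s")
    print(f"pixel payload: {pixel_clock_hz * 24 / 1e9:.2f} Gbit/s at 24 bpp")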


Bob M.
Anonymous
December 14, 2004 7:26:18 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:i6usr0td89lmed0d0svan3jtmrj4d83kva@4ax.com...

> 24-bit color isn't good enough for some applications. It doesn't
> provide enough levels in some parts of the gamut, such as blue (the eye
> is extremely sensitive to differences in intensity in the blue end of
> the spectrum, so any kind of posterization is very easy to spot).

Well, actually, it's the green region of the spectrum
where the eye has its best discrimination ability, but that's
beside the point. You're right in noting that 8 bits/color is
not sufficient for many demanding applications, especially if
a linear encoding is assumed. Somewhere in the 10-12 bit
region is generally considered adequate for just about anything,
though.

>
> > It's not clear to me that contemporary LCD panels can
> > even deliver 24-bit color. They accept such signals,
> > but what they paint on screen is another matter.
>
> True for many cheaper CRTs, too. But it worries me that the standard
> apparently was designed in a way that permanently limits it to 24-bit
> color.

Yes and no. DVI DOES provide the option of a second
link, which could be used for either greater "color depth"
or support for higher "resolutions" (in the pixel format
sense of the word). It's just rarely used in this manner, in
part due to the lack of panels which support more than 8 bits
per primary at present.


Bob M.
Anonymous
December 15, 2004 12:08:01 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> Here's a lousy photo of a bulkhead with both
> DVI and HD15:

Nope, I don't have that.

By the way, does anyone build video cards optimized for 2D, photographic
and prepress use, instead of always emphasizing animation and 3D
performance?

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 15, 2004 12:08:02 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> By the way, does anyone build video cards optimized
> for 2D, photographic and prepress use, instead of
> always emphasizing animation and 3D performance?

Yes. Matrox (which is what I use):
<http://www.matrox.com/mga/home.htm>

They are way behind in 3D perf, and only just
announced their first PCI-Express card.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 15, 2004 12:20:11 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> Bob, as much as I hate to disagree with you, I'm afraid
> I'd have to vote "maybe" instead. For the most part, the
> differences between an analog and a digital interface for
> LCD monitors come down to questions of pixel timing,
> which really have nothing at all to do with whether the
> video information is in digital or analog form.

The best analog system will always beat the performance of the best
digital system. There's nothing about analog technology that makes it
intrinsically inferior to digital, so a good video card and a good
monitor should meet or beat any digital interface, I should think.

This is why the _best_ analog audio systems can consistently beat the
best digital systems. However, the superior performance comes at a
price that is usually all out of proportion with the increment of gain
over digital.

> Oddly enough, the LCD is NOT inherently a "digital"
> device as is often assumed - fundamentally, the control
> of the pixel brightness in any LCD is an analog process.

Every interface between the digital world and the physical world is
analog, so all input and output devices are ultimately analog devices.
"Digital" only means something in the conceptual world of information
representation.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 15, 2004 12:20:12 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> The best analog system will always beat the
> performance of the best digital system.

Depending on how you define "best", as we saw with the
early debates about CD audio. Now that purists can get
48-bit 96 KHz digital audio, I don't see that debate
anymore.

> Every interface between the digital world and the
> physical world is analog, ...

Not at the quantum level.
Expect the physicists to sail in here and dispute that :-)

Is anyone prepared to argue that using an HD15 analog
connection to an LCD monitor provides a "better" presentation?

It's conceivable, due to the anti-aliasing provided by the
analog blur. I was actually a bit startled by how crisp
the screen was using the DVI-D connection. In my CAD work,
I now always see stair-casing of angled and curved lines,
whereas on the CRT monitor (same res), they were smooth.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 15, 2004 12:24:19 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> Well, that's the classic promise and peril of digital.
> It's either as perfect as it ever gets, or it's not
> there at all, whereas analog may never be perfect
> enough, and opportunities for degradation abound.

Analog can also be more perfect than digital. In fact, it is always
possible to build an analog system that is superior to any given digital
system--if money is no object.

> Umm, if the bits in the frame buffer are going thru a
> DAC (which can introduce noise and distortion), then
> thru a cable (which <ditto>), even if the LCD is not using
> an ADC, and is using the analog signal directly, that
> extra noise and distortion may show up on screen.

Sure, but the question is whether or not it actually does to any visible
extent in the real world.

I've found that, in many respects, PC video systems perform better than
they are supposed to. For all the noise one hears about the horrors of
analog systems, in real life they perform amazingly well. Look no
further than the continuing superiority of CRTs for most aspects of
image quality for proof.

> I suspect it's irrelevant at this point. Analog is
> the "economy" graphics connect now, and what we have
> is sufficient for the market.

Economy perhaps, but that isn't always correlated with quality.

> I think it more likely that the analog economy model
> will be replaced by a digital economy model, where PC
> main RAM is used for frame buffer, and the graphics
> "card" (if any) is just a TMDS driver chip with a
> DVI-D connector on the bulkhead, something like the
> "ADD2" cards I see at <www.molex.com&gt;.

I suspect the current "high-performance" digital models will become the
"digital economy" models, in time.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 15, 2004 12:31:35 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> My guess is that because LCD subpixels are just barely
> 8-bit, a full correction might minimize color errors at
> the expense of introducing visible terracing in gradients.

The incoming data might be 8-bit, but there's no reason why the internal
correction of the monitor can't be carried out with much higher
granularity.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 15, 2004 12:41:22 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> They are way behind in 3D perf, and only just
> announced their first PCI-Express card.

But are they ahead in 2D performance and image quality? I have a
Millennium II card in my oldest PC, which has always served very well.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 15, 2004 12:41:23 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> They are way behind in 3D perf, and only just
>> announced their first PCI-Express card.

> But are they ahead in 2D performance and image quality? I have a
> Millennium II card in my oldest PC, which has always served very well.

It depends on your applications, operating system,
PC, and graphics slot (AGP, PCI, PCI-X or PCIe).
You need to hit some forums devoted to your key
apps and get advice.

The two most graphics-intensive things I do, Photoshop
and IMSI TurboCAD, seem to get no particular benefit
from the accelerations available on ATI and Nvidia cards,
and perform quite adequately on a Matrox Parhelia.

Photoshop is compute and bus-bound.

TC uses OGL, but only for modes where performance isn't
an issue anyway. In fully-rendered mode, it's doing that
entirely in host software, and is purely compute-bound.

If I ran games, the config might have been different.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 15, 2004 12:42:56 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> I was actually a bit startled by how crisp
> the screen was using the DVI-D connection. In my CAD work,
> I now always see stair-casing of angled and curved lines,
> whereas on the CRT monitor (same res), they were smooth.

I doubt that this is a result of switching to a digital connection.

Note also that aliasing is usually a sign of lower resolution, not
higher resolution.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
December 15, 2004 12:42:57 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> I was actually a bit startled by how crisp
>> the screen was using the DVI-D connection. In my CAD work,
>> I now always see stair-casing of angled and curved lines,
>> whereas on the CRT monitor (same res), they were smooth.

> I doubt that this is a result of switching to a digital connection.

Re-running the comparison, I see that it was partly due
to going digital, but mostly due to switching to LCD.
The former CRT (same res) was providing some additional
de-crisping :-)

> Note also that aliasing is usually a sign of lower
> resolution, not higher resolution.

In this case, I'm making no changes to the video setup
when I switch between CRT and LCD, or analog and digital
on the LCD.

Just playing around in analog mode on the LCD, I see
not only the pink halo on black-on-white objects, but
also some ghosting (or ringing). Likely a result of the
KVM switch and extra cable in that path.

And painting a test pattern with alternating single-pixel
white-black, the white is not pure (but, impressively,
the alignment of the data and display rasters is perfect);
no gray moire.
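
For anyone who wants to repeat the test, here's one quick way to paint
that pattern (uses the Pillow imaging library; size and filename are
arbitrary):

    from PIL import Image

    w, h = 256, 128
    img = Image.new("RGB", (w, h))
    for y in range(h):
        for x in range(w):
            img.putpixel((x, y), (255, 255, 255) if x % 2 == 0 else (0, 0, 0))
    img.save("alt_pixel_test.png")   # view at native resolution, no scaling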

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 15, 2004 1:42:47 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0je4o1ft8z8r@news.individual.net...

> The monitor knows that the incoming data will be
> pre-compensated to a gamma (log curve) in the 1.8 ... 2.6
> range, or maybe be linear (no re-comp).

No, the monitor knows nothing about how the incoming
video data is biased; the video source (the host PC) MAY
apply a pre-compensation based on what it knows of the
monitor's response curve (based on the gamma value given
in EDID). But the "correction" the host applies to the
video data is not the issue here. (Whether or not any
correction SHOULD be applied is another matter, and one
that probably deserves some attention later on.) But all
the monitor really knows is that it's getting such-and-such
an input level.

The problem is that while the CRT provides, just by its
nature, a nice "gamma" curve (it's nice for a number of
reasons, not the least of which is that it's a very good match
to the inverse of the human eye's own response curve -
the bottom line result being that linear increases in the input
video level LOOK linear to the eye, even though the actual
output of light from the tube is varying in an objectively
non-linear fashion), the LCD does not do this. The LCD's
natural response curve, from a perceptual standpoint, is
ugly - a S-shaped curve which is sort of linear in the
middle and flattens out at both the black and white ends.


> Why doesn't the look-up more fully adjust out the
> S-curve, so that color errors can be corrected
> with the simple exponent adjustment of typical graphics
> card gamma control menus?
>
> My guess is that because LCD subpixels are just barely
> 8-bit, a full correction might minimize color errors at
> the expense of introducing visible terracing in gradients.

Even if they're fully eight bits, that's not enough IF you
are also advertising to the outside world (i.e., to those
devices ahead of the LUT) that you're providing a true
eight-bit accuracy. You've already mapped some of those
values off what they're expected to be, which in effect
will compress the curve in some areas and cause, for
instance, two successive input values to result in the same
ONE output value. You need finer control of the pixel
gray level, relative to the specified accuracy of the input
data, to be able to both compensate the response curve
AND provide that specified accuracy at all levels.
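
A toy illustration of that collision effect, assuming an arbitrary
correction curve (not any particular panel's LUT):

    gamma = 2.2
    lut = [round(255 * (i / 255) ** (1 / gamma)) for i in range(256)]
    print("distinct output codes:", len(set(lut)), "of 256")
    # Some neighbouring inputs collapse onto one output code, and some
    # output codes are never used at all - accuracy is lost even though
    # both ends of the table are nominally "8 bit".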

Bob M.
Anonymous
December 15, 2004 1:42:48 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Bob Myers <nospamplease@address.invalid> wrote:

>> The monitor knows that the incoming data will be
>> pre-compensated to a gamma (log curve) in the 1.8 ... 2.6
>> range, or maybe be linear (no re-comp).

> No, the monitor knows nothing about how the incoming
> video data is biased; the video source (the host PC) MAY
> apply a pre-compensation based on what it knows of the
> monitor's response curve (based on the gamma value given
> in EDID).

I was using "know" in the metaphorical sense. The
monitor maker knows that the signal is apt to be
either linear, or pre-comped in the 1.8 - 2.6 gamma
range ...

.... and that if the user has any tool for dealing with
a mismatch of expectations, it's apt to be just a simple
exponent control, and maybe ganged (can't separately
adjust R, G and B).

> (Whether or not any correction SHOULD be applied is
> another matter, and one that probably deserves some
> attention later on.)

Is a gamma standard a topic of any of the follow-on
standards to DVI? Packet? Send-changed-data-only?

> Even if they're fully eight bits, that's not enough IF you
> are also advertising to the outside world (i.e., to those
> devices ahead of the LUT) that you're providing a true
> eight-bit accuracy. You've already mapped some of those
> values off what they're expected to be, which in effect
> will compress the curve in some areas and cause, for
> instance, two successive input values to result in the same
> ONE output value. You need finer control of the pixel
> gray level, relative to the specified accuracy of the input
> data, to be able to both compensate the response curve
> AND provide that specified accuracy at all levels.

No problem, just do error-diffused dithering in the
monitor's full-frame buffer :-)

Now this could be done in the host, but then we'd need
some new VESA standard for reading back the tables of
stuck values.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 15, 2004 1:46:11 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:acjur0la3u84nchrn17juprb3d0iuadssb@4ax.com...
> The incoming data might be 8-bit, but there's no reason why the internal
> correction of the monitor can't be carried out with much higher
> granularity.

The "granularity" of the look-up table data is not the
limiting factor; it's the number of bits you have at the
input to the panel, vs. the number of bits you claim to
have at the input to the overall system. If I map 8-bit
input data to, say, 10-bit outputs from the look up
table, I don't get as good a result as I want if the panel
itself has only 8 bits of accuracy. I need to at the very
least call in some additional tricks (which ARE available
- some frame-to-frame dithering can help, for example)
to be able to take advantage of the greater accuracy
in the middle of the chain.
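
A minimal sketch of the frame-to-frame idea (a toy scheme of my own,
not any vendor's method): show a 10-bit level on an 8-bit panel by
alternating between the two nearest codes so the time-average comes
out right.

    def temporal_dither(level_10bit, frames=8):
        """Approximate a 10-bit level with a sequence of 8-bit codes."""
        target = level_10bit / 4.0          # exact 8-bit value, possibly fractional
        shown, err = [], 0.0
        for _ in range(frames):
            code = int(round(target + err))
            err += target - code            # carry the rounding error forward
            shown.append(code)
        return shown

    seq = temporal_dither(513)              # 513/4 = 128.25
    print(seq, "average:", sum(seq) / len(seq))   # averages to 128.25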

Bob M.
Anonymous
December 15, 2004 1:59:21 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0bkbqcft8z8r@news.individual.net...

> But there are opportunities for the signal to get
> visibly degraded if it goes to analog before it gets
> to the LCD panel lattice. In the entirely unscientific
> test I just ran, where I saw exactly what I expected to
> see, the analog happened to be running through two 2m
> lengths of HD15 cable and a KVM switch. The LCD image
> went from pixel-perfect to slightly fuzzy, and perhaps
> also reduced "contrast".

Oh, sure - but then, that's a bad thing to do to any connection.
Have you tried the corresponding experiment with a
digital interface running at its max. pixel rate? (Nope -
because passive switchboxes and the like simply don't
work with digital interfaces.) In an apples-to-apples
comparison, say a VGA vs. a DVI over the standard
2-3 meters of good quality cable in each case, the
differences you will see are due to sampling errors in the
analog case. Or in other words, the advantage of the digital
interface is that it brings its "sampling clock" along with
the data.


> Umm, if the bits in the frame buffer are going thru a
> DAC (which can introduce noise and distortion), then
> thru a cable (which <ditto>), even if the LCD is not using
> an ADC, and is using the analog signal directly, that
> extra noise and distortion may show up on screen.

Sure; the question is always going to be whether or not
that "noise and distortion" is below the level we care
about. Digital interfaces are not error-free, either; that
they are acceptable, when they are, is the result of the bit error
rate being below perceivable levels. Similarly, if the analog
interface delivers a stable image with the video data to
the desired level of amplitude accuracy (in most cases here,
to an 8 bit/sample level, or an accuracy of about +/- 1.5 mV
in "analog" terms), the difference between the two interfaces
will not be distinguishable. It is ALWAYS a matter of how
good is good enough, and neither type of connection is
ever truly "perfect."


> I sorta suspected that, but in the DVI-D model, the
> signal remains digital until it hits the rows & columns, no?

Well, until it hits the column drivers, yes. On the other hand,
there HAVE been LCD panels made, notably by NEC,
which preserved the analog video signal in analog form clear
through to the pixel level.


> Does the typical analog-only LCD have a DAC? Or does it
> just sample the analog signal and route values to drivers?

It has an ADC right up front - it generally has to, especially
if it supports any sort of image scaling, which is definitely
something best done in the digital domain. Scaling does
not necessarily imply a full frame buffer; modern scalers
make do with a few lines' worth of buffering, unless
frame rate conversion is also required - in which case at
least a good deal of a frame's worth of data must be stored,
and in the best versions a full frame buffer or two of memory
is used.



> Even if the clocks align, there's also the matter of
> whether or not the analog signal has completely slewed
> to the value needed. If the DAC-cable-ADC path has
> bandwidth-limited (softened) the transitions, or
> introduced color-to-color skews, that will show up.
> I see it, or something like it, doing analog on my LCD.

Sure - but you can't really lay the blame for having a BAD
analog interface on analog connections in general. The
point is that a very good interface is still most definitely possible
in the analog domain, and is in fact achieved quite often. There
are also analog systems which take advantage of the rather
forgiving nature of analog to enable truly cheap and nasty
cables, connectors, etc., at the expense of performance.
Digital, as noted, either works or it doesn't - which is a big
part of the reason that digital interfaces are not as inexpensive
as the cheapest (and lowest quality!) of the analog types.
You simply HAVE to meet a certain minimum level of
performance with digital, or you don't get to play AT ALL.

> > ... the New Analog Video Interface standard, or simply
> > NAVI. ... It's not clear yet how well NAVI will be
> > accepted in the industry, but it IS available if
> > anyone chooses to use it.
>
> I suspect it's irrelevant at this point. Analog is
> the "economy" graphics connect now, and what we have
> is sufficient for the market.

Possibly; we'll see how it plays out. While digital
interfaces are becoming a lot more popular, analog
connections still account for well over 80% of the
video actually being used in the desktop monitor
market, even though LCDs took over from CRTs
as the unit volume leader this past year. As you know,
a gargantuan installed base has certain advantages
(or problems, which is often a different word for the
same thing! :-)).

Bob M.
Anonymous
December 15, 2004 1:59:22 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Bob Myers <nospamplease@address.invalid> wrote:

>> > ... the New Analog Video Interface standard, or simply
>> > NAVI. ... It's not clear yet how well NAVI will be
>> > accepted in the industry, but it IS available if
>> > anyone chooses to use it.

>> I suspect it's irrelevant at this point. Analog is
>> the "economy" graphics connect now, and what we have
>> is sufficient for the market.

> Possibly; we'll see how it plays out. While digital
> interfaces are becoming a lot more popular, analog
> connections still account for well over 80% of the
> video actually being used in the desktop monitor
> market, even though LCDs took over from CRTs
> as the unit volume leader this past year. As you know,
> a gargantuan installed base has certain advantages
> (or problems, which is often a different word for the
> same thing! :-)).

Does NAVI bring any benefits to the installed base of
CRTs? Does it matter if it does?

If it does bring benefits to LCD via analog connect,
does that matter? I suspect the users who care about
whatever NAVI promises, will tend to go digital.

And I have a suspicion that the temptation on entry-
level PCs in the near future will be an analog-free
connection. A dumb UMA frame buffer, exposed thru a
TMDS chip thru a DVI-D (only) port on the back panel,
thru a DVI-D (only) cable, to a DVI-D (only) monitor.
Omits a couple of buffers, a DAC, an ADC (maybe) and
some copper. Maybe only runs at native res. Does DVI
allow captive cable at display?

The entire concept of "high end CRT" is already dead,
and increasingly what remains of new CRTs in the market
will tend toward junk (or be seen as so). The momentum
to flat panel (LCD or not) may cause the entire analog
graphics connection to go the way of the impact printer
before NAVI can get a foothold.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
December 15, 2004 2:01:38 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:voiur0ltbpk88i1aehf4g17qsotq7cumb5@4ax.com...
> Analog can also be more perfect than digital. In fact, it is always
> possible to build an analog system that is superior to any given digital
> system--if money is no object.

Exactly. Both are simply means of encoding information
for transmission; when comparing "analog" to "digital," the
best that you can ever do is to compare one given
implementation of "analog" vs.a given implementation of
"digital." Neither "analog" nor "digital" is inherently
superior to the other, per se. Each has its own advantages
and disadvantages, and there is a lot of misunderstanding
as to just what those are in each of these.

Bob M.
Anonymous
December 15, 2004 2:13:33 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:fdiur09qghsmlp0pa3bscnpk7ts7iidocb@4ax.com...
> The best analog system will always beat the performance of the best
> digital system.

Unfortunately, I'm going to have to disagree with that, as
well; as I noted in another response here, neither type of
interface, per se, is inherently superior to the other.
Both are ultimately limited by the Gospel According to
St. Shannon, which puts strict limits on how much data
you can get through a given channel REGARDLESS of
how that data is encoded. Now, a particular sort of
digital interface may or may not be superior to a
particular sort of analog; it depends on the specific
characteristics of the interfaces in question, and just what
is important, in a given application, in determining
"superior."


> This is why the _best_ analog audio systems can consistently beat the
> best digital systems.

That's not the only reason for this; high-end audio also
incorporates huge dollops of what can only be seen as
"religious" beliefs, with no basis in reasoning or evidence,
re a given individual's views on what is "superior." (I
mean no disrespect to religion in saying this; I am simply
noting that there is a difference in kind between a belief
held solely on faith, and one arrived at through a careful
and objective consideration of evidence.) In the case of
audio, an awful lot of what has been claimed for the various
"digital" and "analog" systems is quite simply wrong.
(This isn't the place for that discussion - I'm sure it
continues, unfortunately quite healthy after all these years,
over in rec.audio.high-end, a group I left a long time ago
for just this reason. There's just no sense in discussing
something when very few are interested in anything
other than argument by vigorous assertion.)



>
> > Oddly enough, the LCD is NOT inherently a "digital"
> > device as is often assumed - fundamentally, the control
> > of the pixel brightness in any LCD is an analog process.
>
> Every interface between the digital world and the physical world is
> analog, so all input and output devices are ultimately analog devices.

No. This is a common misconception regarding what is
meant by the term "analog." It does NOT necessarily mean
a system which is "continuous," "linear," etc., even though
in the most common forms of analog systems these are
often also true. "Analog" simply refers to a means of encoding
information in which one parameter is varied in a manner
ANALOGOUS TO (and hence the name) another - for
example, voltage varying in a manner analogous to the original
variations in brightness or sound level. The real world is
not "analog" - it is simply the real world. "Analog" points
to one means of describing real-world events, as does
"digital."

> "Digital" only means something in the conceptual world of information
> representation.

"Digital" is simply another means of representing information;
one in which the information is described as a series of
"digits" (numbers), and again, this is reflected in the name.
It is neither inherently less accurate nor more accurate than
"analog" per se - that comparison always depends on the
specifics of the two implementations in question.

If you want a truly painful and detailed treatment of this
question (well, it HAS been one of my hot buttons), I
spent a whole chapter on the subject in my book
"Display Interfaces: Fundamentals & Standards."


Bob M.
Anonymous
a b U Graphics card
December 15, 2004 2:14:19 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0l3q0rft8z8r@news.individual.net...
> Is anyone prepared to argue that using an HD15 analog
> connection to an LCD monitor provides a "better" presentation?

Sure - but first, you have to define "better." :-)

Bob M.
Anonymous
a b U Graphics card
December 15, 2004 2:16:43 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> Re-running the comparison, I see that it was partly due
> to going digital, but mostly due to switching to LCD.
> The former CRT (same res) was providing some additional
> de-crisping :-)

Remember that, in theory, there's no fixed upper limit to horizontal
resolution on a CRT, although the mask or grille spacing imposes some
practical restrictions. So you could be seeing additional detail on the
CRT that the LCD cannot display, in some cases.

> Just playing around in analog mode on the LCD, I see
> not only the pink halo on black-on-white objects, but
> also some ghosting (or ringing). Likely a result of the
> KVM switch and extra cable in that path.

It has to be distortion of the signal. The panel is just going to
sample the signal, so if there's a pink halo on the screen, there's one
in the signal.

I'm happy to say that I see no such artifacts on my screen. I just have
a simple 2-metre cable betwixt PC and panel (the cable supplied with the
panel).

> And painting a test pattern with alternating single-pixel
> white-black, the white is not pure (but, impressively,
> the alignment of the data and display rasters is perfect);
> no gray moire.

Maybe you just need to remove the switch and cable.
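
For anyone who wants to reproduce that test, here is a minimal Python
sketch that writes an alternating single-pixel white/black pattern as a
binary PGM file; the image size and filename are arbitrary choices.

    WIDTH, HEIGHT = 256, 64   # arbitrary size

    # P5 is the binary greyscale PGM format; 255 = white, 0 = black.
    with open("pattern.pgm", "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))
        for y in range(HEIGHT):
            # x % 2 gives alternating single-pixel white/black columns,
            # which is the worst case for horizontal video bandwidth.
            f.write(bytes(255 if x % 2 == 0 else 0 for x in range(WIDTH)))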

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 2:16:44 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> Re-running the comparison, I see that it was partly due
>> to going digital, but mostly due to switching to LCD.
>> The former CRT (same res) was providing some additional
>> de-crisping :-)

> Remember that, in theory, there's no fixed upper limit to
> horizontal resolution on a CRT, although the mask or grille
> spacing imposes some practical restrictions.

Not to mention circuit bandwidth, beam spot size,
beam focus and grill diffraction.

> So you could be seeing additional detail on the
> CRT that the LCD cannot display, in some cases.

My impression is less detail on the CRT. Each LCD triad
definitely represents one graphics card frame buffer pixel.
On the CRT, each fb pixel gets smeared into its neighbors
a bit, via one or more of the above mechanisms.

>> Just playing around in analog mode on the LCD, I see
>> not only the pink halo on black-on-white objects, but
>> also some ghosting (or ringing). Likely a result of the
>> KVM switch and extra cable in that path.

> It has to be distortion of the signal. The panel is just
> going to sample the signal, so if there's a pink halo on
> the screen, there's one in the signal.

I've little doubt that the artifacts are due to the analog
connection outside the monitor. And they probably would
improve if I used a single shorter run of HD15 cable.

> Maybe you just need to remove the switch and cable.

Normally, it's only used for temporary PC connections,
so it's not an on-going issue.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
a b U Graphics card
December 15, 2004 2:17:00 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:k0kur0l41j26inl877ikc58uoqpnpi160s@4ax.com...
> Note also that aliasing is usually a sign of lower resolution, not
> higher resolution.
>

Well, no - "aliasing," if that's truly what a given observation
is all about, is always a sign of improper sampling, whether
it's in an analog situation or a digital one. See "Nyquist
sampling theorem" for further details.
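
A tiny Python sketch of the Nyquist point, using made-up frequencies:
sampling a 9 Hz sine at only 10 samples/s (well below the 18 samples/s
Nyquist would require) produces a low-frequency alias.

    import math

    signal_hz = 9.0
    sample_rate = 10.0   # below the 2 * 9 = 18 samples/s Nyquist requires

    samples = [math.sin(2 * math.pi * signal_hz * n / sample_rate)
               for n in range(20)]
    print([round(s, 2) for s in samples])
    # The values trace out roughly one slow cycle per 10 samples -- an
    # apparent ~1 Hz tone (the |9 - 10| = 1 Hz alias) instead of the
    # real 9 Hz signal.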

The classic sort of "aliasing" in displays is the good ol'
Moire pattern common to CRTs. What few people realize
is that such patterns were in the past seen as GOOD things
when a CRT maker was testing a new tube, as being able
to see the Moire pattern was a visible indication that the
tube was able to focus sufficiently well!

Bob M.
Anonymous
a b U Graphics card
December 15, 2004 2:23:01 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:1cpur0po7frloek9q1vn306asak5ogji42@4ax.com...
> Remember that, in theory, there's no fixed upper limit to horizontal
> resolution on a CRT,

No, there is ALWAYS an upper limit to the resolution of
a CRT - for the simple reason that, even in theory, an
infinite bandwidth channel is not possible. Any limitation
on bandwidth in the video signal path represents a
resolution limit. And with respect to the CRT specifically,
other resolution limits come in due to the lower limits on the
physical spot size and the ability of the tube to switch the
beam on and off (i.e., you can't make a CRT without
capacitance in the gun structure, so you can never get an
infinitely short rise/fall time unless you can come up with a
video amp that's a perfect voltage source, capable of
delivering infinite current when needed).
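
As a back-of-the-envelope illustration of how a bandwidth limit turns into
a resolution limit, here is a small Python sketch; the resolution, refresh
rate and blanking overhead are assumptions chosen only for illustration,
not the figures for any specific monitor.

    # Rough pixel-clock / video-bandwidth estimate for a CRT mode.
    h_active, v_active = 1600, 1200
    refresh_hz = 85
    blanking_overhead = 1.3   # rough allowance for H/V blanking intervals

    pixel_clock_hz = h_active * v_active * refresh_hz * blanking_overhead
    # Resolving full-amplitude alternating pixels needs analog bandwidth
    # on the order of half the pixel clock (one cycle per pixel pair).
    min_video_bandwidth_hz = pixel_clock_hz / 2

    print("Pixel clock            ~%.0f MHz" % (pixel_clock_hz / 1e6))
    print("Video bandwidth needed ~%.0f MHz" % (min_video_bandwidth_hz / 1e6))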

> although the mask or grille spacing imposes some
> practical restrictions. So you could be seeing additional detail on the
> CRT that the LCD cannot display, in some cases.

And the mask or grille, along with the phosphor dot structure,
places some very similar limits on the resolution available
from the CRT as does the physical "pixel" structure of the
LCD or other FPD type. (Whether or not the limits are the
SAME for a given pixel pitch is really more a question of
such things as whether or not the LCD in question permits
sub-pixel addressing, which few so far do.)


Bob M.
Anonymous
a b U Graphics card
December 15, 2004 8:39:51 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> Possibly; we'll see how it plays out. While digital
> interfaces are becoming a lot more popular, analog
> connections still account for well over 80% of the
> video actually being used in the desktop monitor
> market, even though LCDs took over from CRTs
> as the unit volume leader this past year.

If my memory serves me correctly, the earliest monitor connection
interfaces for PCs (CGA and EGA, for example) were _digital_ connection
interfaces. VGA went "backwards" to analog to provide higher
resolutions and color depths, and greater flexibility.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 8:45:02 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> The entire concept of "high end CRT" is already dead ...

Not for the most critical uses. A high-end CRT still offers the best image
quality overall, if you really need the absolute best.

CRTs also still dominate at the low end, since they are ten times
cheaper than flat panels.

As in so many other domains, the advantages of digital do not involve
actual quality, but instead they involve convenience. And in the
imaging field, the usual cost advantage of digital doesn't exist,
either--digital imaging equipment is at least as expensive as analog
equipment, because of the bandwidths required.

> and increasingly what remains of new CRTs in the market
> will tend toward junk (or be seen as so).

CRTs are projected to be clear leaders on the market for years to come.
Flat panels receive all the media hype, but they are not actually
running the show. It all reminds me of the very similar situation in
"digital" (electronic) photography vs. film photography.

> The momentum
> to flat panel (LCD or not) may cause the entire analog
> graphics connection to go the way of the impact printer
> before NAVI can get a foothold.

Not likely any time soon. The inertia of the computer industry today is
enormous; things no longer change overnight. The VGA interface may be
around indefinitely, and some users are still using earlier interfaces
(which, ironically, were digital).

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 8:45:03 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> The entire concept of "high end CRT" is already dead ...

From a market standpoint, I hasten to add.

Sony, for example, has ditched all but one of
their CRTs, most recently the GDM-FW900 24" wide,
even though it sold for less than the 23" LCD
that replaced it. The entire Sony entry-level,
mid-range and hi-end consumer and business CRT
product line is done for. Sony was selling CRTs
using a "higher quality" positioning. The customers
took the extra cash and spent it on LCD.

> Not for the most critical uses. A high-end CRT
> is still the best image quality overall, if you
> really need the absolute best.

And you pay dearly for that. The remaining Sony GDM-C520K
is a $2000 product. But customers other than graphics
professionals, who have $2K to spend, are spending
it on LCD. The wider market for "quality" CRTs is gone.

> CRTs also still dominate at the low end, since
> they are ten times cheaper than flat panels.

Not 10x. LCD prices have been collapsing. Using Wal-Mart
as a low-end reseller, their low-end 17" LCD is only
1.3x the price of their low-end 17" CRT. True, you can get into a
CRT for $70, and their cheapest LCD is $188, but that's
still only 2.7x.

You can watch the Asian press lament the near-daily LCD
pricing collapse at: <http://www.digitimes.com/>

> As in so many other domains, the advantages of digital
> do not involve actual quality, but instead they involve
> convenience.

It has ever been thus. In addition to being trendy and
cool, LCDs are cheaper to ship, use less power, turn on
faster, are easier to install and move around, take up
less space and are less of a problem at disposal time.
The small premium they still command is something an
increasing number of average users are willing to pay.

> CRTs are projected to be clear leaders on the market for
> years to come.

Only if someone is still making them.

> It all reminds me of the very similar situation in
> "digital" (electronic) photography vs. film photography.

Yep. I dumped all my 35mm gear on eBay last year, went
all-digital, and haven't regretted it for a moment.
Silver halide is racing CRT to the exit, but both will
be around for a while yet.

> The VGA interface may be around indefinitely, and some
> users are still using earlier interfaces (which,
> ironically, were digital).

Yep, we've come full circle to CGA and EGA :-)

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
a b U Graphics card
December 15, 2004 8:49:29 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> Not to mention circuit bandwidth ...

Circuit bandwidth places an even greater restriction on digital
transmission. For any given channel speed, the real-world capacity of
the channel is always lower for digital transmission than for analog
transmission.

Remember that digital transmission is nothing more than declaring an
arbitrary signal level as a noise threshold, and considering anything
below it as noise and anything above it as information. Inevitably,
this reduces the information-carrying capacity of the channel.

> ... beam spot size, beam focus and grill diffraction.

True, but CRT manufacture is extremely mature, and amazing things can be
done.

There was a time when NTSC meant "never the same color," but even NTSC
is amazingly precise these days--more so than many people would have
ever thought possible.

> My impression is less detail on the CRT. Each LCD triad
> definitely represents one graphics card frame buffer pixel.
> On the CRT, each fb pixel gets smeared into its neighbors
> a bit, via one or more of the above mechanisms.

The total information content on the screen is the same, though.

Some high-end CRTs for broadcast video use have a mode that deliberately
reduces bandwidth; the restriction filters out high-frequency signal
content and produces a more natural-looking image. CRTs can handle
extremely high resolutions if need be.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 8:52:39 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> No, there is ALWAYS an upper limit to the resolution of
> a CRT - for the simple reason that, even in theory, an
> infinite bandwidth channel is not possible.

But I said no _fixed_ upper limit. The upper limit depends on the
performance of all the components in the chain. Ideally it is equal to
or better than the design limit of those components.

So a CRT might be designed to provide x resolution, but in fact it might
stretch to x+10% resolution. Of course, when digital elements are
present in the chain, the extra resolution, if any, is wasted.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 9:07:52 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> Unfortunately, I'm going to have to disagree with that, as
> well; as I noted in another response here, neither type of
> interface, per se, is inherently superior to the other.

But all digital systems are simply analog systems operated in a
predefined way that declares anything below a certain threshold to be
noise. So the capacity of a digital system is always inferior to that
of an analog system with similar components and bandwidth.

Furthermore, the physical interface at either end of any system is
_always_ analog, so the system as a whole is never better than the
analog input and output components.

It's possible to surpass analog if you are building a system that does
not interface with the physical world. For example, if the system
handles _only_ information (such as accounting data), then you can
easily surpass analog performance with digital methods. But for any
system that requires a physical interface--audio, video, etc.--no
digital system can ever be better than the best possible analog system.
This is inevitable because all digital systems of this kind are just
special cases of analog systems.

> Both are ultimately limited by the Gospel According to
> St. Shannon, which puts strict limits on how much data
> you can get through a given channel REGARDLESS of
> how that data is encoded.

Yes. If the channel is analog, the limit of the channel's capacity is
equal to the limit imposed by Shannon. But if the channel is digital,
the limit on capacity is always below the theoretical limit, because you
always declare some portion of the capacity to be noise, whether it
actually is noise or not. This is the only way to achieve error-free
transmission, which is the advantage of digital.

In analog systems, there is no lower threshold for noise, but you can
use the full capacity of the channel, in theory, and in practice you're
limited only by the quality of your components. In digital systems, you
declare _de jure_ that anything below a certain level is noise, so you
sacrifice a part of the channel capacity, but in exchange for this you
can enjoy guaranteed error-free transmission up to a certain speed.

> That's not the only reason for this; high-end audio also
> incorporates huge dollops of what can only be seen as
> "religious" beliefs, with no basis in reasoning or evidence,
> re a given individual's views on what is "superior."

Not necessary. Ultimately, audio systems (and imaging systems) depend
on analog devices for input and output. So no system can ever be better
than the best analog system. This is inevitable for any system that
requires interfaces with the physical world, such as displays,
microphones, speakers, etc., all of which _must_ be analog.

The real problem with analog is not its ability to provide quality
(which is limited only by the limits of information theory) but the
extremely high expense and inconvenience of obtaining the best possible
quality. Digital provides a slightly lower quality for a dramatically
lower price.

Just look at flat panels: they provide defect-free images at a fixed
resolution, but they don't provide any higher resolutions. CRTs have no
fixed upper limit on resolution, but they never provide defect-free
images.

> No. This is a common misconception regarding what is
> meant by the term "analog." It does NOT necessarily mean
> a system which is "continuous," "linear," etc., even though
> in the most common forms of analog systems these are
> often also true. "Analog" simply refers to a means of encoding
> information in which one parameter is varied in a manner
> ANALOGOUS TO (and hence the name) another - for
> example, voltage varying in a manner analogous to the original
> variations in brightness or sound level. The real world is
> not "analog" - it is simply the real world. "Analog" points
> to one means of describing real-world events, as does
> "digital."

Analog reduces to using the entire channel capacity to carry
information, and tolerating the losses if the channel is not noise-free.
Digital reduces to sacrificing part of channel capacity in order to
guarantee lossless transmission at some speed that is below the maximum
channel capacity. With digital, you sacrifice capacity in order to
eliminate errors. With analog, you tolerate errors in order to gain
capacity.

Only analog systems can reach the actual limits of a channel in theory,
but ironically digital systems usually do better in practice. Part of
this arises from the fact that analog systems introduce cumulative
errors, whereas digital systems can remain error-free over any number of
components in a chain, as long as some of the theoretical capacity of
the chain is sacrificed in exchange for this.

I used to go with the "analogy" explanation for digital vs. analog, but
since everything in reality can be seen as _either_ a digital or analog
representation, this explanation tends to break down under close
examination. The explanation I give above does not, and it is
compatible with other explanations (for example, representing things
with symbols is just another form of the arbitrary threshold for noise
that I describe above).

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 9:11:08 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> The "granularity" of the look-up table data is not the
> limiting factor; it's the number of bits you have at the
> input to the panel, vs. the number of bits you claim to
> have at the input to the overall system. If I map 8-bit
> input data to, say, 10-bit outputs from the look up
> table, I don't get as good a result as I want if the panel
> itself has only 8 bits of accuracy.

But the panel is driving analog pixels. If you get a 10-bit value from
the LUT, why can't you just change this directly to an analog voltage
and drive the pixels from it? You'll still be limited to 256 discrete
luminosity levels for a pixel, but each of those levels can be chosen
from a palette of 1024 steps between black and white. So you have more
precise control of gamma on output. You could use more bits to make it
even more precise.
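
A minimal Python sketch of the 8-bit-in, 10-bit-out look-up table being
described; the gamma value of 2.2 and the table layout are assumptions made
only for illustration, not the behaviour of any particular panel.

    GAMMA = 2.2

    # 256 input codes, each mapped to one of 1024 output drive steps.
    lut_8_to_10 = [round(((code / 255) ** GAMMA) * 1023) for code in range(256)]

    def apply_lut(pixel_8bit):
        # Map an 8-bit source code to the 10-bit drive level for the panel.
        return lut_8_to_10[pixel_8bit]

    # There are still only 256 distinct input levels, but each one can sit
    # on a finer 1024-step output scale, so the gamma curve's shape is
    # preserved more precisely than with an 8-bit output table.
    print(apply_lut(0), apply_lut(128), apply_lut(255))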

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 10:23:52 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> From a market standpoint, I hasten to add.

Even that I wonder about. Flat panels are the rage in developed
countries, but CRTs still have a market elsewhere, since they are so
cheap.

> Sony, for example, has ditched all but one of
> their CRTs, most recently the GDM-FW900 24" wide,
> even though it sold for less than the 23" LCD
> that replaced it.

I'm not sure that this was a good decision on Sony's part, but then
again, Mr. Morita has been dead for quite a while now.

> The entire Sony entry-level,
> mid-range and hi-end consumer and business CRT
> product line is done for. Sony was selling CRTs
> using a "higher quality" positioning. The customers
> took the extra cash and spent in on LCD.

So all the Artisan buyers chose LCDs instead? That's hard to believe.

> And you pay dearly for that. The remaining Sony GDM-C520K
> is a $2000 product.

About the same as any decent mid-range LCD. My little flat panel cost
that much.

> Not 10x. LCD prices have been collapsing.

You can get CRTs for $60 or so.

> True, you can get into a
> CRT for $70, and their cheapest LCD is $188, but that's
> still only 2.7x.

For a large segment of the market, that's a lot.

> You can watch the Asian press lament the near-daily LCD
> pricing collapse at: <http://www.digitimes.com/>

Why do they have a problem with it? I thought margins were small.

> Only if someone is still making them.

They will likely be made in Asia for quite some time. There are still
several billion people there without monitors.

> Yep. I dumped all my 35mm gear on eBay last year, went
> all-digital, and haven't regretted it for a moment.

I still shoot film.

> Silver halide is racing CRT to the exit, but both will
> be around for a while yet.

The demise of CRTs has been predicted for forty years, and we are still
waiting.

> Yep, we've come full circle to CGA and EGA :-)

A lot of the people making the decisions today are too young to remember
CGA and EGA, so they think they're inventing something new.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
Anonymous
a b U Graphics card
December 15, 2004 10:23:53 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> So all the Artisan buyers chose LCDs instead?

No, the last remaining Sony CRT, the GDM-C520K,
is an Artisan.

> You can get CRTs for $60 or so.

Even though CD audio media was higher priced than
LP, and CD players were substantially higher priced
than turntables, CD still killed LP surprisingly rapidly.
The fact that the old stuff is cheaper, and arguably
"better", may not save it. Market forces have a
logic of their own that isn't necessarily logical.

> The demise of CRTs has been predicted for forty
> years, and we are still waiting.

Well, flat panel TV has been only ten years away
for the last 50 years. It's here now. When the
existing TVs in this household fail, they'll get
replaced by something flat, for any number of
reasons.

Note Bob Myers' observation that LCD sales eclipsed
CRT within the last year. That's a fairly important
event, and won't go unnoticed by industry planners.

Curiously, I also note that Apple has entirely
dropped CRTs from their product line. That really
surprised me, because I'm not convinced that LCD
is really ready yet for pre-press, broadcast DCC,
video post and movie post (entirely apart from
the recent user complaints about the color
uniformity and stability of the Cinema 23).

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.