flat panel monitors

Archived from groups: alt.comp.hardware,comp.sys.ibm.pc.hardware.video

I've heard that flat panel monitors are easier on the eyes. Is this true?
If so, then what would be a good choice for a 17" for gaming?
 
Archived from groups: alt.comp.hardware,comp.sys.ibm.pc.hardware.video

On Thu, 9 Dec 2004 09:46:48 -0800, Adam Russell wrote:
> I've heard that flat panel monitors are easier on the eyes. Is this true?
> If so, then what would be a good choice for a 17" for gaming?

Make sure you get a fast response time. There are monitors with 12ms
now, like the Samsung 710T. I think that's what I'm going to get... I
just haven't decided yet if having a DVI interface is worth another $100.

--
* John Oliver http://www.john-oliver.net/ *
* California gun owners - protect your rights and join the CRPA today! *
* http://www.crpa.org/ Free 3 month trial membership available *
* San Diego shooters come to http://groups.yahoo.com/group/sdshooting/ *
 
Archived from groups: alt.comp.hardware,comp.sys.ibm.pc.hardware.video

Adam Russell writes:

> I've heard that flat panel monitors are easier on the eyes.
> Is this true?

Any monitor with a clear, sharp, bright image and minimum flicker (which
depends on your personal sensitivity to flicker) is easy on the eyes.
It can be a CRT or a flat panel, as long as the image is of good
quality.

I try never to skimp on monitors. Eyes are worth taking care of.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"John Oliver" <joliver@john-oliver.net> wrote in message
news:slrncrhih9.nco.joliver@ns.sdsitehosting.net...
> just haven't decided yet if having a DVI interface is worth another $100

No question, I wouldn't buy one without it.

Michael
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Michael C writes:

> No question, I wouldn't buy one without it.

Is there really that much difference? I have only an analog input to
mine, but even under a loupe the individual pixels are resolved as
sharply as can be. How could a digital interface improve on that?

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> No question, I wouldn't buy one without it. [DVI-D]

> Is there really that much difference?

Yes. Several considerations:

There is noise resulting from taking the original
digital raster to analog and back to digital.
This might display, for example as horizontal
artifacts, unstable picture regions, etc.

Square waves? No chance. Think of a pattern of
alternating white/black 1-pix dots. In analog,
these need to exhibit sharp transitions and flat
tops to emulate what you get for free with DVI-D.
Bandwidth limits in the analog channels are apt
to smear this fine detail.

Group delay with analog introduces some risk that
the pixel data won't exactly precisely align with
the LCD triads upon reconstruction. Suppose the
analog signal has a little group delay (time shift)
from the DAC, or in the cable, or in the ADC (or
just one of the colors does). Our hypothetical white
and black dots might become a gray moire morass.
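
A toy Python sketch of that failure mode (made-up numbers,
nothing monitor-specific): soften the signal a little, then
sample it on time versus half a pixel late.

import numpy as np

pattern = np.array([1.0, 0.0] * 8)            # alternating white/black 1-pix dots
t = np.arange(len(pattern))                   # ideal sample instants (pixel centers)

# mild bandwidth limiting, as a DAC/cable/ADC chain might impose
analog = np.convolve(pattern, [0.1, 0.8, 0.1], mode='same')

on_time = analog[t]                           # ~0.8 / ~0.2: softened, but still dots
late = np.interp(t + 0.5, t, analog)          # half-pixel delay: ~0.5 everywhere, i.e. gray
print(np.round(on_time, 2))
print(np.round(late, 2))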

Even what the black and white levels are becomes
uncertain with analog.

I just compared the same screen in HD15 analog and
DVI-D digital on my LCD, and the analog image has
less "contrast", text characters are not as sharp,
and due to the grayscale tracking limits of this
monitor, black characters on white backgrounds have
a tiny annoying pink fringe in analog.

Go DVI-D. By the way, expect images and text to perhaps
be startlingly sharper until you get used to it.
The limitations of analog were providing some
full-screen anti-aliasing at no extra charge.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> There is noise resulting from taking the original
> digital raster to analog and back to digital.
> This might display, for example as horizontal
> artifacts, unstable picture regions, etc.
>
> Square waves? No chance. Think of a pattern of
> alternating white/black 1-pix dots. In analog,
> these need to exhibit sharp transitions and flat
> tops to emulate what you get for free with DVI-D.
> Bandwidth limits in the analog channels are apt
> to smear this fine detail.

But the panel is just doing an analog to digital conversion, anyway, and
the connection is analog even when it's DVI-D, so doesn't it all just
wash?

The image on mine is really sharp, it seems, and contrast is excellent.
No artifacts even under a loupe. It makes me wonder how much better it
could get.

> Group delay with analog introduces some risk that
> the pixel data won't exactly precisely align with
> the LCD triads upon reconstruction.

Doesn't the panel correct the time base for incoming analog signals or
something, in order to avoid this? Like the TBC in some video
equipment?

> Suppose the
> analog signal has a little group delay (time shift)
> from the DAC, or in the cable, or in the ADC (or
> just one of the colors does). Our hypothetical white
> and black dots might become a gray moire morass.

But the panel could delay everything and resync it to a new time base
and eliminate any movement, like TBCs do for video.

> I just compared the same screen in HD15 analog and
> DVI-D digital on my LCD, and the analog image has
> less "contrast", text characters are not as sharp,
> and due to the grayscale tracking limits of this
> monitor, black characters on white backgrounds have
> a tiny annoying pink fringe in analog.
>
> Go DVI-D. By the way, expect images and text to perhaps
> be startlingly sharper until you get used to it.
> The limitations of analog were providing some
> full-screen anti-aliasing at no extra charge.

I don't know if my video card provides it. I have an NVidia GeForce2
something-or-other, but I didn't notice a DVI-D plug on the card.
There's one on the monitor, though.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> But the panel is just doing an analog to digital
> conversion, anyway, ...

No. With a DVI-D connection, the discrete pixel digital
values are preserved from creation in the frame buffer
by the graphics driver all the way out to the individual
pixel drivers for the LCD triads.

> ... and the connection is analog even when it's DVI-D,

TMDS as I recall. Transition-Minimized Digital Signalling.
Ones and zeros. Everything is either a 1, a 0, or ignored.

>> ... group delay ...

> Doesn't the panel correct the time base for incoming
> analog signals or something, in order to avoid this?

I'd like to think so, but I wouldn't assume it.
Clearly, when we feed the monitor a non-native res,
it cannot match pixels, because the rasters don't map.
Interpolation time. It might still perform TBC to assure
that active signal period start/end precisely align with
the ADC's sampling aperture.
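
For instance (toy Python, hypothetical sizes): put a
1600-pixel-wide line on a 1280-pixel-wide panel and almost
every panel column lands between two source pixels.

src_w, panel_w = 1600, 1280
for x in (0, 256, 512, 768, 1024, 1279):      # a few panel columns
    s = x * (src_w - 1) / (panel_w - 1)       # where that column falls in the source raster
    print(x, "->", round(s, 2))               # non-integer: blend two source pixels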

>> Go DVI-D.

> I don't know if my video card provides it.

Most cards provide DVI-I ports, which have both one link
of DVI-D digital and RGB analog (sometimes called DVI-A,
plus a breakout-to-HD15 cable for analog use). By DVI-D,
I mean use the card's DVI port, and a DVI cable, and
assure yourself that if both signals are present, the
monitor is using the digital, and not the analog.

I'm using an aftermarket DVI-D cable which doesn't even
have the RGB wires - turned out to not be necessary to
be that drastic, but I didn't know that when I ordered
the cable.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> No. With a DVI-D connection, the discrete pixel digital
> values are preserved from creation in the frame buffer
> by the graphics driver all the way out to the individual
> pixel drivers for the LCD triads.

How many levels per pixel? An analog connection can have any number of
levels, depending only on the quality of the connection and hardware.
My card already generates 32-bit color, although my flat panel can't use
all that resolution.

> Most cards provide DVI-I ports, which have both one link
> of DVI-D digital and RGB analog (sometimes called DVI-A,
> plus a breakout-to-HD15 cable for analog use). By DVI-D,
> I mean use the card's DVI port, and a DVI cable, and
> assure yourself that if both signals are present, the
> monitor is using the digital, and not the analog.

I'll look again and see if there's a DVI-D plug, but I rather doubt it.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> With a DVI-D connection, the discrete pixel digital
>> values are preserved from creation in the frame buffer
>> by the graphics driver all the way out to the individual
>> pixel drivers for the LCD triads.

> How many levels per pixel?

I haven't looked at the DVI spec since 1.0, but at
that time, single-link DVI was limited to 24 bpp,
or 8-bits per R, G or B.

> My card already generates 32-bit color, although my
> flat panel can't use all that resolution.

It's not clear to me that contemporary LCD panels can
even deliver 24-bit color. They accept such signals,
but what they paint on screen is another matter.

> I'll look again and see if there's a DVI-D plug,
> but I rather doubt it.

The DVI(-I) connector is a D-sub, usually white body,
with an 8x3 grid of pins, plus a 2x2 grid with
cruciform ground planes for the RGB. If the
connector is DVI-D (digital only), it omits the
2x2 grid array and has only one of the ground blades.

If your card only has 15-pin Dsub(s) (usually blue),
then it only has analog video out. You cannot use a
pure digital connection, although you could invest in
a monitor with DVI, and use it in analog mode until
your next graphics card upgrade.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> I haven't looked at the DVI spec since 1.0, but at
> that time, single-link DVI was limited to 24 bpp,
> or 8-bits per R, G or B.

24-bit color isn't good enough for some applications. It doesn't
provide enough levels in some parts of the gamut, such as blue (the eye
is extremely sensitive to differences in intensity in the blue end of
the spectrum, so any kind of posterization is very easy to spot).
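
A rough illustration of the level-spacing problem, assuming a
linear encoding (Python, just arithmetic; the figures are
generic, not specific to blue):

for code in (5, 20, 100, 250):
    print(code, "->", code + 1, "is a", round(100 / code, 1), "% step")
# 5 -> 6 is a 20% jump in intensity; 250 -> 251 only 0.4%.
# Steps much beyond ~1% show up as banding in smooth gradients.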

> It's not clear to me that contemporary LCD panels can
> even deliver 24-bit color. They accept such signals,
> but what they paint on screen is another matter.

True for many cheaper CRTs, too. But it worries me that the standard
apparently was designed in a way that permanently limits it to 24-bit
color.

> The DVI(-I) connector is a D-sub, usually white body,
> with an 8x3 grid of pins, plus a 2x2 grid with
> cruciform ground planes for the RGB. If the
> connector is DVI-D (digital only), it omits the
> 2x2 grid array and has only one of the ground blades.

That doesn't sound familiar at all. The Web references I've found seem
to claim that my card should have this, but I don't see it.

> If your card only has 15-pin Dsub(s) (usually blue),
> then it only has analog video out. You cannot use a
> pure digital connection, although you could invest in
> a monitor with DVI, and use it in analog mode until
> your next graphics card upgrade.

This is what I've done. The performance even with analog input is very
impressive, and ClearType works very well, also, even though it's
supposedly designed for pure digital input.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> But it worries me that the [DVI] standard
> apparently was designed in a way that
> permanently limits it to 24-bit color.

Not permanent - it just requires dual-link, and
the pin assignments are already present in the
existing connector.

But yes, DVI was short-sighted. Another obvious
limit was that a single link can only hit 1600x1200
with normal blanking. Reduced-blanking timing (common) is
hacked in to hit 1920x1200, but beyond that, dual-link
is again required.
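
Rough numbers (Python; standard DMT / reduced-blanking totals
quoted from memory, so treat as approximate):

def pixel_rate(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz     # pixels/second, blanking included

LINK_LIMIT = 165e6                            # single TMDS link ceiling

for name, h_tot, v_tot in (("1600x1200 normal blanking ", 2160, 1250),
                           ("1920x1200 reduced blanking", 2080, 1235),
                           ("1920x1200 normal blanking ", 2592, 1245)):
    rate = pixel_rate(h_tot, v_tot, 60)
    print(name, round(rate / 1e6, 1), "MHz,",
          "fits" if rate <= LINK_LIMIT else "needs dual-link")
# roughly 162 / 154 / 194 MHz: the last one blows the single-link budget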

>> The DVI(-I) connector is a D-sub, usually ...

> That doesn't sound familiar at all.

Here's a lousy photo of a bulkhead with both
DVI and HD15:
<http://www.hardocp.com/image.html?image=MTA4OTIzNDA3NzZ6ZkxtNW1xRXpfMV82X2wuanBn>

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsiyzfkcvft8z8r@news.individual.net...
> > Mxsmanic <mxsmanic@hotmail.com> wrote:
>
> >> No question, I wouldn't buy one without it. [DVI-D]
>
> > Is there really that much difference?
>
> Yes. Several considerations:

Bob, as much as I hate to disagree with you, I'm afraid
I'd have to vote "maybe" instead. For the most part, the
differences between an analog and a digital interface for
LCD monitors come down to questions of pixel timing,
which really have nothing at all to do with whether the
video information is in digital or analog form.

The main factor in determining how good the displayed
image is going to be with an analog interface is the generation
of the proper clock with which to sample the analog video,
which is a question of both getting the frequency right and
making sure the clock is properly aligned with the incoming
video such that the samples are taken where the "pixels" are
supposed to be. (Being a continuous signal, of course, there
is no information contained within the video itself which
identifies the pixels.) Usually, the clock frequency is obtained
by locking on to the horizontal sync pulses and multiplying THAT
rate up to the assumed pixel rate; getting the alignment correct
(the "phase" adjustment) is a matter of the interface circuitry
making some educated guesses. But if the clock generation can
be done properly, there is very little to be gained by simply having
the pixel information in "digital" form. (And please consider how
truly awful the digital interface would be if the pixel clock information
were removed from it - it would be totally unusable. Hence my
assertion that it is timing, not the encoding of the information, that
is the key difference between these two types of interface.)
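
To put some illustrative numbers on that (Python; nominal
1280x1024 @ 60 Hz totals from memory):

h_active, h_total = 1280, 1688
v_total, refresh = 1066, 60

hsync_rate = v_total * refresh                # all the monitor can really measure: ~64 kHz
pixel_clock = hsync_rate * h_total            # multiplied up by the *assumed* total: ~108 MHz
print(round(hsync_rate / 1e3, 1), "kHz hsync ->",
      round(pixel_clock / 1e6, 1), "MHz sample clock")

# one pixel period is ~9.3 ns; guess h_total wrong, or mis-set the
# phase by a few ns, and samples land between pixels instead of on them
print(round(1e9 / pixel_clock, 2), "ns per pixel")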

>
> There is noise resulting from taking the original
> digital raster to analog and back to digital.
> This might display, for example as horizontal
> artifacts, unstable picture regions, etc.

Nope; all of the above have to do with the timing of
the pixel sampling process, not with noise in the video.
(Oddly enough, the LCD is NOT inherently a "digital"
device as is often assumed - fundamentally, the control
of the pixel brightness in any LCD is an analog process.
Simply having a discrete pixel array does not somehow
make a display "digital," nor does it necessarily mean that
a "digital" interface would have to be better.

> Square waves? No chance. Think of a pattern of
> alternating white/black 1-pix dots. In analog,
> these need to exhibit sharp transitions and flat
> tops to emulate what you get for free with DVI-D.
> Bandwidth limits in the analog channels are apt
> to smear this fine detail.

If we were talking about a display that actually shows
those edges, you'd have a point - but the LCD doesn't
work that way. Remember, we are dealing with a
SAMPLED analog video stream in this case; if the sample
points happen at the right time (which again is a question
of how well the pixel clock is generated), the pixel values
are taken right "in the middle" of the pixel times - making
the transitions completely irrelevant.

Note that "digital" interfaces also have what is in effect a
"bandwidth" limit (the peak pixel rate which can be supported),
and it is in current interfaces often significantly less than what
can be achieved with an "analog" connection. The single-link
TMDS-based interfaces such as DVI (in its single channel
form) and HDMI are both strictly limited to a pixel rate of
165 MHz, while analog connections (even with the lowly
VGA connector) routinely run with pixel rates in excess of
200 MHz.

> Group delay with analog introduces some risk that
> the pixel data won't exactly precisely align with
> the LCD triads upon reconstruction. Suppose the
> analog signal has a little group delay (time shift)
> from the DAC, or in the cable, or in the ADC (or
> just one of the colors does). Our hypothetical white
> and black dots might become a gray moire morass.

Right - but again, a timing issue, which gets back to the
question of the generation of the sampling clock, not the
encoding of the data (which is really all that the terms "analog"
and "digital" refer to). Again, take the clock away from
a digital interface, and see what THAT gives you.

So the logical question at this point is why no one has ever
bothered to include better timing information on the analog
interfaces. The answer now is: someone has. VESA released
a new analog interface standard this past year which does just
that - it includes a sampling clock reference, additional information
which helps to properly locate the sampling clock with respect
to the video stream, and even a system which makes the
determination of the white and black levels much more accurate.
This is called, oddly enough, the New Analog Video Interface
standard, or simply NAVI. NAVI is supportable on a standard
VGA connector, but the standard also includes the definition of a
new, higher-performance analog connector (similar to the analog
section of a DVI) for higher bandwidth and other features. It's
not clear yet how well NAVI will be accepted in the industry, but
it IS available if anyone chooses to use it.

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Bob Myers <nospamplease@address.invalid> wrote:

>>>> No question, I wouldn't buy one without it. [DVI-D]
>>> Is there really that much difference?
>> Yes. Several considerations:

> Bob, as much as I hate to disagree with you, ...

Sorry, but if I'm mistaken, netnews rules require you
to disagree :)

> For the most part, the differences between an analog
> and a digital interface for LCD monitors come down to
> questions of pixel timing, which really have nothing
> at all to do with whether the video information is in
> digital or analog form.

But there are opportunities for the signal to get
visibly degraded if it goes to analog before it gets
to the LCD panel lattice. In the entirely unscientific
test I just ran, where I saw exactly what I expected to
see, the analog happened to be running through two 2m
lengths of HD15 cable and a KVM switch. The LCD image
went from pixel-perfect to slightly fuzzy, and perhaps
also reduced "contrast".

> (And please consider how truly awful the digital
> interface would be if the pixel clock information
> were removed from it - it would be totally unusable.

Well, that's the classic promise and peril of digital.
It's either as perfect as it ever gets, or it's not
there at all, whereas analog may never be perfect
enough, and opportunities for degradation abound.

>> There is noise resulting from taking the original
>> digital raster to analog and back to digital.
>> This might display, for example as horizontal
>> artifacts, unstable picture regions, etc.
>
> Nope; all of the above have to do with the timing of
> the pixel sampling process, not with noise in the video.

Umm, if the bits in the frame buffer are going thru a
DAC (which can introduce noise and distortion), then
thru a cable (which <ditto>), even if the LCD is not using
an ADC, and is using the analog signal directly, that
extra noise and distortion may show up on screen.

> (Oddly enough, the LCD is NOT inherently a "digital"
> device as is often assumed - fundamentally, the control
> of the pixel brightness in any LCD is an analog process.

I sorta suspected that, but in the DVI-D model, the
signal remains digital until it hits the rows & columns, no?

Does the typical analog-only LCD have a DAC? Or does it
just sample the analog signal and route values to drivers?
My guess is that due to the interpolation required for
handling arbitrary resolutions, there is a local frame
buffer, and the analog video is [re]digitized before
hitting the pel drivers.

> If we were talking about a display that actually shows
> those edges, you'd have a point - but the LCD doesn't
> work that way. Remember, we are dealing with a
> SAMPLED analog video stream in this case; if the sample
> points happen at the right time (which again is a question
> of how well the pixel clock is generated), the pixel values
> are taken right "in the middle" of the pixel times - making
> the transitions completely irrelevant.

Even if the clocks align, there's also the matter of
whether or not the analog signal has completely slewed
to the value needed. If the DAC-cable-ADC path has
bandwidth-limited (softened) the transitions, or
introduced color-to-color skews, that will show up.
I see it, or something like it, doing analog on my LCD.

> ... the New Analog Video Interface standard, or simply
> NAVI. ... It's not clear yet how well NAVI will be
> accepted in the industry, but it IS available if
> anyone chooses to use it.

I suspect it's irrelevant at this point. Analog is
the "economy" graphics connect now, and what we have
is sufficient for the market.

I think it more likely that the analog economy model
will be replaced by a digital economy model, where PC
main RAM is used for frame buffer, and the graphics
"card" (if any) is just a TMDS driver chip with a
DVI-D connector on the bulkhead, something like the
"ADD2" cards I see at <www.molex.com>.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:ip9sr0pah4uh0eido84mnj4vov20tadouk@4ax.com...
> But the panel is just doing an analog to digital conversion, anyway, and
> the connection is analog even when it's DVI-D, so doesn't it all just
> wash?

And actually, the panel is THEN doing a digital to analog
conversion; the LCD column drivers are basically just a series of
D/A converters in parallel. The basic drive for an LCD is an
analog voltage.

> The image on mine is really sharp, it seems, and contrast is excellent.
> No artifacts even under a loupe. It makes me wonder how much better it
> could get.

Clearly, in your case, not much at all. You have a monitor with
a well-implemented analog front end.


> Doesn't the panel correct the time base for incoming analog signals or
> something, in order to avoid this? Like the TBC in some video
> equipment?

If the analog front end is doing its job properly, yes. This
comes in the form of aligning the sampling clock with the
incoming video stream.


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsiy0uld6ft8z8r@news.individual.net...

> No. With a DVI-D connection, the discrete pixel digital
> values are preserved from creation in the frame buffer
> by the graphics driver all the way out to the individual
> pixel drivers for the LCD triads.

Well, they MIGHT be. In either an analog or digital
interfaced LCD monitor, there is typically a look-up table
in the monitor's front end which converts these values into
somewhat different ones, in order to correct for the rather
S-shaped (as opposed to a nice CRT-like "gamma" curve)
response of the typical LC panel. In any event, though,
whether or not having the information preserved in digital
form is an advantage in terms of accuracy depends solely on
whether or not the analog video signal is generated with, and
can be read with, similar accuracy. 8 bits/color accuracy in
an 0.7V analog signal says that the value of the LSB is about
0.7/255 = 2.7 mV. At least with a VGA connection, it is
difficult to get that level of accuracy on an instantaneous sample,
but fortunately in a video situation what is actually perceived
is the average of many samples, so this sort of visual
performance is not out of the question. The best test, in
either case, would be to take a look at a "gray scale" test
pattern with the appropriate number of values, and see if you're
satisfied with the result.
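
The arithmetic behind that figure, for a few bit depths (Python):

full_scale = 0.7                              # volts, black to reference white
for bits in (6, 8, 10):
    lsb_mv = full_scale / (2**bits - 1) * 1000
    print(bits, "bits/color ->", round(lsb_mv, 2), "mV per step")
# 8 bits -> ~2.75 mV; 10 bits would demand sub-millivolt accuracy per sample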


> TMDS as I recall. Transition-Minimized Digital Signalling.
> Ones and zeros. Everything is either a 1, a 0, or ignored.

Close - Transition Minimized Differential Signalling, referring
to both the encoding method used and the fact that the individual
data connections are current-differential pairs (sort of). But
the notion that everything in a "digital" connection is either a
1 or a 0 or ignored is somewhat misleading; nothing is ignored,
and it's a question of both the original transmitted data AND
noise on the line as to whether the received information will be
interpreted as a 1 or a 0 ("ignored" to the receiver is not possible;
it HAS to be interpreted as a 1 or a 0, as those are the only
possible outputs.) Digital connections are certainly not immune
to noise - they simply respond to it in a different manner. (Analog
degrades more gracefully in the presence of noise, as the LSBs
are effectively lost first; in "digital," everything stays great right
up to the point where the noise margin is exceeded, and then
everything is lost completely.)

>
> >> ... group delay ...
>

> I'd like to think so, but I wouldn't assume it.
> Clearly, when we feed the monitor a non-native res,
> it cannot match pixels, because the rasters don't map.

Again, this is not a distinction between analog and digital
interfaces. In both cases, the incoming video information
is sampled at its "native" mode (i.e., if you have an analog
interface carrying, say, a 1280 x 1024 image, then there will
be 1280 samples taken per active line, no matter what the
panel format is). Image scaling is done later in the pipe, in
both analog- and digital-input cases. (It would be far worse
in the digital case if this were not true.)


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Bob Myers <nospamplease@address.invalid> wrote:

> In either an analog or digital interfaced LCD monitor,
> there is typically a look-up table in the monitor's
> front end which converts these values into somewhat
> different ones, in order to correct for the rather
> S-shaped (as opposed to a nice CRT-like "gamma" curve)

The monitor knows that the incoming data will be
pre-compensated to a gamma (power curve) in the 1.8 ... 2.6
range, or maybe be linear (no pre-compensation).

Why doesn't the look-up table more fully adjust out the
S-curve, so that color errors can be corrected
with the simple exponent adjustment of typical graphics
card gamma control menus?

My guess is that because LCD subpixels are just barely
8-bit, a full correction might minimize color errors at
the expense of introducing visible terracing in gradients.

And the solution relies on future 10-bit panels.
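
A toy Python illustration of that guess, pretending the needed
correction were a simple 1/2.2 power curve (the real panel
response is S-shaped, but the rounding problem is the same):

# 8-bit in, 8-bit out: many input codes collapse onto the same output
lut8 = [round(255 * (c / 255) ** (1 / 2.2)) for c in range(256)]
print(len(set(lut8)), "distinct output codes out of 256")  # well under 256 -> terracing

# the same correction into a 10-bit panel keeps every level distinct
lut10 = [round(1023 * (c / 255) ** (1 / 2.2)) for c in range(256)]
print(len(set(lut10)), "distinct output codes")            # 256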

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:p1csr01gsvonbu3om7cb7i9gdd64v9t1fq@4ax.com...
> How many levels per pixel? An analog connection can have any number of
> levels, depending only on the quality of the connection and hardware.
> My card already generates 32-bit color, although my flat panel can't use
> all that resolution.

Slight correction time here - "32-bit color" generally does NOT
imply more than 8 bits per primary. What is called "32 bit color"
in the PC world is really just a way to align 24 bits of information
(8 each of RGB) within a four-byte space, for ease of handling the
data (as opposed to having the video information coming in three-byte
packets).
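
In code terms (Python sketch), the fourth byte is just padding
(or alpha), not extra color precision:

def pack_xrgb(r, g, b):
    return (r << 16) | (g << 8) | b           # top byte stays 0; 24 bits of actual color

def unpack_xrgb(word):
    return (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF

w = pack_xrgb(200, 120, 40)
print(hex(w), unpack_xrgb(w))                 # 0xc87828 (200, 120, 40)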


> > Most cards provide DVI-I ports, which have both one link
> > of DVI-D digital and RGB analog (sometimes called DVI-A,
> > plus a breakout-to-HD15 cable for analog use). By DVI-D,
> > I mean use the card's DVI port, and a DVI cable, and
> > assure yourself that if both signals are present, the
> > monitor is using the digital, and not the analog.
>
> I'll look again and see if there's a DVI-D plug, but I rather doubt it.

To clarify the DVI terminology here:

DVI-I is a DVI implementation in which RGB analog signals
are provided along with one OR two TMDS links; either interface
may be used, although they might not have identical capabilities.
(Often, two different EDID files will be provided in the monitor,
each of which describes the monitor's capabilities on one of the
two interfaces.)

DVI-D is a variant of DVI which does not carry analog video,
and so does not provide pins in that part of the connector. It
too may provide either one or two TMDS links, AKA "channels."
Each channel carries three data pairs, and has a capacity of
up to 165 Mpixels/second, 24 bits/pixel.
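
The raw-rate arithmetic, for the curious (Python; the 10-bit
TMDS character size is the encoding overhead):

pixel_rate = 165e6                            # pixels/second on one link
bits_per_pix = 24                             # 8 bits per color, one color per data pair
tmds_char_bits = 10                           # each 8-bit value travels as a 10-bit character

print(pixel_rate * tmds_char_bits / 1e9, "Gbit/s on each of the three pairs")  # 1.65
print(pixel_rate * bits_per_pix / 1e9, "Gbit/s of pixel data per link")        # 3.96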


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:i6usr0td89lmed0d0svan3jtmrj4d83kva@4ax.com...

> 24-bit color isn't good enough for some applications. It doesn't
> provide enough levels in some parts of the gamut, such as blue (the eye
> is extremely sensitive to differences in intensity in the blue end of
> the spectrum, so any kind of posterization is very easy to spot).

Well, actually, it's the green region of the spectrum
where the eye has its best discrimination ability, but that's
beside the point. You're right in noting that 8 bits/color is
not sufficient for many demanding applications, especially if
a linear encoding is assumed. Somewhere in the 10-12 bit
region is generally considered adequate for just about anything,
though.

>
> > It's not clear to me that contemporary LCD panels can
> > even deliver 24-bit color. They accept such signals,
> > but what they paint on screen is another matter.
>
> True for many cheaper CRTs, too. But it worries me that the standard
> apparently was designed in a way that permanently limits it to 24-bit
> color.

Yes and no. DVI DOES provide the option of a second
link, which could be used for either greater "color depth"
or support for higher "resolutions" (in the pixel format
sense of the word). It's just rarely used in this manner, in
part due to the lack of panels which support more than 8 bits
per primary at present.


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> Here's a lousy photo of a bulkhead with both
> DVI and HD15:

Nope, I don't have that.

By the way, does anyone build video cards optimized for 2D, photographic
and prepress use, instead of always emphasizing animation and 3D
performance?

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> By the way, does anyone build video cards optimized
> for 2D, photographic and prepress use, instead of
> always emphasizing animation and 3D performance?

Yes. Matrox (which is what I use):
<http://www.matrox.com/mga/home.htm>

They are way behind in 3D perf, and only just
announced their first PCI-Express card.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> Bob, as much as I hate to disagree with you, I'm afraid
> I'd have to vote "maybe" instead. For the most part, the
> differences between an analog and a digital interface for
> LCD monitors come down to questions of pixel timing,
> which really have nothing at all to do with whether the
> video information is in digital or analog form.

The best analog system will always beat the performance of the best
digital system. There's nothing about analog technology that makes it
intrinsically inferior to digital, so a good video card and a good
monitor should meet or beat any digital interface, I should think.

This is why the _best_ analog audio systems can consistently beat the
best digital systems. However, the superior performance comes at a
price that is usually all out of proportion with the increment of gain
over digital.

> Oddly enough, the LCD is NOT inherently a "digital"
> device as is often assumed - fundamentally, the control
> of the pixel brightness in any LCD is an analog process.

Every interface between the digital world and the physical world is
analog, so all input and output devices are ultimately analog devices.
"Digital" only means something in the conceptual world of information
representation.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> The best analog system will always beat the
> performance of the best digital system.

Depending on how you define "best", as we saw with the
early debates about CD audio. Now that purists can get
24-bit, 96 kHz digital audio, I don't see that debate
anymore.

> Every interface between the digital world and the
> physical world is analog, ...

Not at the quantum level.
Expect the physicists to sail in here and dispute that :)

Is anyone prepared to argue that using an HD15 analog
connection to an LCD monitor provides a "better" presentation?

It's conceivable, due to the anti-aliasing provided by the
analog blur. I was actually a bit startled by how crisp
the screen was using the DVI-D connection. In my CAD work,
I now always see stair-casing of angled and curved lines,
whereas on the CRT monitor (same res), they were smooth.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> Well, that's the classic promise and peril of digital.
> It's either as perfect as it ever gets, or it's not
> there at all, whereas analog may never be perfect
> enough, and opportunities for degradation abound.

Analog can also be more perfect than digital. In fact, it is always
possible to build an analog system that is superior to any given digital
system--if money is no object.

> Umm, if the bits in the frame buffer are going thru a
> DAC (which can introduce noise and distortion), then
> thru a cable (which <ditto>), even if the LCD is not using
> an ADC, and is using the analog signal directly, that
> extra noise and distortion may show up on screen.

Sure, but the question is whether or not it actually does to any visible
extent in the real world.

I've found that, in many respects, PC video systems perform better than
they are supposed to. For all the noise one hears about the horrors of
analog systems, in real life they perform amazingly well. Look no
further than the continuing superiority of CRTs for most aspects of
image quality for proof.

> I suspect it's irrelevant at this point. Analog is
> the "economy" graphics connect now, and what we have
> is sufficient for the market.

Economy perhaps, but that isn't always correlated with quality.

> I think it more likely that the analog economy model
> will be replaced by a digital economy model, where PC
> main RAM is used for frame buffer, and the graphics
> "card" (if any) is just a TMDS driver chip with a
> DVI-D connector on the bulkhead, something like the
> "ADD2" cards I see at <www.molex.com>.

I suspect the current "high-performance" digital models will become the
"digital economy" models, in time.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> My guess is that because LCD subpixels are just barely
> 8-bit, a full correction might minimize color errors at
> the expense of introducing visible terracing in gradients.

The incoming data might be 8-bit, but there's no reason why the internal
correction of the monitor can't be carried out with much higher
granularity.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.