
Is DVI connector worth it?

Anonymous
July 6, 2005 10:50:55 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

A friend has got a new LCD monitor and he wants to know if connecting
via a DVI connector would improve the quality over the VGA connectors? I
would suspect it's imperceptible. He doesn't have a video card with a
DVI connector yet, but he wants to get one if there's any difference.

Yousuf Khan

Anonymous
July 6, 2005 10:50:56 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Using a DVI-out video card thru a DVI cable to a DVI-in LCD monitor is
considerably superior to using analog VGA inputs and outputs. When using
all DVI input/outputs the video signal remains constantly in the digital
domain. However, when using VGA input/outputs the signal originates as a
digital signal in the video card, is converted to an analog signal for the
video card output, travels thru the cable as an analog signal, and then at
the LCD has to be reconverted from an analog signal back to a digital signal
that the monitor can use internally. All these signal conversions corrupt
the signal leading to a degraded image on the LCD monitor.
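
To put rough numbers on that round trip, here is a small Python sketch; it
assumes an ideal 8-bit DAC and ADC, the nominal 0.7 V VGA swing, and an
invented 2 mV RMS of cable noise, so the output is purely illustrative:

import numpy as np

# One 8-bit grayscale ramp, as the video card holds it digitally.
codes = np.arange(256, dtype=np.float64)

# D/A in the card: map 0..255 onto the nominal 0..0.7 V VGA swing.
volts = codes / 255.0 * 0.7

# Cable/termination noise -- the 2 mV RMS figure is assumed.
rng = np.random.default_rng(0)
noisy = volts + rng.normal(0.0, 0.002, size=volts.shape)

# A/D in the monitor: back to 8-bit codes.
recovered = np.clip(np.round(noisy / 0.7 * 255.0), 0, 255)

changed = np.count_nonzero(recovered != codes)
print(f"{changed} of 256 codes shifted, worst error "
      f"{np.max(np.abs(recovered - codes)):.0f} LSB")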

--
DaveW



"Yousuf Khan" <bbbl67@ezrs.com> wrote in message
news:NaadnX-dvfzN_VHfRVn-iQ@rogers.com...
>A friend has got a new LCD monitor and he wants to know if connecting via a
>DVI connector would improve the quality over the VGA connectors? I would
>suspect it's imperceptible. He doesn't have a video card with a DVI
>connector yet, but he wants to get one if there's any difference.
>
> Yousuf Khan
Anonymous
July 6, 2005 10:54:00 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

DaveW wrote:
> Using a DVI-out video card thru a DVI cable to a DVI-in LCD monitor is
> considerably superior to using analog VGA inputs and outputs. When using
> all DVI input/outputs the video signal remains constantly in the digital
> domain. However, when using VGA input/outputs the signal originates as a
> digital signal in the video card, is converted to an analog signal for the
> video card output, travels thru the cable as an analog signal, and then at
> the LCD has to be reconverted from an analog signal back to a digital signal
> that the monitor can use internally. All these signal conversions corrupt
> the signal leading to a degraded image on the LCD monitor.

Yes, yes, understood, that's the theory, what's the reality? Is there
any noticeable difference?

Yousuf Khan
Anonymous
July 7, 2005 1:18:32 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Yousuf Khan wrote:

> A friend has got a new LCD monitor and he wants to know if connecting
> via a DVI connector would improve the quality over the VGA connectors? I
> would suspect it's imperceptible. He doesn't have a video card with a
> DVI connector yet, but he wants to get one if there's any difference.

Only way to tell for sure is to try it. On some monitors there's a huge
difference, on others maybe an expert who is looking for differences can
tell but the average person can't. The newer monitors tend to do better
with analog signals than do the older ones.

> Yousuf Khan

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
Anonymous
July 7, 2005 2:12:23 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

DaveW writes:

> Using a DVI-out video card thru a DVI cable to a DVI-in LCD monitor is
> considerably superior to using analog VGA inputs and outputs. When using
> all DVI input/outputs the video signal remains constantly in the digital
> domain. However, when using VGA input/outputs the signal originates as a
> digital signal in the video card, is converted to an analog signal for the
> video card output, travels thru the cable as an analog signal, and then at
> the LCD has to be reconverted from an analog signal back to a digital signal
> that the monitor can use internally.

The final signal is analog, not digital, even in an LCD monitor. LCDs
are not digital devices. There aren't any digital input or output
devices, in fact.

I keep hearing that digital is vastly better, but I can't see anything
wrong with the analog signal on my LCD monitor. What _exactly_ is
different with digital? The pixels look perfect on my monitor; how
can they become "more perfect"?
Anonymous
July 7, 2005 2:30:22 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Yousuf Khan writes:

> Yes, yes, understood, that's the theory, what's the reality? Is there
> any noticeable difference?

I'm currently trying to get the DVI connection on my configuration
working to test this out. However, I can say that with an Asus video
card and an Eizo monitor, examination of the screen under a magnifying
glass reveals no "bleeding" of image from one pixel to the next, and
that is really the only thing that could be worse about analog. I
therefore wonder what DVI will bring me, which is why I've been hoping
to try it to see, now that I have both a card and a monitor with which
to try it.

Remember, no digital system can ever be superior to the best analog
system.
Anonymous
July 7, 2005 3:33:27 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers wrote:
> Quite often, the answer is no.
>
> There's actually very little accumulated error in the video signal from the
> D/A and A/D conversions (Dave forgot to mention one - at the end,
> there's a conversion back to analog form, since fundamentally LCDs are
> analog-drive devices. This happens at the column drivers in the panel.).
> Whatever amplitude error may be introduced in these conversions averages out over
> successive frames, so that there is little or no visible impact on the image
> quality.

Yeah, that's basically what I've been hearing from asking around
recently. Several people that I've asked said they couldn't tell the
difference between the DVI interface and the VGA one.

> (Current digital interfaces may in some applications be at a
> disadvantage here, in fact, as analog video systems are very often capable
> of better than 8 bit/component accuracy).

I see what you mean, even if the digital cables were capable of greater
than 8 bits/component would the digital internals of the LCDs be able
to display anything greater than 8 bits/component? So far, my friend
has been unimpressed with the quality of the picture of his LCD
compared to his old CRT.

Yousuf Khan
Anonymous
July 7, 2005 9:55:00 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Yousuf Khan" <yjkhan@gmail.com> wrote in message
news:1120701240.043850.145340@g43g2000cwa.googlegroups.com...
> DaveW wrote:
> > Using a DVI-out video card thru a DVI cable to a DVI-in LCD monitor is
> > considerably superior to using analog VGA inputs and outputs. When using
> > all DVI input/outputs the video signal remains constantly in the digital
> > domain. However, when using VGA input/outputs the signal originates as a
> > digital signal in the video card, is converted to an analog signal for the
> > video card output, travels thru the cable as an analog signal, and then at
> > the LCD has to be reconverted from an analog signal back to a digital signal
> > that the monitor can use internally. All these signal conversions corrupt
> > the signal leading to a degraded image on the LCD monitor.
>
> Yes, yes, understood, that's the theory, what's the reality? Is there
> any noticeable difference?

Quite often, the answer is no.

There's actually very little accumulated error in the video signal from the
D/A and A/D conversions (Dave forgot to mention one - at the end,
there's a conversion back to analog form, since fundamentally LCDs are
analog-drive devices. This happens at the column drivers in the panel.).
Whatever amplitude error may be introduced in these conversions averages out over
successive frames, so that there is little or no visible impact on the image
quality. (Current digital interfaces may in some applications be at a
disadvantage here, in fact, as analog video systems are very often capable
of better than 8 bit/component accuracy). The real visible difference
between "analog" and "digital" inputs on monitors has to do with the
pixel-level timing, and depends on how accurately the monitor can
produce a stable sampling clock with which to sample the incoming
analog video signals. (If the VGA interface carried pixel-level timing
information, visible differences between it and "digital" connection would
vanish in almost all mainstream applications. Standards which would
provide such information have been developed, but to date have not
been widely adopted.)
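
A small Python sketch of that sampling-clock point, under simplified
assumptions (a piecewise-linear model of the "analog" waveform, an invented
30%-of-a-pixel settling time, and a worst-case alternating black/white test
line):

import numpy as np

# Worst-case line for an analog input: alternating black/white pixels.
line = np.tile([0.0, 1.0], 320)            # 640 pixels, 0 = black, 1 = white
T, rise = 1.0, 0.3                         # pixel period; assumed settling time

# Each pixel reaches its value after `rise` and holds it to the boundary.
n = len(line)
xp = np.empty(2 * n)
fp = np.empty(2 * n)
xp[0::2] = np.arange(n) * T + rise         # value reached here...
xp[1::2] = np.arange(1, n + 1) * T         # ...and held until here
fp[0::2] = line
fp[1::2] = line

def sample(phase):
    """Sample once per pixel at the given fraction of a pixel period."""
    t = (np.arange(n) + phase) * T
    return np.interp(t, xp, fp)

for phase in (0.05, 0.20, 0.65):           # 0.65 ~= a well-centred sample point
    err = np.mean(np.abs(sample(phase) - line))
    print(f"sampling at {phase:.2f} of a pixel -> mean error {err:.2f}")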

Bob M.
Anonymous
July 7, 2005 9:57:01 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@gmail.com> wrote in message
news:7tppc11u7inl4kkvpo9f7e8eb3k2ksg218@4ax.com...
> Remember, no digital system can ever be superior to the best analog
> system.

This is also a common misconception. You cannot make sweeping
claims of superiority for either digital or analog encoding per se; the
best one can ever hope to do is to compare specific implementations
of these against whatever criteria are important for a given application.
Ideas that "analog" systems somehow provide "infinite" accuracy or
the equivalent are basically nonsense.

Bob M.
Anonymous
July 7, 2005 11:13:30 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Yousuf Khan" <yjkhan@gmail.com> wrote in message
news:1120761207.548923.13170@g14g2000cwa.googlegroups.com...
> I see what you mean, even if the digital cables were capable of greater
> than 8 bits/component would the digital internals of the LCDs be able
> to display anything greater than 8 bits/component? So far, my friend
> has been unimpressed with the quality of the picture of his LCD
> compared to his old CRT.

So far, LCDs are typically 6 or 8 bits per primary. 10-bit and
even 12-bit drivers are now starting to come to the market, though
(this is primarily happening at the high end, i.e., the LCD TV panel
market), so eventually we will see better performance in this
regard.
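
For scale (plain arithmetic, nothing panel-specific):

for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    print(f"{bits:2d} bits/primary -> {levels:5d} levels, {levels ** 3:,} colors")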

Bob M.
Anonymous
July 8, 2005 1:34:28 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> This is also a common misconception.

It's an unavoidable fact. You cannot build any digital system that
interfaces with the real world that is superior to the best analog
system, period. The reason for this is that all interfaces are
analog, therefore no system can ever be superior to the best analog
system.

> You cannot make sweeping claims of superiority for either digital
> or analog encoding per se ...

Encoding is not important. It's the physical interface with the real
world that is important. And it is always an analog interface.

> Ideas that "analog" systems somehow provide "infinite" accuracy or
> the equivalent are basically nonsense.

They provide it in theory, but not in practice. Conversely, some
digital systems provide nearly infinite accuracy in practice, but not
in theory.

The important thing to remember is that no physical interface can be
digital. Therefore no digital system that interfaces with the
physical world can ever be better than the best analog system.

An inevitable consequence of this is that it will always be possible
to build better analog audio or video systems than any digital system.
Anonymous
July 8, 2005 2:41:14 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@gmail.com> wrote in message
news:jo0rc1tkagtr98ijt6timknaqsnkc5k485@4ax.com...
> Bob Myers writes:
>
> > This is also a common misconception.
>
> It's an unavoidable fact. You cannot build any digital system that
> interfaces with the real world that is superior to the best analog
> system, period. The reason for this is that all interfaces are
> analog, therefore no system can ever be superior to the best analog
> system.

But the "best analog system" is more than just an interface. As
long as the digital system is capable of capturing all information
presented by this hypothetical analog interface, and then
conveying it in a lossless manner, it would be superior to an
analog system which would by necessity introduce additional
noise into the signal once you're past the "interface."

> > Ideas that "analog" systems somehow provide "infinite" accuracy or
> > the equivalent are basically nonsense.
>
> They provide it in theory, but not in practice. Conversely, some
> digital systems provide nearly infinite accuracy in practice, but not
> in theory.

Actually, analog systems cannot provide "infinite" accuracy
even in theory. ANY information transmission, regardless of
whether in "digital" or "analog" form, is limited in its information
capacity by the available channel bandwidth and noise level, per
Shannon's theorem. Infinite accuracy implies an infinite information
capacity (i.e., how many decimal places do you require for
"infinite" precision?), and this would require infinite bandwidth or
precisely zero noise, neither of which is even theoretically possible.
Your last sentence is nonsensical; it implies that there are digital
systems which provide, in practice, better accuracy than can be
explained by the theory underlying their operation!
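
A quick illustration of that bound, using the per-sample form of Shannon's
theorem; the SNR figures are arbitrary examples, not measurements of any
particular interface:

import math

def bits_per_sample(snr_db):
    """Shannon bound on amplitude information per Nyquist-rate sample."""
    snr = 10 ** (snr_db / 10)
    return 0.5 * math.log2(1 + snr)

for snr_db in (30, 50, 70):
    print(f"{snr_db} dB SNR -> at most ~{bits_per_sample(snr_db):.1f} "
          "bits of amplitude information per sample")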


> The important thing to remember is that no physical interface can be
> digital.

This is not correct. Transducers have been designed which essentially
do a direct conversion of certain types of real-world parameters
directly into numeric or "digital" form; it's just that they generally
have not been very practical to implement, or provided any real
advantages over other approaches.


Bob M.
Anonymous
July 8, 2005 9:05:06 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> But the "best analog system" is more than just an interface.

Yes, but since the interface is always analog, the best digital system
can never be better than the best analog system.

Digital systems are simply analog systems with a non-zero threshold
for information content in the signal-to-noise ratio. Analog systems
treat noise as signal; digital systems ignore noise below a certain
threshold. Digital sacrifices capacity in order to do this.

> As long as the digital system is capable of capturing all information
> presented by this hypothetical analog interface, and then
> conveying it in a lossless manner, it would be superior than an
> analog system which would by necessity introduce additional
> noise into the signal once you're past the "interface."

It depends. See above. Digital systems achieve lossless information
recording and transfer by sacrificing bandwidth. This works as long
as one remains in the realm of pure information. It doesn't and
cannot work for the final physical interfaces (which are always
analog) at either end.

Ultimately, then, you can build an analog system that will meet or
beat any digital system. The reason this doesn't actually happen is
that, up to a certain point, analog systems of this kind are much more
expensive than digital systems. By sacrificing capacity you can
greatly reduce cost and keep errors arbitrarily low (although you
cannot eliminate them).

> Actually, analog systems cannot provide "infinite" accuracy
> even in theory.

In theory, they can provide perfect accuracy; in fact, in theory, they
do this by definition.

> Infinite accuracy implies an infinite information
> capacity (i.e., how many decimal places do you require for
> "infinite" precision?), and this would require infinite bandwidth or
> precisely zero noise, neither of which is even theoretically possible.

There is nothing that theoretically forbids zero noise. At the quantum
level some interactions are lossless, in fact, but they are hard to
use in a practical way. Think superconductivity, for example.

> This is not correct. Transducers have been designed which essentially
> do a direct conversion of certain types of real-world parameters
> directly into numeric or "digital" form; it's just that they generally
> have not been very practical to implement, or provided any real
> advantages over other approaches.

No, they are analog devices as well. All such interfaces are analog.
Anonymous
July 8, 2005 11:38:57 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers wrote:
> So far, LCDs are typically 6 or 8 bits per primary. 10-bit and
> even 12-bit drivers are now starting to come to the market, though
> (this is primarily happening at the high end, i.e., the LCD TV panel
> market), so eventually we will see better performance in this
> regard.

Oh really? They don't typically advertise this feature on LCDs do they?
They talk about brightness & contrast ratios, update speeds, etc., but
not internal precision.

Yousuf Khan
Anonymous
July 8, 2005 2:46:30 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Mxsmanic wrote:

>Ultimately, then, you can build an analog system that will meet or
>beat any digital system. The reason this doesn't actually happen is
>that, up to a certain point, analog systems of this kind are much more
>expensive than digital systems. By sacrificing capacity you can
>greatly reduce cost and keep errors arbitrarily low (although you
>cannot eliminate them).

Not true.

>> Actually, analog systems cannot provide "infinite" accuracy
>> even in theory.
>
>In theory, they can provide perfect accuracy; in fact, in theory, they
>do this by definition.

Wrong again.
Anonymous
July 8, 2005 9:57:42 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@gmail.com> wrote in message
news:ovqrc1ptd23opso0l5upt2g8rbvi3srpht@4ax.com...
> Bob Myers writes:
>
> > But the "best analog system" is more than just an interface.
>
> Yes, but since the interface is always analog, the best digital system
> can never be better than the best analog system.

OK, well, we've had this discussion before. I've said what I have
to say both here and in previous posts on the subject, and those
(and other sources) are available to anyone who at this point still cares
enough to look further into this. These same sources are available to
you as well, so feel free to investigate this further. As you should.

Bob M.
Anonymous
July 8, 2005 9:58:50 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Yousuf Khan" <yjkhan@gmail.com> wrote in message
news:1120833537.803045.181350@g44g2000cwa.googlegroups.com...
> Bob Myers wrote:
> > So far, LCDs are typically 6 or 8 bits per primary. 10-bit and
> > even 12-bit drivers are now starting to come to the market, though
> > (this is primarily happening at the high end, i.e., the LCD TV panel
> > market), so eventually we will see better performance in this
> > regard.
>
> Oh really? They don't typically advertise this feature on LCDs do they?
> They talk about brightness & contrast ratios, update speeds, etc., but
> not internal precision.

Well, it's not exactly a secret. And the vast majority of display
interfaces and systems only provide 8 bits per component anyway,
so the point is in most cases moot.

Bob M.
Anonymous
July 9, 2005 1:32:11 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers wrote >>
>> So far, LCDs are typically 6 or 8 bits per primary.

Yousuf Khan wrote >
> Oh really? They don't typically advertise this feature on LCDs do they?

Nope. They hope you don't notice that light grays are pink
(for example, your results may vary), and that any gamma
correction in your app, card driver or the monitor itself
cannot fully fix it without introducing other more
objectionable artifacts, such as visible terracing.
This is independent of whether the connection is analog
or digital.

Anyone doing carefully color-managed work needs to test
any contemplated LCD display. My guess is that few can
track a gray scale as accurately as, say, a Sony Artisan
(GDM-C520K, the only CRT computer display Sony still makes).
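
A quick Python sketch of the terracing point, assuming simple truncation to
6 bits and an arbitrarily chosen gamma exponent (real panels add
dithering/FRC tricks that this ignores):

import numpy as np

ramp = np.arange(256)                      # a smooth 8-bit gray ramp

# A 6-bit panel can only render 64 distinct drive levels.
print("distinct grays on a 6-bit panel:",
      len(np.unique((ramp >> 2) << 2)))

# An 8-bit gamma LUT merges and skips codes, so fixing gray tracking
# this way trades accuracy for coarser, more visible steps.
lut = np.clip(np.round(255 * (ramp / 255.0) ** (1 / 1.25)), 0, 255).astype(int)
print("distinct grays left after an 8-bit gamma tweak:", len(np.unique(lut)))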

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
Anonymous
July 9, 2005 3:03:22 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

chrisv writes:

> Not true.

It is both true and inevitable. It is widely unknown, however. Most
people have no clue about what "digital" actually means.
Anonymous
July 11, 2005 1:41:33 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@gmail.com> wrote in message
news:7sopc1p3ibnetlouo541jmttj7onknec6c@4ax.com...
> DaveW writes:
>
>> Using a DVI-out video card thru a DVI cable to a DVI-in LCD monitor is
>> considerably superior to using analog VGA inputs and outputs. When using
>> all DVI input/outputs the video signal remains constantly in the digital
>> domain. However, when using VGA input/outputs the signal originates as a
>> digital signal in the video card, is converted to an analog signal for the
>> video card output, travels thru the cable as an analog signal, and then at
>> the LCD has to be reconverted from an analog signal back to a digital signal
>> that the monitor can use internally.
>
> The final signal is analog, not digital, even in an LCD monitor. LCDs
> are not digital devices. There aren't any digital input or output
> devices, in fact.
>
> I keep hearing that digital is vastly better, but I can't see anything
> wrong with the analog signal on my LCD monitor. What _exactly_ is
> different with digital? The pixels look perfect on my monitor; how
> can they become "more perfect"?

Indeed the final signal to the LCD cell must have an analog value, to be
able to adjust the proportional transmittance of the cell according to the
original value determined in the graphics bitmap. What is at issue in the
discussion is the integrity of passing that original value (which
incidentally is numeric) along until it becomes the analog value needed at
the cell.
Historically there have been issues with the integrity of passing the
values along the signal chain in analog fashion, so a lot of thought was
put into a way to maintain the numeric value's integrity until the final
conversion is made. That resulted in "digital" interfaces, which in fact
did demonstrate the ability to maintain a higher level of integrity in
preserving the desired value.
However, the integrity of the analog method has also been improved in
recent years, so these days it is practically impossible to detect a
visual difference between the transmission methods for many displays,
especially if operating in the native pixel format.

Hence - ya gotta see it to decide!! In most cases today it is not easy
to justify exchanging cards *only* to gain a "digital channel". Of course
all the companies making new hardware really WANT you to make the
change..... OTOH, if changing cards provides desirable new features,
you have your justification and satisfaction.

My $0.02
NGA
Anonymous
July 11, 2005 6:28:06 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Yousuf Khan <bbbl67@ezrs.com> :
>A friend has got a new LCD monitor and he wants to know if connecting
>via a DVI connector would improve the quality over the VGA connectors? I
>would suspect it's imperceptible. He doesn't have a video card with a
>DVI connector yet, but he wants to get one if there's any difference.
>
> Yousuf Khan

In my experience it is imperceptible. I recently ordered 2 identical 19"
LCDs (some cheap brand, ViewEra, I think) and hooked them up to a GeForce
6800 that had one analog and one DVI output. No difference. Changed the
connections on the LCDs and still no difference. I was expecting to see
some difference because this PC is in an area with a lot of stray RF.
All things being equal I would use the DVI connection, but only if it
doesn't cost you any extra money.
Anonymous
July 11, 2005 9:11:12 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Not Gimpy Anymore writes:

> Historically there have been issues with integrity of passing the values
> along the signal chain in analog fashion, so a lot of thought was put into
> a way to maintain the numeric value integrity until the final conversion
> is made. That resulted in "digital" interfaces, which in fact did
> demonstrate the ability to maintain a higher level of integrity in
> preserving the desired value.

Then one must wonder why the original video standards such as CGA were
digital, but were replaced by more "advanced" standards that were
analog, such as VGA.

Digital is less flexible than analog--conceivably a VGA cable can
carry just about any resolution or color depth. Digital provides
fewer errors in exchange for reduced bandwidth and flexibility.

> Hence - ya gotta see it to decide!! In most cases for today it is not
> easy to justify exchanging cards *only* to gain a "digital channel".

In my case both the card and the monitor support digital, but I can't
get it to work, so I can't really compare.
Anonymous
July 11, 2005 9:11:13 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@gmail.com> wrote in message
news:3mo3d154hr02rddu54tmlb9gi71adalhld@4ax.com...
> Not Gimpy Anymore writes:
>
> > Historically there have been issues with integrity of passing the values
> > along the signal chain in analog fashion, so a lot of thought was put into
> > a way to maintain the numeric value integrity until the final conversion
> > is made. That resulted in "digital" interfaces, which in fact did
> > demonstrate the ability to maintain a higher level of integrity in
> > preserving the desired value.
>
> Then one must wonder why the original video standards such as CGA were
> digital, but were replaced by more "advanced" standards that were
> analog, such as VGA.

Simplicity. The graphics systems used in the early days
of the PC had only a very limited "bit depth" (number of
shades per color), and the easiest interface to implement for
such a simple system was either one or two bits each, directly
from the graphics output to the input stage of the CRT - i.e.,
just run the bits out through a TTL buffer and be done with
it. With a greater number of bits/color, a D/A converter at the
output and transmitting the video information in "analog" form
(which is what the CRT's going to want at the cathode, anyway)
becomes a more cost-effective solution than trying to deliver the
information as a parallel-digital signal (which is basically what the
"CGA" style of interface was); instead you ship it over three
coaxes. If it were simply a question of which method were more
"advanced," it would also be legitimate to ask why television
is now moving from "analog" to "digital" transmission, and
computer interfaces are beginning to do the same.

The answer to just about ANY question in engineering which
begins with "Why did they do THIS...?" is generally "because
it was the most cost-effective means of achieving the desired
level of performance." The first computer graphics system I
myself was responsible for used what I supposed would be
called a "digital" output in this discussion, for this very reason.
It was the first "high-resolution" display system in our product
line, way, way back around 1982 - all of 1024 x 768 pixels
at 60 Hz, and we didn't have the option of a D/A to make
"analog" video for us simply because we couldn't put enough
memory into the thing for more than 2 bits per pixel. So we used
a couple of open-collector outputs at the computer end of the
cable, and used the signal coming from those to switch a few
resistors in the monitor (which was also custom) and get a cheap
"D/A" effect - at all of four levels of gray! (Monochrome -
we weren't working on the color version, which would follow
a bit later.)
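
For the curious, a toy Python model of that sort of resistor "D/A": two
open-collector bits each switch a resistor from the supply onto a node
loaded to ground. The part values are invented for illustration, not taken
from the actual product:

V = 5.0
R_msb, R_lsb, R_load = 1000.0, 2200.0, 470.0

def level(msb, lsb):
    g = (msb / R_msb) + (lsb / R_lsb)      # conductance of the active bits
    return V * g / (g + 1.0 / R_load)      # simple divider (Millman)

for code in range(4):
    print(f"code {code:02b} -> {level(code >> 1, code & 1):.2f} V at the node")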


> Digital is less flexible than analog--conceivably a VGA cable can
> carry just about any resolution or color depth. Digital provides
> fewer errors in exchange for reduced bandwidth and flexibility.

This is incorrect. For one thing, it assumes that simple binary
encoding is the only thing that could possibly be considered under
the heading "digital" (which is wrong, and in fact several examples
of other fully-digital systems exist even in the area of video
interfaces; for instance, the 8-VSB or COFDM encodings
used in broadcast HDTV). The other point where the above
statement is incorrect is the notion that the VGA analog interface
could carry "just about any resolution or color depth." The
fundamental limitations of the VGA specification, including
an unavoidable noise floor (given the 75 ohm system impedance
requirement) and the overall bandwidth constrain the data
capacity of the interface, as it must for ANY practical interface
definition. Over short distances, and at lower video frequencies,
the VGA system is undoubtedly good for better than 8 bits
per color; it is very unlikely in any case that it could exceed,
say, an effective 12 bits/color or so; I haven't run the math to
figure out just what the limit is, though, so I wouldn't want anyone
to consider that the final word.

But the bottom line is that the information capacity of any real-world
channel is limited, in terms of effective bits per second. (Note
that stating this limit in "bits per second" actually says nothing
about whether the channel in question is carrying "analog" or
"digital" transmissions; this is bit/sec. in the information theory
usage of the term.) In practice, this limit (called the Shannon
limit) is generally more readily achieved in "digital" systems
than "analog," due to the simple fact that most analog systems
give more margin to the MSBs than the LSBs of the data.
Whatever data capacity is achieved may be used either for
greater bits/component (bits/symbol, in the generic case) or
more pixels/second (symbols/second, generically), but the limit
still remains.

The actual difference between "analog" and
"digital" systems here is not one of "errors" vs."bandwidth,"
but rather where those errors occur; as noted, the typical
analog system preserves the most-significant-bit data vs.
least-significant (e.g., you can still make out the picture, even
when the noise level makes it pretty "snowy") - or in other words,
analog "degrades gracefully." Most simple digital encodings
leave all bits equally vulnerable to noise, which makes for a
"cliff effect" - digital transmissions tend to be "perfect" up to a
given noise level, at which point everything is lost at once.
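
A rough Monte-Carlo sketch of that "graceful degradation vs. cliff"
contrast, in Python. It deliberately ignores the extra channel capacity the
bit-serial "digital" path would need, which is exactly the trade-off
discussed above:

import numpy as np

rng = np.random.default_rng(1)
samples = rng.integers(0, 256, 20000)            # 8-bit source data

def analog_err(sigma):
    # Send each sample as one level in [0,1); add noise; read it back.
    v = samples / 256.0 + rng.normal(0, sigma, samples.size)
    back = np.clip(np.round(v * 256.0), 0, 255)
    return np.mean(np.abs(back - samples))       # mean error in LSBs

def digital_err(sigma):
    # Send the same samples as 8 binary symbols (0/1 levels) per sample.
    bits = (samples[:, None] >> np.arange(8)) & 1
    rx = (bits + rng.normal(0, sigma, bits.shape)) > 0.5   # hard slicer
    back = (rx.astype(int) * (1 << np.arange(8))).sum(axis=1)
    return np.mean(np.abs(back - samples))

for sigma in (0.02, 0.1, 0.3, 0.5):
    print(f"noise {sigma:.2f}: analog err {analog_err(sigma):6.1f} LSB, "
          f"digital err {digital_err(sigma):6.1f} LSB")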

Bob M.
Anonymous
July 11, 2005 6:20:10 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Bob Myers" <nospamplease@address.invalid> wrote in message
news:ufmAe.8232$rr7.8001@news.cpqcorp.net...
>
> "Mxsmanic" <mxsmanic@gmail.com> wrote in message
> news:3mo3d154hr02rddu54tmlb9gi71adalhld@4ax.com...
>> Not Gimpy Anymore writes:
>>
>> > Historically there have been issues with integrity of passing the values
>> > along the signal chain in analog fashion, so a lot of thought was put into
>> > a way to maintain the numeric value integrity until the final conversion
>> > is made. That resulted in "digital" interfaces, which in fact did
>> > demonstrate the ability to maintain a higher level of integrity in
>> > preserving the desired value.
>>
>> Then one must wonder why the original video standards such as CGA were
>> digital, but were replaced by more "advanced" standards that were
>> analog, such as VGA.
>
> Simplicity. The graphics systems used in the early days
> of the PC had only a very limited "bit depth" (number of
> shades per color), and the easiest interface to implement for
> such a simple system was either one or two bits each, directly
> from the graphics output to the input stage of the CRT - i.e.,
> just run the bits out through a TTL buffer and be done with
> it. With a greater number of bits/color, a D/A converter at the
> output and transmitting the video information in "analog" form
> (which is what the CRT's going to want at the cathode, anyway)
> becomes a more cost-effective solution than trying to deliver the
> information as a parallel-digital signal (which is basically what the
> "CGA" style of interface was) and instead ship it up on three
> coaxes. If it were simply a question of which method were more
> "advanced," it would also be legitimate to ask why television
> is now moving from "analog" to "digital" transmission, and
> computer interfaces are beginning to do the same.
>
> The answer to just about ANY question in engineering which
> begins with "Why did they do THIS...?" is generally "because
(snip)

The remaining issue with "VGA" (the analog signal path of
"choice" today) is the source and termination impedance
variations due to normal production tolerances. This can result
in a "unbalanced" white point, and colorimetric inaccuracies just
due to those tolerances. However, there are a host of other
contributors to the colorimetric accuracy, as Bob and others
are well aware - in the case of the DVI method, the idea was
to try and maintain the numerically encoded pixel value across
the transmission medium. Additionally, the DVI includes a more
reliable way to recover the pixel clock, allowing closer to perfect
establishment of the proper numeric value into each "bucket".

>
>
>> Digital is less flexible than analog--conceivably a VGA cable can
>> carry just about any resolution or color depth. Digital provides
>> fewer errors in exchange for reduced bandwidth and flexibility.
>
> This is incorrect. For one thing, it assumes that simple binary
> encoding is the only thing that could possibly be considered under
> the heading "digital" (which is wrong, and in fact several examples
> of other fully-digital systems exist even in the area of video
> interfaces; for instance, the 8-VSB or COFDM encodings
> used in broadcast HDTV).
(snip)

To add another point to Bob's, the DVI signal *is* further
encoded (TMDS), which does include a probability that the
encoded value may have some (acceptable) error. Without
TMDS, there is still not enough bandwidth available today to
pass the entire bitmap of values within the refresh time. But
any error related to encoding is not perceptible to the typical
user - it's just a theoretical point to be aware of.
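
Some back-of-the-envelope numbers for that bandwidth point, assuming the
usual VESA timing totals for 1280x1024 at 60 Hz and the 10-bit TMDS
character sent per 8-bit colour component on each of single-link DVI's
three data channels:

h_total, v_total, refresh = 1688, 1066, 60
pixel_clock = h_total * v_total * refresh
print(f"pixel clock       ~{pixel_clock / 1e6:.0f} MHz "
      "(single-link DVI tops out at a 165 MHz pixel clock)")
print(f"per TMDS channel  ~{pixel_clock * 10 / 1e9:.2f} Gbit/s, x3 channels")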

Also, because the analog signal clock recovery method uses
phase locked loop technology, there does remain a possibility
of PLL types of errors within displays using a VGA interface,
separate from any scaling issue related to displaying images
in "non-native" formats.
To best see such artifacts, try displaying a half tone type
of image, and look for background noise in the image. Some
display companies may provide the user with a "calibration
image" in the CD version of the user documentation. This
image will contain some halftone details, and the user should
be directed to activate the "autoadjust" (AKA auto) control
which should put the PLL through its optimization process.
This portion of the "analog chain" technology is continuing
to improve, and we users can benefit from that improvement
when we use some of the more recent products available.
That aspect of performance is one that may be compromised
in "lower end" LCD monitors. For many users it is not
important, but discriminating users should be aware of the
possibility.
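
If you want to make such a test image yourself, a short Python script that
writes a one-pixel checkerboard (about the hardest pattern for the
monitor's clock/phase recovery) as a plain PGM file will do; the 1280x1024
size is only an assumption, so change it to your panel's native resolution:

W, H = 1280, 1024

rows = []
for y in range(H):
    rows.append(bytes(255 if (x + y) % 2 else 0 for x in range(W)))

with open("checker.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (W, H))
    f.writelines(rows)
print("wrote checker.pgm -- view it full screen and run the auto-adjust")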

Regards,
NGA
Anonymous
July 11, 2005 11:19:26 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> This is incorrect. For one thing, it assumes that simple binary
> encoding is the only thing that could possibly be considered under
> the heading "digital" (which is wrong, and in fact several examples
> of other fully-digital systems exist even in the area of video
> interfaces; for instance, the 8-VSB or COFDM encodings
> used in broadcast HDTV).

No. Digital is nothing more than analog with an arbitrary, non-zero
threshold separating "signal" from "noise." Digital systems always
have less bandwidth than analog systems operating over the same
physical channels, because they set a non-zero threshold for the noise
floor. This allows digital systems to guarantee errors below a
certain level under certain conditions, but it sacrifices bandwidth to
do so. It all comes out analog (or digital) in the end, depending
only on how you look at it.

> But the bottom line is that the information capacity of any real-world
> channel is limited, in terms of effective bits per second.

And all digital systems set their design capacities _below_ that
theoretical limit, whereas analog systems are constrained by precisely
that limit. If you set the noise threshold above zero, you reduce
bandwidth and you make it possible to hold errors below a threshold
that is a function of the noise threshold; your system is then
digital. If you set the noise threshold to zero, you have an analog
system, limited only by the absolute bandwidth of the channel but
without any guarantees concerning error levels.

> The actual difference between "analog" and
> "digital" systems here is not one of "errors" vs."bandwidth,"
> but rather where those errors occur; as noted, the typical
> analog system preserves the most-significant-bit data vs.
> least-significant (e.g., you can still make out the picture, even
> when the noise level makes it pretty "snowy") - or in other words,
> analog "degrades gracefully." Most simple digital encodings
> leave all bits equally vulnerable to noise, which makes for a
> "cliff effect" - digital transmissions tend to be "perfect" up to a
> given noise level, at which point everything is lost at once.

Same as above.
Anonymous
July 12, 2005 1:53:34 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Mxsmanic wrote:

>chrisv writes:
>
>> Not true.
>
>It is both true and inevitable.

Nope.

>It is widely unknown, however. Most
>people have no clue about what "digital" actually means.

I do.
Anonymous
July 13, 2005 5:35:43 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"" wrote:
> Mxsmanic wrote:
>
> >chrisv writes:
> >
> >> Not true.
> >
> >It is both true and inevitable.
>
> Nope.
>
> >It is widely unknown, however. Most
> >people have no clue about what "digital" actually means.
>
> I do.

For all of you arguing about analog and digital:

Analog != infinitely many different values. Know that when we talk about
energy, we have a "digital reality". According to quantum physics, there is
a smallest amount of energy, and all other amounts of energy are that
smallest amount times a whole number [0, 1, 2, 3, ..., 1245, 1246, ...].
Everything we interact with is energy, even mass (E=mc2). Light,
temperature, sound and all other things are based on different amounts of
energy.

So in fact, the "real world" isn't all analog, it is mostly digital! So all
interfaces a person has with the world around him are digital, which means
that "the perfect analog system" would be, in fact, digital. The only truly
analog thing that comes to my mind is distance, but even distance is
measured digitally. I can't think of any analog interface between a person
and the world.

Infinite precision is not possible for any interface, but in truth
everything based on energy has a value so precise, compared to macro-level
energies, that it can be considered "infinitely precise". But this amount
is quantized, and so it is best put in digital form for processing.
Example: although we can't have infinite precision about their masses, if
we take two pieces of iron, it is possible that the bigger one is exactly
three times heavier than the other, with infinite precision! But according
to Heisenberg's uncertainty principle it can't be measured, even in theory.
Given that, I must say that even if the interfaces are digital, the
uncertainty of measurements makes it impossible to make a perfect system,
analog or digital.

Of course this theory will never be put into practice. So I would say that
DVI can be better, but in most cases the difference is too small to be
noticeable; in my work I see a lot of LCDs driven over VGA, and most of the
time there are no errors visible to the naked eye. A couple of times we
have had an LCD monitor that looked bad with VGA (in the native
resolution), and there the DVI interface has helped. My own favourite is
still digital; though some analog systems will provide better resolution,
the digital systems will have fewer errors and, with technological
advances, sufficient resolution.

And sorry for my bad English.

Anonymous
July 14, 2005 12:41:58 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"syka" <UseLinkToEmail@HardwareForumz.com> wrote in message
news:7_298172_357a46dae081bdf4cbc0c158f0243310@hardwareforumz.com...
> Analog != infinite different values. Know that when we talk about
> energy, we have a "digital reality". Accoring to quantum physics,
> there is a
> smallest amount of energy, and all other amounts of energy are the
> smallest amount times [0,1,2,3,4,5,6,7,8,9,10,...,1245,1246,...] any
> whole number. Every thing we interact with is energy, even mass
> (E=mc2). Light, temperature, sound and all other things base on
> different amounts of energy.

Not quite. While I agree with you that analog does not by
any means imply "infinite different values" (or infinite capacity,
accuracy, or any other such nonsense), it is not the case that
we have a "digital reality." Such a statement makes the very
common error of confusing "digital" with "quantized"; while it
is true that almost all practical digital systems have fixed
limits imposed by quantization, the two words do not mean
quite the same thing. It is certainly possible to have a quantized
analog representation; it is possible in theory (although certainly
far less commonly encountered) to have a "digital" system in
which the quantization limit is not fixed by the system itself
(for instance, simply by allowing for a variable-bit-length
representation in which the maximum possible bit length is
greatly in excess of the accuracy of the information which can
even POSSIBLY be provided).

The "real world" is neither "digital" nor "analog"- it is simply
the real world, and represents the source of information that
these two encoding methods attempt to capture and convey.
The key to what "digital" and "analog" really mean is right there
in the words themselves. A "digital" representation is simply
any system where information is conveyed as numeric values
(symbols which are to be interpreted as numbers, rather than
having some quality which directly corresponds to the "level"
of the information being transmitted), whereas an "analog"
representation is just that. It is any system in which one
value is made to directly represent another, through varying
in an "analogous" fashion (e.g., this voltage is varying in a
manner very similar to the way that air pressure varied, hence
this is an "analog" transmission of sound). In the extreme of this
perspective, we might go so far as to say that there is truly no
such thing as "analog" or "digital" electronics, per se - there are
just electrical signals, which all obey the same laws of physics.
It is how we choose (or are intended) to interpret these signals
that classifies them as "analog" or "digital." (Power systems,
for instance, are in fact neither, as they are not in the normal
sense "carrying information.")

Our friend Mxsmanic is also in error regarding the capacity
limits of analog systems vs. digital; it is practically NEVER the
case that an analog system is actually carrying information at
anything even approaching the Shannon limit (in part due to the
extreme information redundancy of most analog transmission systems).
A perfect example of this is the case of "analog" vs."digital" television.
The Shannon limit of a standard 6 MHz TV channel at a 10 dB
signal-to-noise ratio is a little over 20 Mbits/sec, which is a
good deal greater than the actual information content of a
standard analog broadcast signal. The U.S. HDTV standard,
on the other hand, actually runs close to this limit (the standard
bit rate is in excess of 19 Mbit/sec), and through compression
techniques available only in a digital system manages to convey
an image that in analog form would require a channel of much
higher bandwidth (although these compression methods, since
some of the steps are lossy, should not really be seen as
somehow beating the Shannon limit).
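
That figure is easy to check:

import math

B = 6e6                      # standard 6 MHz TV channel
snr = 10 ** (10 / 10)        # the 10 dB signal-to-noise ratio used above
print(f"Shannon limit: {B * math.log2(1 + snr) / 1e6:.1f} Mbit/s")
# ~20.8 Mbit/s, vs. the "in excess of 19 Mbit/sec" ATSC payload noted above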

Bob M.
Anonymous
July 14, 2005 12:41:59 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers wrote:

>
> "syka" <UseLinkToEmail@HardwareForumz.com> wrote in message
> news:7_298172_357a46dae081bdf4cbc0c158f0243310@hardwareforumz.com...
>> Analog != infinite different values. Know that when we talk about
>> energy, we have a "digital reality". Accoring to quantum physics,
>> there is a
>> smallest amount of energy, and all other amounts of energy are the
>> smallest amount times [0,1,2,3,4,5,6,7,8,9,10,...,1245,1246,...] any
>> whole number. Every thing we interact with is energy, even mass
>> (E=mc2). Light, temperature, sound and all other things base on
>> different amounts of energy.
>
> Not quite. While I agree with you that analog does not by
> any means imply "infinite different values" (or infinite capacity,
> accuracy, or any other such nonsense), it is not the case that
> we have a "digital reality." Such a statement makes the very
> common error of confusing "digital" with "quantized"; while it
> is true that almost all practical digital systems have fixed
> limits imposed by quantization, the two words do not mean
> quite the same thing. It is certainly possible to have a quantized
> analog representation; it is possible in theory (although certainly
> far less commonly encountered) to have a "digital" system in
> which the quantization limit is not fixed by the system itself
> (for instance, simply by allowing for a variable-bit-length
> representation in which the maximum possible bit length is
> greatly in excess of the accuracy of the information which can
> even POSSIBLY be provided).
>
> The "real world" is neither "digital" nor "analog"- it is simply
> the real world, and represents the source of information that
> these two encoding methods attempts to capture and convey.
> The key to what "digital" and "analog" really mean are right there
> in the words themselves. A "digital" representation is simply
> any system where information is conveyed as numeric values
> (symbols which are to be interpreted as numbers, rather than
> having some quality which directly corresponds to the "level"
> of the information being transmitted), whereas an "analog"
> representation is just that. It is any system in which one
> value is made to directly represent another, through varying
> in an "analogous" fashion (e.g., this voltage is varying in a
> manner very similar to the way that air pressure varied, hence
> this is an "analog" transmission of sound). In the extreme of this
> perspective, we might go so far as to say that there is truly no
> such thing as "analog" or "digital" electronics, per se - there are
> just electrical signals, which all obey the same laws of physics.
> It is how we choose (or are intended) to interpret these signals
> that classifies them as "analog" or "digital." (Power systems,
> for instance, are in fact neither, as they are not in the normal
> sense "carrying information.")
>
> Our friend Mxsmanic is also in error regarding the capacity
> limits of analog systems vs. digital; it is practically NEVER the
> case that an analog system is actually carrying information at
> anything even approaching the Shannon limit (in part due to the
> extreme information redundancy of most analog transmission systems).
> A perfect example of this is the case of "analog" vs."digital" television.
> The Shannon limit of a standard 6 MHz TV channel at a 10 dB
> signal-to-noise ratio is a little over 20 Mbits/sec, which is a
> good deal greater than the actual information content of a
> standard analog broadcast signal. The U.S. HDTV standard,
> on the other hand, actually runs close to this limit (the standard
> bit rate is in excess of 19 Mbit/sec), and through compression
> techniques available only in a digital system manages to convey
> an image that in analog form would require a channel of much
> higher bandwidth (although these compression methods, since
> some of the steps are lossy, should not really be seen as
> somehow beating the Shannon limit).

In practice this sometimes shows--digital TV has no redundancy to speak
of--it either has a perfect image or it has dead air; there is no gradual
degradation like there is with analog.

> Bob M.

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
Anonymous
July 14, 2005 1:37:27 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> Our friend Mxsmanic is also in error regarding the capacity
> limits of analog systems vs. digital; it is practically NEVER the
> case that an analog system is actually carrying information at
> anything even approaching the Shannon limit (in part due to the
> extreme information redundancy of most analog transmission systems).

True, but the key difference is that analog systems are theoretically
_capable_ of doing this (and can sometimes approach it quite closely,
with proper design), whereas digital systems, by their very nature,
sacrifice some of this theoretical capacity in exchange for holding
errors below a predetermined threshold.

> A perfect example of this is the case of "analog" vs."digital" television.
> The Shannon limit of a standard 6 MHz TV channel at a 10 dB
> signal-to-noise ratio is a little over 20 Mbits/sec, which is a
> good deal greater than the actual information content of a
> standard analog broadcast signal. The U.S. HDTV standard,
> on the other hand, actually runs close to this limit (the standard
> bit rate is in excess of 19 Mbit/sec), and through compression
> techniques available only in a digital system manages to convey
> an image that in analog form would require a channel of much
> higher bandwidth (although these compression methods, since
> some of the steps are lossy, should not really be seen as
> somehow beating the Shannon limit).

The HDTV standard isn't doing anything that analog can't do. It is,
after all, using analog methods to transmit its "digital" information.
The only advantage of digital is that the error rate can be carefully
controlled, whereas in analog, there is no "error rate"--everything is
signal, even when it's not.
Anonymous
July 14, 2005 1:40:11 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

J. Clarke writes:

> In practice this sometimes shows--digital TV has no redundancy to speak
> of--it either has a perfect image or it has dead air, there is no gradual
> degradation like there is with analog.

True for all digital systems. Analog systems always have some sort of
error, and this error increases gradually and gracefully as noise
increases. Digital systems draw an artificial line below which all is
noise and above which all is signal. As long as noise actually
remains below this line in the channel, digital transmission is
error-free. But if it rises above the line, there is a _sudden_ (and
often catastrophic) appearance of serious, uncorrectable errors in the
channel.

The whole idea of digital is to draw the line at the right place, so
that you always have error-free transmission. You sacrifice the bit
of channel capacity below the line in order to get error-free
transmission at a slightly slower rate than analog might provide.
Anonymous
July 14, 2005 5:20:59 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@gmail.com> wrote in message
news:hc5cd1tjf76v2ig7195pbrjfjbdsmp94ph@4ax.com...
> The HDTV standard isn't doing anything that analog can't do.

Actually, it's doing quite a bit that analog can't do (or more precisely,
it is easily doing things that would be extremely difficult if not practically
impossible to do in analog form). Chief among these is permitting
a rather high degree of data compression while still keeping all
components of the signal completely independent and separable.
This points out one of the true advantages of a digital representation
over analog - it puts the information into a form that is easily
manipulated, mathematically, without necessarily introducing error
and loss into that manipulation. The 1920 x 1080, 60 Hz, interlaced-
scan format (which is the highest currently in use in the U.S. system)
would basically be impossible to transmit in an analog system. (The
best analog TV system ever put into use, Japan's MUSE (for MUltiple
Sub-Nyquist Sampling Encoding) didn't achieve the same level of quality, and
sacrificed quite a lot to squeeze a signal of somewhat lower resolution
into quite a bit more bandwidth. As noted earlier, this isn't a violation
of the Shannon limit, but (as is the case with just about all compression
methods) trading off redundancy for capacity).

Your comments regarding "analog" systems having an inherent
capability to more readily approach the Shannon limit than "digital"
again are based on the conventional, common examples of specific
implementations of the two (i.e., a straight equal-weight binary
representation for "digital," and the sort of simple examples of "analog"
that we're all used to). But this isn't really an inherent limitation of
digital in general. Digital representations can be (and have been)
designed which do not place equal noise margin on all bits, etc.,
and provide a more "graceful" degradation in the presence of noise.
Shannon applies to ALL forms of information encoding, and none
have a particular advantage in theory in coming closest to this limit.


> after all, using analog methods to transmit its "digital" information.

Again, a confusion of "analog" and "digital" with other terms which
are closely associated but not identical (e.g., "continuous" vs.
"quantized," "linear," "sampled," and so forth. The methods used to
transmit HDTV are not "analog," they simply aren't the easy-to-grasp
examples of "digital" that we all see in classes or books on the basics.

Bob M.
Anonymous
July 14, 2005 5:21:00 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers wrote:

>
> "Mxsmanic" <mxsmanic@gmail.com> wrote in message
> news:hc5cd1tjf76v2ig7195pbrjfjbdsmp94ph@4ax.com...
>> The HDTV standard isn't doing anything that analog can't do.
>
> Actually, it's doing quite a bit that analog can't do (or more precisely,
> it is easily doing things that would extremely difficult if not
> practically
> impossible to do in analog form). Chief among these is permitting
> a rather high degree of data compression while still keeping all
> components of the signal completely independent and separable.
> This points out one of the true advantages of a digital representation
> over analog - it puts the information into a form that is easily
> manipulated, mathematically, without necessarily introducing error
> and loss into that manipulation. The 1920 x 1080, 60 Hz, interlaced-
> scan format (which is the highest currently in use in the U.S. system)
> would basically be impossible to transmit in an analog system. (The
> best analog TV system ever put into use, Japan's MUSE (for MUltiple
> Sub-Nyquist Encoding) didn't achieve the same level of quality, and
> sacrificied quite a lot to squeeve a signal of somewhat lower resolution
> into quite a bit more bandwidth. As noted earlier, this isn't a violation
> of the Shannon limit, but (as is the case with just about all compression
> methods) trading off redundancy for capacity).
>
> Your comments regarding "analog" systems having an inherent
> capability to more readily approach the Shannon limit than "digital"
> again are based on the conventional, common examples of specific
> implementations of the two (i.e., a straight equal-weight binary
> representation for "digital," and the sort of simple examples of "analog"
> that we're all used to). But this isn't really an inherent limitation of
> digital in general. Digital representations can be (and have been)
> designed which do not place equal noise margin on all bits, etc.,
> and provide a more "graceful" degradation in the presence of noise.
> Shannon applies to ALL forms of information encoding, and none
> have a particular advantage in theory in coming closest to this limit.
>
>
>> after all, using analog methods to transmit its "digital" information.
>
> Again, a confusion of "analog" and "digital" with other terms which
> are closely associated but not identical (e.g., "continuous" vs.
> "quantized," "linear," "sampled," and so forth. The methods used to
> transmit HDTV are not "analog," they simply aren't the easy-to-grasp
> examples of "digital" that we all see in classes or books on the basics.

In a sense. It's a digital signal transmitted by modulating an analog
carrier. The HD signal can be extracted from the output stream of a BT848,
which is designed to decode analog SD TV.

I don't expect the average non-engineer to grasp this level of subtlety
though.
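
For the curious, the "digital data on an analog waveform" idea can be
sketched in a few lines. This is only a toy model of the 8-VSB scheme ATSC
uses (three bits per symbol mapped onto eight carrier amplitude levels);
the carrier frequency and sample rate below are arbitrary, and the real
system adds a pilot tone, Reed-Solomon and trellis coding, and
vestigial-sideband filtering, none of which appears here:

# Toy sketch of the idea behind 8-VSB: the payload is digital (3 bits per
# symbol -> one of 8 amplitude levels), but what travels through the air is
# an analog waveform -- a carrier whose amplitude takes those levels.
# Real ATSC adds a pilot, Reed-Solomon/trellis coding and VSB filtering,
# none of which is modelled here.

import numpy as np

rng = np.random.default_rng(0)

bits = rng.integers(0, 2, size=3 * 12)                 # 12 symbols' worth of bits
symbols = bits.reshape(-1, 3) @ np.array([4, 2, 1])    # 3 bits -> value 0..7
levels = 2 * symbols - 7                               # map to -7, -5, ..., +5, +7

samples_per_symbol = 16
fc_cycles_per_symbol = 4                               # arbitrary carrier frequency
t = np.arange(len(levels) * samples_per_symbol) / samples_per_symbol
baseband = np.repeat(levels, samples_per_symbol)       # staircase of symbol levels
waveform = baseband * np.cos(2 * np.pi * fc_cycles_per_symbol * t)

# The receiver's job is the reverse: recover the 8 levels from the analog
# waveform, then turn them back into bits.
print("symbols sent:", symbols.tolist())
print("first few waveform samples:", np.round(waveform[:8], 2).tolist())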

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
Anonymous
a b U Graphics card
July 14, 2005 11:23:29 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"J. Clarke" <jclarke.usenet@snet.net.invalid> wrote in message
news:D b65n312tdb@news1.newsguy.com...
> In a sense. It's a digital signal transmitted by modulating an analog
> carrier. The HD signal can be extracted from the output stream of a
BT848,
> which is designed to decode analog SD TV.
>
> I don't expect the average non-engineer to grasp this level of subtlety
> though.

Yes, we've definitely passed the point of general interest here, so I'm
inclined to close out my participation in the thread at this point.

But just to follow up on the above - in the sense I meant, i.e., the
specific definitions of "analog" and "digital" covered earlier, I would
say that there's no such thing as an "analog carrier." The carrier
itself, prior to modulation, carries no information (nor is the
information being impressed upon it in an "analog" form), so in what
sense would we call it "analog" to begin with?

Bob M.
Anonymous
a b U Graphics card
July 15, 2005 1:34:21 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers wrote:

>
> "J. Clarke" <jclarke.usenet@snet.net.invalid> wrote in message
> news:D b65n312tdb@news1.newsguy.com...
>> In a sense. It's a digital signal transmitted by modulating an analog
>> carrier. The HD signal can be extracted from the output stream of a
> BT848,
>> which is designed to decode analog SD TV.
>>
>> I don't expect the average non-engineer to grasp this level of subtlety
>> though.
>
> Yes, we've definitely passed the point of general interest here, so I'm
> inclined to close out my participation in the thread at this point.
>
> But just to follow up on the above - in the sense I meant, meaning the
> specific definitions of "analog" and "digital" covered earlier, I would
> say that there's no such thing as an "analog carrier"- the carrier
> itself, prior to modulation, carries no information (nor is the
> information being impressed upon it in an "analog" form), so in what sense
> would we call it "analog" to begin with?

In the sense that a chip which is not designed to deal with digital signals
can nonetheless extract the signal.

> Bob M.

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
Anonymous
a b U Graphics card
August 29, 2005 8:39:09 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"" wrote:
> A friend has got a new LCD monitor and he wants to know if connecting
> via a DVI connector would improve the quality over the VGA connectors?
> I would suspect it's imperceptible. He doesn't have a video card with
> a DVI connector yet, but he wants to get one if there's any difference.
>
> Yousuf Khan

I’m following threads on LCD monitors because I’m in the market. It
seems some opinion is that DVI is not a step forward.
The way I see it, even if DVI is no advantage, the better monitors
will have the option regardless?
Also what I’m wondering about is how much the video card contributes
towards DVI performance? Are some cards simply holding back the
benefits of the DVI interface?

Anonymous
a b U Graphics card
August 30, 2005 12:23:14 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Captin wrote:
> I'm following threads on LCD monitors because I'm in the market. It
> seems some opinion is that DVI is not a step forward.
> The way I see it, even if DVI is no advantage, the better monitors
> will have the option regardless?
> Also what I'm wondering about is how much the video card contributes
> towards DVI performance? Are some cards simply holding back the
> benefits of the DVI interface?

I doubt it; if anything, DVI should be the great equalizer of video
cards (at least as far as picture quality goes, not performance-wise).
DVI bypasses a video card's own RAMDACs entirely: the pixel data goes to
the LCD monitor in digital form, so the quality of the card's analog
output stage no longer matters. No matter whether you have an expensive
video card or a cheap one, the monitor receives the same digital values.
In the past, the RAMDACs inside some expensive video cards were probably
slightly higher precision than those inside cheap cards; this gained the
expensive cards an advantage on an analog connection.

I guess what happens now is that the picture quality is going to be
dependent on the quality of the electronics inside the monitor instead
of inside the video card.
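
The conversion-loss argument made earlier in the thread is easy to
simulate. The sketch below pushes 8-bit pixel values through an idealized
DAC/cable/ADC chain with a little Gaussian noise added on the "cable",
then compares that with a bit-exact digital link; the noise levels are
invented purely for illustration and don't correspond to any particular
card or monitor:

# Minimal sketch of why the analog (VGA) path can lose information while a
# DVI link does not: send 8-bit pixel values through an ideal DAC, add a
# little noise on the "cable", redigitize with an ideal ADC, and count how
# many pixels come back changed.  All noise levels are made-up illustrative
# figures, not measurements of any real card or monitor.

import numpy as np

rng = np.random.default_rng(1)
pixels = rng.integers(0, 256, size=1_000_000)        # 8-bit pixel values

def vga_round_trip(values, noise_lsb):
    """DAC -> noisy analog cable -> ADC, everything else ideal."""
    analog = values.astype(float)                                # ideal DAC
    analog += rng.normal(0.0, noise_lsb, size=values.shape)      # cable noise
    return np.clip(np.round(analog), 0, 255).astype(int)         # ideal ADC

for noise in (0.2, 0.5, 1.0):
    recovered = vga_round_trip(pixels, noise)
    errors = np.count_nonzero(recovered != pixels)
    print(f"noise = {noise:.1f} LSB -> {errors/len(pixels):.1%} of pixels changed")

# A DVI (TMDS) link carries the same 8-bit values as bits; as long as the
# link error rate is negligible, the value that arrives is the value sent.
dvi = pixels.copy()
print("DVI pass-through errors:", np.count_nonzero(dvi != pixels))

Note that a pixel value shifting by one code out of 256 is usually
invisible, which is consistent with the reports in this thread that a good
VGA connection can look indistinguishable from DVI.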

Yousuf Khan
Anonymous
a b U Graphics card
August 30, 2005 2:24:59 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

On 7 Jul 2005 11:33:27 -0700, "Yousuf Khan" <yjkhan@gmail.com> wrote:


>Yeah, that's basically what I've been hearing from asking around
>recently. Several people that I've asked said they couldn't tell the
>difference between the DVI interface and the VGA one.

That's my experience so far. We compared analog to DVI on his LCD and
to the naked eye we couldn't tell the difference.
Anonymous
a b U Graphics card
August 30, 2005 4:39:10 AM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"" wrote:
> J. Clarke writes:
>
> > In practice this sometimes shows--digital TV has no redundancy to
> > speak of--it either has a perfect image or it has dead air, there is
> > no gradual degradation like there is with analog.
>
> True for all digital systems. Analog systems always have some sort of
> error, and this error increases gradually and gracefully as noise
> increases. Digital systems draw an artificial line below which all is
> noise and above which all is signal. As long as noise actually remains
> below this line in the channel, digital transmission is error-free.
> But if it rises above the line, there is a _sudden_ (and often
> catastrophic) appearance of serious, uncorrectable errors in the
> channel.
>
> The whole idea of digital is to draw the line at the right place, so
> that you always have error-free transmission. You sacrifice the bit of
> channel capacity below the line in order to get error-free
> transmission at a slightly slower rate than analog might provide.
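
The "line" described in the quoted passage is easy to see in a toy
simulation: binary levels of -1/+1, a hard decision threshold at zero,
and Gaussian noise of increasing strength. The signalling scheme and
noise figures here are arbitrary; they're only meant to show the shape of
the curve:

# Toy demonstration of the "line" described above: binary levels of -1/+1,
# a hard decision threshold at zero, and Gaussian noise of increasing
# strength.  Errors stay at essentially zero until the noise approaches the
# decision margin, then climb very quickly -- the digital "cliff".

import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
bits = rng.integers(0, 2, size=n)
tx = 2 * bits - 1                                # map 0/1 to -1/+1

for sigma in (0.1, 0.2, 0.3, 0.4, 0.5):
    rx = tx + rng.normal(0.0, sigma, size=n)     # noisy "channel"
    decided = (rx > 0).astype(int)               # the "artificial line" at zero
    ber = np.count_nonzero(decided != bits) / n
    print(f"noise sigma = {sigma:.1f} -> bit error rate {ber:.2e}")

The error rate sits at effectively zero until the noise approaches the
decision margin and then climbs by orders of magnitude over a small range -
the "cliff" that digital TV viewers see as a picture that is either perfect
or gone.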

I’m asking everyone here. Does the performance of DVI vary a great
deal from video card to video card?
I mean is it possible we have a situation where DVI offers a step
forward with some video cards and not so with others?

Anonymous
a b U Graphics card
August 30, 2005 12:09:10 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Captin wrote:

> "" wrote:
> > J. Clarke writes:
> >
> > > In practice this sometimes shows--digital TV has no redundancy to
> > > speak of--it either has a perfect image or it has dead air, there
> > > is no gradual degradation like there is with analog.
> >
> > True for all digital systems. Analog systems always have some sort of
> > error, and this error increases gradually and gracefully as noise
> > increases. Digital systems draw an artificial line below which all is
> > noise and above which all is signal. As long as noise actually
> > remains below this line in the channel, digital transmission is
> > error-free. But if it rises above the line, there is a _sudden_ (and
> > often catastrophic) appearance of serious, uncorrectable errors in
> > the channel.
> >
> > The whole idea of digital is to draw the line at the right place, so
> > that you always have error-free transmission. You sacrifice the bit
> > of channel capacity below the line in order to get error-free
> > transmission at a slightly slower rate than analog might provide.
>
> I’m asking everyone here. Does the performance of DVI vary a great
> deal from video card to video card?
> I mean is it possible we have a situation where DVI offers a step
> forward with some video cards and not so with others?

It's not a simple question.

First, the result depends on the monitor. In general, with LCD displays,
using the same board, same monitor, same cable, same everything, DVI will
yield an image that is anywhere from imperceptibly to greatly superior to
that yielded by analog. With CRT displays, in general DVI yields inferior
results to analog simply because the DAC used in CRTs with DVI is usually
not of very high quality.

Then it depends on the resolution--the resolution limits for DVI are lower
than for analog. If the monitor and video board can support a higher
resolution with analog than DVI allows, then in general analog can give a
better image by using that high resolution.
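
To put numbers on that resolution limit: a single-link DVI connection tops
out at a 165 MHz pixel clock, and whether a given mode fits depends on how
much blanking the timing adds on top of the visible pixels. The blanking
overheads in the sketch below are rough assumptions (traditional timings
add roughly a quarter to a third again; "reduced blanking" modes add much
less), so treat the results as illustrative rather than exact:

# Rough check of which modes fit under single-link DVI's 165 MHz TMDS pixel
# clock.  The blanking overheads below are approximations; real modelines
# vary, so treat the numbers as illustrative.

MODES = [
    # (name, width, height, refresh Hz, assumed blanking overhead)
    ("1280x1024 @ 60", 1280, 1024, 60, 0.30),
    ("1600x1200 @ 60", 1600, 1200, 60, 0.30),
    ("1920x1200 @ 60", 1920, 1200, 60, 0.30),
    ("1920x1200 @ 60 (reduced blanking)", 1920, 1200, 60, 0.10),
]

SINGLE_LINK_MHZ = 165.0   # single-link DVI TMDS clock limit

for name, w, h, hz, overhead in MODES:
    pixel_clock_mhz = w * h * hz * (1.0 + overhead) / 1e6
    verdict = "fits" if pixel_clock_mhz <= SINGLE_LINK_MHZ else "needs dual link (or analog)"
    print(f"{name:36s} ~{pixel_clock_mhz:6.1f} MHz -> {verdict}")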

Then it depends on the configuration--if your monitor is one which does not
scale the DVI input then you will always have a 1:1 correspondence between
physical and logical pixels and you won't get sharper than that. If it
_does_ scale, however, and if the signal sent out by the video board is
different from the native resolution of the monitor, then the image will be
degraded to some extent. Whether the result will be better or worse than
that yielded by analog at the same resolution then depends on the details
of the implementation of scaling in the monitor.
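
Why a scaled (non-native) mode degrades is easy to see with a little
arithmetic on one scanline: mapping, say, 1024 logical pixels onto 1280
physical pixels means some source pixels land on one physical pixel and
others on two. The nearest-neighbour mapping below is a deliberate
simplification, and the 1024-to-1280 figures are just example numbers;
real monitors usually interpolate, which trades the unevenness shown here
for softness:

# Why a scaled (non-native) mode looks soft or uneven: map one scanline of
# 1024 logical pixels onto 1280 physical pixels and see how unevenly the
# source pixels get shared out.  Nearest-neighbour mapping is used here only
# to make the mismatch obvious; real monitors usually interpolate instead.

from collections import Counter

native = 1280   # physical pixels across one line of the panel
source = 1024   # logical pixels in the incoming signal

# For each physical pixel, pick the source pixel it would display.
mapping = [int(p * source / native) for p in range(native)]

counts = Counter(mapping)
print("physical pixels used by each of the first 16 source pixels:",
      [counts[i] for i in range(16)])
# A native (1:1) mode would print all 1s; here every fourth source pixel
# is drawn twice as wide, which is what makes scaled text look uneven.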

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
Anonymous
a b U Graphics card
August 30, 2005 5:16:38 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Captin" <UseLinkToEmail@HardwareForumz.com> wrote in message
news:7_309275_19d4a380c2665be6697db7cc09b6a079@hardwareforumz.com...
> "" wrote:
> > A friend has got a new LCD monitor and he wants to know if connecting
> > via a DVI connector would improve the quality over the VGA connectors?
> > I would suspect it's imperceptible. He doesn't have a video card with
> > a DVI connector yet, but he wants to get one if there's any difference.
> >
> > Yousuf Khan
>
> I'm following threads on LCD monitors because I'm in the market. It
> seems some opinion is that DVI is not a step forward.
> The way I see it, even if DVI is no advantage, the better monitors
> will have the option regardless?
> Also what I'm wondering about is how much the video card contributes
> towards DVI performance? Are some cards simply holding back the
> benefits of the DVI interface?

IMHO, cards without DVI are made that way mostly for reasons of economy
or space.

IMEO (Expert Opinion), most newer LCD monitors do well enough on VGA that
the extra expense of the DVI link(s) is not justified. That said, YMMV,
depending on how fussy you are about artifacts. Think of DVI as just a
different way of delivering the detailed pixel information to the
display. How the display uses that information is the key, and those
details are what have improved significantly over the last several years.

It is worth taking time to test things personally - unfortunately,
getting access to the necessary components to DIY is a barrier for most
of us "casual" users.

Regards,
NGA
Anonymous
a b U Graphics card
August 30, 2005 9:39:52 PM

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"" wrote:
> J. Clarke writes:
>
> > In practice this sometimes shows--digital TV has no redundancy to
> > speak of--it either has a perfect image or it has dead air, there is
> > no gradual degradation like there is with analog.
>
> True for all digital systems. Analog systems always have some sort of
> error, and this error increases gradually and gracefully as noise
> increases. Digital systems draw an artificial line below which all is
> noise and above which all is signal. As long as noise actually remains
> below this line in the channel, digital transmission is error-free.
> But if it rises above the line, there is a _sudden_ (and often
> catastrophic) appearance of serious, uncorrectable errors in the
> channel.
>
> The whole idea of digital is to draw the line at the right place, so
> that you always have error-free transmission. You sacrifice the bit of
> channel capacity below the line in order to get error-free
> transmission at a slightly slower rate than analog might provide.

Who is setting the pace with 19" LCD monitors currently?
