DVI - any real difference?

Archived from groups: alt.comp.hardware,alt.comp.periphs.videocards.ati

I am about to purchase a TFT monitor to replace my 19" CRT.

I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and DVI
outputs.

How important is it to buy a display with DVI input? Does it make any real
difference?

Cheers.

Bobby
  1.

    From a Cnet review


    DVI support is found primarily on LCDs. However, the advantage of
    digital signals for LCDs is of somewhat less importance now than it
    was a few years ago. Analog signal processing has improved to the
    point where major differences in image quality can be difficult to
    detect. Unless you're a pro photographer, a prepress professional, or
    someone else who needs superprecise, top-notch image quality, you
    should be fine using a CRT or an LCD on an analog signal.


    --
    ASUS A8V/Athlon 64 FX-55
    ATI RADEON X800XT PE
    1GB OCZ Gold Edition Rev3 DDR PC-3700
  2.

    On a 19-inch screen, you will see a slight difference in sharpness. I have
    two 17-inch DVI monitors here in the office. The older machine's ATI
    display card is too old to have DVI; the newer one has it. I can see the
    difference on the newer one if I look very closely at the details. Since
    the bandwidth of the display data is very wide, there is slightly less
    loss with DVI than when the signal is run through a VGA cable.
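    To put "very wide bandwidth" in perspective, here is a rough
    back-of-the-envelope pixel-clock estimate (a sketch only; the 25%
    blanking overhead is an assumed round figure, not an exact VESA timing):

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Rough pixel clock in MHz: active pixels per frame times the
    refresh rate, inflated to account for horizontal and vertical
    blanking intervals.  The 25% overhead is an assumption, not an
    exact VESA timing figure."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

# A 1280x1024 desktop at 60 Hz needs a pixel clock of roughly:
print(round(pixel_clock_mhz(1280, 1024, 60), 1))  # about 98.3 MHz
```

    At roughly 100 MHz, even small analog losses or reflections on the cable
    smear adjacent pixels together, which is why the digital link avoids them.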

    Since you paid a little extra for your display card to have DVI, you should
    use it. When you connect the new LCD monitor and the computer has started
    up, simply press the Auto Setup button on the monitor. This will set up the
    monitor's data timing to match the display card. You will be extremely
    impressed with the results.

    --

    Jerry G.
    ======


    "Bobby" <bobby@europe.com> wrote in message
    news:38uk4cF5s2g4rU1@individual.net...
    I am about to purchase a TFT monitor to replace my 19" CRT.

    I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and DVI
    outputs.

    How important is it to buy a display with DVI input? Does it make any real
    difference?

    Cheers.

    Bobby
  3.

    On Sat, 5 Mar 2005 20:46:03 -0000, "Bobby" <bobby@europe.com> wrote:

    >I am about to purchase a TFT monitor to replace my 19" CRT.
    >
    >I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and DVI
    >outputs.
    >
    >How important is it to buy a display with DVI input? Does it make any real
    >difference?
    >
    >Cheers.
    >
    >Bobby
    >
    Samsung say the difference is minor, but that you should notice slightly
    sharper text. So overall DVI gives a slightly sharper image, but it is a
    minor difference.
  4.

    In message <38uk4cF5s2g4rU1@individual.net> "Bobby" <bobby@europe.com>
    wrote:

    >I am about to purchase a TFT monitor to replace my 19" CRT.
    >
    >I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and DVI
    >outputs.
    >
    >How important is it to buy a display with DVI input? Does it make any real
    >difference?

    If you're planning on buying an LCD (flat panel) monitor, definitely go
    DVI. If you're sticking with a CRT it doesn't make a huge difference.

    Either way it's not the end of the world; you can get a pretty decent
    picture on any modern LCD from an analog signal, but it's noticeably
    better using DVI-D on my Dell 2005FPW (20.1" widescreen LCD).


    --
    Failure is not an option. It's bundled with your software.
  5.

    The DVI output of the video card gives a pure digital signal that is much
    sharper and cleaner than the analog signal from the VGA output.
    Yes, get an LCD monitor with a DVI connection.

    --
    DaveW


    "Bobby" <bobby@europe.com> wrote in message
    news:38uk4cF5s2g4rU1@individual.net...
    >I am about to purchase a TFT monitor to replace my 19" CRT.
    >
    > I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and
    > DVI outputs.
    >
    > How important is it to buy a display with DVI input? Does it make any real
    > difference?
    >
    > Cheers.
    >
    > Bobby
    >
  6.

    Jerry G. wrote:

    > On a 19 inch screen, you will

    ^H^H^H^H may

    > see a slight difference in sharpness. I have two 17-inch DVI monitors
    > here in the office. The older machine's ATI display card is too old to
    > have DVI; the newer one has it. I can see the difference on the newer one
    > if I look very closely at the details. Since the bandwidth of the display
    > data is very wide, there is slightly less loss with DVI than when the
    > signal is run through a VGA cable.

    Not a valid comparison--try it on the analog and digital outputs of the
    _same_ board and see if you can see the difference.

    > Since you paid a little extra for your display card to have DVI, you
    > should use it. When you connect the new LCD monitor and the computer has
    > started up, simply press the Auto Setup button on the monitor. This will
    > set up the monitor's data timing to match the display card. You will be
    > extremely impressed with the results.
    >

    --
    --John
    to email, dial "usenet" and validate
    (was jclarke at eye bee em dot net)
  7.

    Yes, it does.

    There are a number of issues, but from what I've seen, #1 is that most
    analog cables -- even those that come with the monitor -- are not
    impedance matched and introduce ringing ("ghosts") around sharp
    transitions. It's most noticeable on small text, and not noticeable at
    all (unless it's really bad) on TV or movies or games. It's very subtle;
    many people won't notice it, but I'm in the display industry and I see
    it quite often.

    The #2 problem is time-base accuracy and stability -- the analog monitor
    doesn't sample the pixel at exactly the moment that the video card
    "sends" it. [This used to be the #1 problem, but between monitors
    getting better and increased display resolution (more 1280x1024 vs.
    1024x768), I'd say it's now #2.] You can diagnose this very quickly
    (and often adjust to eliminate it) by putting up a test pattern of
    alternating black and white vertical bars a single pixel wide; you
    should see a perfect reproduction, with zero moire present. The key is
    perfect adjustment of the dot clock frequency and phase. Note, however,
    that on the majority of monitors, the "Auto" function doesn't produce an
    exactly correct adjustment. Close, in many cases, but not exact.
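    The single-pixel bar pattern described above is easy to generate
    yourself. A minimal sketch in Python (it writes a plain binary PPM
    file; the filename and default resolution are just examples -- use
    your panel's native resolution):

```python
def make_bar_pattern(width=1280, height=1024):
    """Build a binary PPM (P6) image of alternating one-pixel-wide
    white and black vertical bars -- the test pattern used to check
    an analog connection's dot clock and phase adjustment."""
    header = ("P6\n%d %d\n255\n" % (width, height)).encode("ascii")
    row = bytearray()
    for x in range(width):
        v = 255 if x % 2 == 0 else 0  # even columns white, odd black
        row += bytes((v, v, v))       # R, G, B triplet per pixel
    return header + bytes(row) * height

if __name__ == "__main__":
    # View this full-screen at the panel's native resolution; any
    # shimmer or moire means the dot clock/phase needs adjustment.
    with open("bars.ppm", "wb") as f:
        f.write(make_bar_pattern())
```

    Most image viewers can open a PPM file directly; displayed at 1:1,
    any moire you see comes from the analog sampling, not the file.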

    DVI simply eliminates the issues that cause both forms of distortion.
    With an analog monitor, maybe it's ok, maybe it's not (and it's usually
    not without test pattern adjustment). With DVI, assuming that the
    display works, there is no distortion from either cable issues or dot
    clock matching to the video card. In those regards, it's perfect.


    Bobby wrote:
    > I am about to purchase a TFT monitor to replace my 19" CRT.
    >
    > I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and DVI
    > outputs.
    >
    > How important is it to buy a display with DVI input? Does it make any real
    > difference?
    >
    > Cheers.
    >
    > Bobby
    >
    >
  8.

    On Sat, 5 Mar 2005 20:46:03 -0000, "Bobby" <bobby@europe.com> wrote:

    >How important is it to buy a display with DVI input? Does it make any real
    >difference?

    There is a noticeable difference on my NEC MultiSync LCD 1860NX using a
    Radeon 9600XT card.

    The monitor has both analog and DVI inputs. It took me a while to find a
    DVI cable, but when I switched, the picture quality was remarkably
    improved.
  9.

    "Bobby" <bobby@europe.com> wrote in message
    news:38uk4cF5s2g4rU1@individual.net...
    >I am about to purchase a TFT monitor to replace my 19" CRT.
    >
    > I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and
    > DVI outputs.
    >
    > How important is it to buy a display with DVI input? Does it make any real
    > difference?
    >
    > Cheers.
    >
    > Bobby

    How about on a CRT? I have a NEC 1350X 22" monitor. It has both analog and
    digital inputs. Video card is ATI 9800 Pro.
  10.

    When interfacing to a CRT, you don't have the "dot clock" issue, so to
    that extent there is no benefit to DVI. However, you still have the
    issue with analog signal integrity on the cable (noise, ringing,
    impedance mismatching, ghosting). If this isn't an issue, then there
    is unlikely to be any difference, but if it is an issue, then DVI will
    still be superior.


    Jeff McNulty wrote:

    > "Bobby" <bobby@europe.com> wrote in message
    > news:38uk4cF5s2g4rU1@individual.net...
    >
    >>I am about to purchase a TFT monitor to replace my 19" CRT.
    >>
    >>I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and
    >>DVI outputs.
    >>
    >>How important is it to buy a display with DVI input? Does it make any real
    >>difference?
    >>
    >>Cheers.
    >>
    >>Bobby
    >>How about on a CRT? I have a NEC 1350X 22" monitor. It has both analog and
    >>digital inputs. Video card is ATI 9800 Pro.
    >
    >
    >
  11.

    Barry Watzman wrote:

    > When interfacing to a CRT, you don't have the "dot clock" issue, so to
    > that extent there is no benefit to DVI. However, you still have the
    > issue with analog signal integrity on the cable (noise, ringing,
    > impedance mismatching, ghosting). If this isn't an issue, then there
    > is unlikely to be any difference, but if it is an issue, then DVI will
    > still be superior.

    If, and ONLY if, the D/A converter in the CRT provides signal quality
    greater than that of the video board plus cable. That has not usually
    been the case--the D/A converters used in CRTs were for the most part
    pretty dismal.

    > Jeff McNulty wrote:
    >
    >> "Bobby" <bobby@europe.com> wrote in message
    >> news:38uk4cF5s2g4rU1@individual.net...
    >>
    >>>I am about to purchase a TFT monitor to replace my 19" CRT.
    >>>
    >>>I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and
    >>>DVI outputs.
    >>>
    >>>How important is it to buy a display with DVI input? Does it make any
    >>>real difference?
    >>>
    >>>Cheers.
    >>>
    >>>Bobby
    >>>How about on a CRT? I have a NEC 1350X 22" monitor. It has both analog
    >>>and digital inputs. Video card is ATI 9800 Pro.
    >>
    >>
    >>

    --
    --John
    to email, dial "usenet" and validate
    (was jclarke at eye bee em dot net)