Simple question..
What's the difference between DVI-I and DVI? I even saw DVI-D on some website.
I've seen some Radeon images recently, and I noticed that the DVI connector looks somewhat different from the one on my video card.
Can anyone help me? I'd appreciate it~
  1. DVI-I has extra pins carrying analog RGB and sync signals. This allows you to connect a VGA monitor to the port using a DVI-I/VGA adapter.

    DVI-D is just a renaming of the original DVI. The "D" indicates digital only, meaning no analog RGB capability.

    <b>99% is great, unless you are talking about system stability</b>
  2. So does that mean that both DVI-I and DVI-D support DFP monitors, except that DVI-I can also drive a VGA monitor through a DVI-I/VGA adapter, while DVI-D can only drive a DFP monitor, right?
    So on any video card that has a DVI connector, it's either DVI-I or DVI-D, am I correct?
    How do I tell them apart, then?
    Thanks again~
  3. Here's a <A HREF="http://store.kayye.com/kayye/dviaccessories.html" target="_new">link with connector pics</A>.

    Notice the DVI-D has a grid of digital pins (18 on single-link, 24 on dual-link) plus a flat key blade.

    The DVI-I has the same digital pins plus 4 additional analog pins (RGB + sync) around a wider key blade.

    I think the original DVI connector carried just the digital pins and no key (but I'm not sure about this).

    I don't know much about the older DFP standard, or what you would need to convert from DVI (DVI-I or DVI-D) to DFP.

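The connector differences described above can be summed up in a small lookup. This is a hypothetical sketch (the names, structure, and helper function are made up for illustration, not from any real library); the pin counts are for the digital pin grid only:

```python
# Hypothetical summary of the DVI connector variants discussed above.
# Digital pin counts: 18 for single link, 24 for dual link; DVI-I adds
# 4 analog contacts (RGB + sync) around a wider key blade.

DVI_VARIANTS = {
    # name: (digital pins, analog pins around the key blade, notes)
    "DVI-D single link": (18, 0, "digital only; flat key blade, no analog contacts"),
    "DVI-D dual link":   (24, 0, "digital only; extra pins for higher resolutions"),
    "DVI-I single link": (18, 4, "adds analog RGB + sync; accepts a passive VGA adapter"),
    "DVI-I dual link":   (24, 4, "digital dual link plus the analog contacts"),
}

def supports_vga_adapter(variant: str) -> bool:
    """A passive DVI-to-VGA adapter only works if the port carries analog pins."""
    _digital, analog, _notes = DVI_VARIANTS[variant]
    return analog > 0

for name, (digital, analog, notes) in DVI_VARIANTS.items():
    print(f"{name}: {digital}+{analog} pins, VGA adapter OK = {supports_vga_adapter(name)}")
```

So the quick visual check from the posts above becomes: analog contacts present means DVI-I (VGA adapter works), absent means DVI-D (digital displays only).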
  4. Oops..
    I thought DVI connectors were for connecting to DFP monitors...
    I guess I'm wrong. So what kind of monitor does a DVI connector connect to? A DVI monitor?
  5. Hi, stumbled across your post.
    I have an MSI GF4 Ti4200 VTD8X 128MB with a DVI-I output, and I have an adapter which connects my DFP monitor to that DVI-I port (check my problem in this forum under 'Problem connecting a Ti4200 with a digital flat').
    As you can see in my post, it does work once my PC is up and running, but the monitor has to be unplugged while the PC is booting up or shutting down.
    I don't know if it's a conversion problem with the adapter (keeping in mind that it works perfectly once booted) or if my motherboard BIOS just isn't up to scratch with the conversion.
    For now I just have to unplug my monitor during boot-up and shutdown until I find a solution. I can still play my up-to-date games, though; before this I had a Savage4 Pro 32MB, which couldn't run the latest games.

    PS: My MSI card came with a free DVI-to-CRT adapter, so if you have a spare CRT monitor you can use that instead and output to two CRT monitors together (i.e., the card has a CRT output as well).

    Also, if you can find a solution to my problem, I'd be very grateful.
  6. DVI is also one of the formats starting to be used in the HDTV scene, since it carries a digital signal and allows for content protection, unlike the analog component inputs. Here is a <A HREF="http://www.bestbuy.com/images/esku/back/11126165baA.jpg" target="_new">picture</A> of the rear panel of a modern Sony HDTV widescreen projection monitor that uses the input.

    Also, based on the info in phsstpok's link, while DVI-D is digital only, DVI-I can support both analog and digital. The link also notes, however, that some connectors with the DVI-I interface may only support the digital format.

    From looking at Radeon's current cards, it sounds like their DVI-I interface supports both formats, since it can connect to a <b>digital</b> flat panel or use an adapter to connect to a standard, analog, VGA monitor.
    <P ID="edit"><FONT SIZE=-1><EM>Edited by CasualCat2001 on 04/21/03 11:59 PM.</EM></FONT></P>
  7. Oh yeah, also: if the connector I have is a DVI-D, does that mean my video card doesn't have a second RAMDAC?
    Thanks to everyone for answering me, great help, thanks
  8. Quote:
    Oh yeah, also: if the connector I have is a DVI-D, does that mean my video card doesn't have a second RAMDAC?
    Thanks to everyone for answering me, great help, thanks

    According to the link, DFP was the old interface. I think "DFP" stands for Digital Flat Panel.

    DVI (both DVI-D and DVI-I) is what is used today. You'll find it on projectors, flat-panel displays, plasma displays, and HDTVs (as someone already mentioned). I think there was even talk of adding it to CRTs. CRTs operate in the analog domain, but I imagine a digital interface would reduce signal degradation and allow longer cable runs.

    Without DVI-I, the second connector on your video card would have to be VGA, assuming your card had a second RAMDAC. Without one of those two connectors you couldn't use a second RAMDAC, nor drive a second CRT.

    I forget which video card you have. The 9500 and 9700 series (plus updated models) have two RAMDACs built into the GPU, IIRC.

    Older cards, like the 7500 and 8500 (9100), need an off-chip RAMDAC for dual-CRT function. I don't recall what the 9000 series has for RAMDACs.

    <P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 04/22/03 06:08 PM.</EM></FONT></P>
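The RAMDAC reasoning above can be put into a tiny sketch. The per-card counts below come straight from this thread's recollection (the post itself hedges with "IIRC"), and the function name is made up for illustration:

```python
# Sketch of the dual-CRT reasoning above. The integrated-RAMDAC counts are
# taken from this thread's recollection ("IIRC"), not from a spec sheet.

INTEGRATED_RAMDACS = {
    "Radeon 9700": 2,  # two RAMDACs built into the GPU, per the post above
    "Radeon 9500": 2,
    "Radeon 8500": 1,  # needs an off-chip RAMDAC for dual-CRT output
    "Radeon 7500": 1,
}

def can_drive_two_crts(card: str, has_external_ramdac: bool = False) -> bool:
    """Each analog CRT needs its own RAMDAC, whether integrated or off-chip."""
    total = INTEGRATED_RAMDACS.get(card, 1) + (1 if has_external_ramdac else 0)
    return total >= 2

print(can_drive_two_crts("Radeon 9700"))                            # two on-chip
print(can_drive_two_crts("Radeon 8500"))                            # only one on-chip
print(can_drive_two_crts("Radeon 8500", has_external_ramdac=True))  # off-chip adds the second
```

The point of the sketch: a DVI-D-only port never reaches a second analog RAMDAC, so the second-CRT question above comes down to whether the card has DVI-I or VGA plus a second RAMDAC somewhere.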
  9. Hmm...
    Okay, thanks for your answer.
    My video card is a Radeon 8500LE, so mine doesn't have a second RAMDAC for dual-CRT function.
    BTW, I could use a VGA monitor and a DVI monitor at the same time, right?
  10. Yup.

    You can also use a CRT and a TV as a second monitor but the quality is not very good.

  11. thanks a lot~ now this post will end~