Last response: in Graphics Cards
April 21, 2003 5:09:17 PM

Simple question..
What's the difference between DVI-I and DVI? I even saw DVI-D on some web site.
I've seen some Radeon images these days, and I noticed that the DVI connector is somewhat different from the one on my video card.
Can anyone help me? I would appreciate it~

April 21, 2003 7:43:21 PM

DVI-I has extra pins carrying RGB and sync signals. This allows one to connect a VGA monitor to the port using a DVI-I/VGA adapter.

DVI-D is just a renaming of the original DVI. The "D" indicates digital only, meaning no analog RGB capability.
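
To collect the difference in one place, here's a minimal sketch in Python (the table and function names are made up for illustration; single-link connectors assumed):

```python
# Capabilities of the two DVI variants discussed above.
DVI_VARIANTS = {
    "DVI-D": {"digital": True, "analog": False},
    "DVI-I": {"digital": True, "analog": True},
}

def can_drive_vga(connector: str) -> bool:
    """A DVI/VGA adapter only passes through the analog RGB and sync
    pins, so it only works on connectors that carry them."""
    return DVI_VARIANTS[connector]["analog"]

print(can_drive_vga("DVI-I"))  # True: DVI-I carries analog RGB + sync
print(can_drive_vga("DVI-D"))  # False: digital only
```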

<b>99% is great, unless you are talking about system stability</b>
April 21, 2003 9:07:50 PM

So does that mean DVI-I and DVI-D both support a DFP monitor, except that DVI-I can also use a VGA monitor through a DVI-I/VGA adapter, while DVI-D can only use a DFP monitor, right?
So on every video card that has a DVI connector, it's either DVI-I or DVI-D, am I correct?
How do I identify them, then?
Thanks again~
April 22, 2003 2:01:57 AM

Here's a <A HREF="" target="_new">link with connector pics</A>.

Notice the DVI-D has 18 pins (24 on a dual-link connector) plus a flat key tab.

The DVI-I has the same pins plus 4 additional analog pins (RGB + sync) clustered around a wider key tab.

I think the original DVI connector carried just the digital pins and no key (but I'm not sure about this).
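
To answer "how to identify them": the giveaway is the group of analog pins around the flat key blade. A toy sketch of that rule (the function name is made up; single-link connectors assumed):

```python
def identify_dvi(analog_pins_around_blade: int) -> str:
    """Identify a single-link DVI connector by counting the small
    pins clustered around the flat key blade: DVI-I has four extra
    analog pins (RGB + sync) there, DVI-D has none."""
    if analog_pins_around_blade == 4:
        return "DVI-I"
    if analog_pins_around_blade == 0:
        return "DVI-D"
    return "unknown"

print(identify_dvi(4))  # DVI-I
print(identify_dvi(0))  # DVI-D
```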

I don't know about the older DFP standard, or what you'd need to convert from DVI (DVI-I or DVI-D) to DFP.

<b>99% is great, unless you are talking about system stability</b>
April 22, 2003 2:59:17 AM

I thought DVI connectors were for connecting to DFP monitors...
I guess I'm wrong. What kind of monitor does a DVI connector connect to? A DVI monitor?
April 22, 2003 3:06:53 AM

Hi, stumbled across your post.
I have an MSI GF4 Ti4200 VTD8x 128MB with a DVI-I output, and I have an adapter which connects my DFP monitor to this DVI-I. (Check my problem in this forum under 'Problem connecting a Ti4200 with a digital flat'.)
As you can see in my problem, it does work when my PC is up and running, but only with the monitor unplugged during boot-up and shutdown.
I don't know if it's a conversion problem in the adapter (keeping in mind that it works perfectly once booted) or if it's just my motherboard BIOS not up to scratch with the conversion.
I just have to unplug my monitor during boot-up and shutdown until I find the solution. I can still play my up-to-date games on it... whereas before I had a Savage4 Pro 32MB, which I couldn't play the latest games on.

PS: My MSI card came with a free DVI-to-CRT adapter, so if you have a spare CRT monitor you can use that instead to output to 2 CRT monitors together (i.e. the card has a CRT output as well).

Also, if you can find a solution to my problem, I'd be very grateful.
April 22, 2003 3:48:19 AM

DVI is also one of the formats that is starting to be used in the HDTV scene since it is a digital signal and can allow for content protection as opposed to the analog component inputs. Here is a <A HREF="" target="_new">picture</A> of the rear panel of a modern Sony HDTV widescreen projection monitor that uses the input.

Also, based upon the info provided in phsstpok's link, while DVI-D is digital only, DVI-I can support both analog and digital. It also states, however, that although some connectors use the DVI-I interface, they may only support the digital format.

From looking at Radeon's current cards it sounds like their DVI-I interface supports both formats since it can connect to a <b>digital </b>flat panel or use an adapter to connect to a standard, analog, VGA monitor.
<P ID="edit"><FONT SIZE=-1><EM>Edited by CasualCat2001 on 04/21/03 11:59 PM.</EM></FONT></P>
April 22, 2003 3:58:28 AM

Oh yeah, also: if the connector I have is a DVI-D, does that mean my video card doesn't have a second RAMDAC?
Thanks to everyone for answering me, great help, thanks
April 22, 2003 10:00:56 PM

Oh yeah, also: if the connector I have is a DVI-D, does that mean my video card doesn't have a second RAMDAC?
Thanks to everyone for answering me, great help, thanks

According to the link, DFP was the old interface. I think "DFP" stands for Digital Flat Panel.

DVI (both DVI-D and DVI-I) is what is used today. You'll find it on projectors, flat panel displays, plasma displays, and HDTVs (as someone already mentioned). I think there was even talk of adding it to CRTs. CRTs operate in the analog domain, but I imagine using a digital interface would reduce signal degradation and increase possible cable run lengths.

Without DVI-I, the second connector on your video card would have to be VGA, assuming your card had a second RAMDAC. Without one of those two connectors you couldn't use a second RAMDAC or a second CRT.
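
That logic can be restated as a little sketch (the helper name and arguments are hypothetical, just to restate the rule):

```python
def can_run_dual_crt(connectors, ramdacs):
    """Each CRT is analog, so it needs its own RAMDAC plus an
    analog-capable output (VGA or DVI-I); DVI-D carries no analog
    signal, so it can't feed a CRT at all."""
    analog_outputs = [c for c in connectors if c in ("VGA", "DVI-I")]
    return ramdacs >= 2 and len(analog_outputs) >= 2

print(can_run_dual_crt(["VGA", "DVI-I"], ramdacs=2))  # True
print(can_run_dual_crt(["VGA", "DVI-D"], ramdacs=2))  # False: DVI-D can't feed a CRT
print(can_run_dual_crt(["VGA", "DVI-I"], ramdacs=1))  # False: need a second RAMDAC
```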

I forgot which video card you have. The 9500 and 9700 series (plus updated models) have two RAMDACs built into the GPU, IIRC.

Older cards, like the 7500 and 8500 (9100), need an off-chip RAMDAC for dual-CRT function. I don't recall what the 9000 series has for RAMDACs.

<b>99% is great, unless you are talking about system stability</b><P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 04/22/03 06:08 PM.</EM></FONT></P>
April 22, 2003 10:22:35 PM

Okay, thanks for your answer.
My video card is a Radeon 8500LE, so mine doesn't have a 2nd RAMDAC for dual-CRT function.
BTW, I could use a VGA monitor and a DVI monitor at the same time, right?
April 22, 2003 10:31:45 PM


You can also use a CRT and a TV as a second monitor but the quality is not very good.

<b>99% is great, unless you are talking about system stability</b>
April 22, 2003 10:35:52 PM

thanks a lot~ now this post will end~