What's the difference between DVI-I and DVI? I even saw DVI-D on some website.
I've seen some images of Radeon cards recently, and I noticed that the DVI connector looks somewhat different from the one on my video card.
Can anyone help me? I'd appreciate it~
So does that mean that DVI-I and DVI-D both support DFP monitors, except that DVI-I can also drive a VGA monitor through a DVI-I-to-VGA adapter, while DVI-D can only drive a DFP monitor, right?
So on every video card that has a DVI connector, it's either DVI-I or DVI-D, am I correct?
How do you identify them, then?
Hi, I stumbled across your post.
I have an MSI GF4 Ti4200 VTD8x 128MB with a DVI-I output, and I have an adapter that connects my DFP monitor to that DVI-I port. (Check my problem in this forum under 'Problem connecting a Ti4200 with a digital flat)
As you can see in my thread, it does work once the PC is up and running, but the monitor has to be unplugged while the PC boots up and shuts down.
I don't know if it's a conversion problem in the adapter (keeping in mind that it works perfectly once booted) or if my motherboard BIOS just isn't up to scratch with the conversion.
For now I just unplug the monitor during boot-up and shutdown until I find a solution. At least I can still play up-to-date games on it; before this I had a Savage4 Pro 32MB, which couldn't play the latest games.
P.S. My MSI card came with a free DVI-to-CRT adapter, so if you have a spare CRT monitor, you can use that instead to output to two CRT monitors together (i.e., the card has a CRT output as well).
Also, if you can find a solution to my problem, I'd be very grateful.
DVI is also one of the formats that is starting to be used in the HDTV scene since it is a digital signal and can allow for content protection as opposed to the analog component inputs. Here is a <A HREF="http://www.bestbuy.com/images/esku/back/11126165baA.jpg" target="_new">picture</A> of the rear panel of a modern Sony HDTV widescreen projection monitor that uses the input.
Also, based on the info provided in phsstpok's link, DVI-D is digital only, while DVI-I can support both analog and digital. It also states, however, that some connectors with the DVI-I interface may only support the digital format.
From looking at Radeon's current cards, it sounds like their DVI-I interface supports both formats, since it can connect to a <b>digital </b>flat panel or use an adapter to connect to a standard, analog VGA monitor.
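The identification rule discussed above can be sketched in code. This is just my own illustration (the function and parameter names are made up, not from the DVI spec): the variant is determined by which pin groups the connector physically carries.

```python
def classify_dvi(has_digital_pins: bool, has_analog_pins: bool) -> str:
    """Rough rule of thumb for identifying a DVI connector variant.

    has_digital_pins: the main block of TMDS (digital) signal pins
    has_analog_pins:  the analog contacts clustered around the flat blade
    """
    if has_digital_pins and has_analog_pins:
        return "DVI-I"   # integrated: digital + analog, VGA adapter works
    if has_digital_pins:
        return "DVI-D"   # digital only: no analog path, no VGA adapter
    if has_analog_pins:
        return "DVI-A"   # analog only (rare)
    return "unknown"

print(classify_dvi(True, True))    # DVI-I
print(classify_dvi(True, False))   # DVI-D
```

So in practice: if the DVI port on a card has the extra analog contacts, a passive DVI-to-VGA adapter can work; if not, it can't.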
<P ID="edit"><FONT SIZE=-1><EM>Edited by CasualCat2001 on 04/21/03 11:59 PM.</EM></FONT></P>
Oh yeah, also: if the connector I have is a DVI-D, does that mean my video card doesn't have a second RAMDAC?
Thanks to everyone for answering me, great help, thanks!
According to the link, DFP was the older interface. I think "DFP" stands for Digital Flat Panel.
DVI (both DVI-D and DVI-I) is what is used today. You'll find it on projectors, flat panel displays, plasma displays, and HDTVs (as someone already mentioned). I think there was even talk of adding it to CRTs. CRTs operate in the analog domain, but I imagine using a digital interface would reduce signal degradation and increase possible cable run lengths.
Without DVI-I, the second connector on your video card would have to be VGA, assuming your card has a second RAMDAC. Without one of these two connectors, you couldn't use a second RAMDAC, nor drive a second CRT.
I forget which video card you have. The 9500 and 9700 series (plus updated models) have two RAMDACs built into the GPU, IIRC.
Older cards, like the 7500 and 8500 (9100), need an off-chip RAMDAC for dual-CRT function. I don't recall what the 9000 series has for RAMDACs.
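The dual-CRT reasoning above boils down to two conditions, which can be sketched like this (a simplified illustration with made-up names, not anything from a driver or spec):

```python
def can_drive_second_crt(second_connector: str, ramdac_count: int) -> bool:
    """A second CRT needs an analog-capable second connector
    (VGA, or DVI-I with a DVI-to-VGA adapter) AND a second RAMDAC
    to generate the analog signal for it."""
    analog_capable = second_connector in ("VGA", "DVI-I")
    return analog_capable and ramdac_count >= 2

print(can_drive_second_crt("DVI-I", 2))  # True: analog path + 2nd RAMDAC
print(can_drive_second_crt("DVI-D", 2))  # False: no analog path
print(can_drive_second_crt("DVI-I", 1))  # False: only one RAMDAC
```

This also answers the earlier question: a DVI-D port by itself doesn't prove the card lacks a second RAMDAC, but it does mean that port can never feed a CRT.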
<b>99% is great, unless you are talking about system stability</b><P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 04/22/03 06:08 PM.</EM></FONT></P>