2x DVI-I out ... 1 DVI-D in... driver or hardware ????

March 5, 2009 5:47:54 PM

The monitor I bought is a Samsung 305T. Its only input is a single DVI-D port.

I was looking to get a new video card (leaning toward a GTX 295). That card has 2x DVI-I outputs...

How do you connect the two, given that the cable is DVI-D to DVI-D, so the monitor can run at its max resolution (2560x1600 @ 60Hz)?

I am pretty certain a DVI-D cable will fit into both connector types... But I remember reading that in order to run at 2560x1600 you need to use both output channels. If that's the case, would I need some kind of Y-splitter, or do the graphics drivers handle this?


Thank you in advance for any and all help !!!
March 5, 2009 6:05:19 PM

Single-link vs. dual-link refers to each individual DVI-D or DVI-I connector. If the main pin field is two separate groups of nine pins with a gap in the middle, it's a single-link connection. If it's the full 8x3 array of pins, it's a dual-link connection.

Make sure your DVI cable is dual-link by checking its pin count. Your GPU's outputs almost certainly are (I can't imagine them not being...), and your monitor's input must be as well.
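
For reference (quoting the connector pin counts from memory, so double-check against an actual pinout diagram): a single-link DVI-D plug has 18 signal pins in two 3x3 blocks plus the flat ground blade, a dual-link DVI-D plug adds the six middle pins for 24 plus the blade, and the DVI-I versions add 4 analog pins around the blade. A dual-link DVI-D cable also plugs straight into a card's DVI-I port, since the DVI-I socket just has extra analog contacts that the cable doesn't use.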
March 5, 2009 6:24:48 PM

So the newer-generation cards (GTX 295 and HD 4870 X2) output dual-link from the DVI port, and the port functions as either DVI-I or DVI-D?
March 6, 2009 2:07:06 AM

Unlike single-link DVI cables, dual-link cables use the full set of pins for extra bandwidth, which is what allows higher resolutions like 2560x1600 (and longer cable runs with less risk of signal degradation).
"DVI-I" stands for "DVI-Integrated" and carries both digital and analog signals, so it works with both digital and analog displays. "DVI-D" stands for "DVI-Digital" and carries digital signals only.
