Your question

Does it matter which output the monitor is plugged into on the video card?

Last response: in Graphics & Displays
June 13, 2013 8:13:57 PM

I am running two monitors. One is 60 Hz and one is 144 Hz.

On my HD7850 there are two DVI plugs. One says DVI-I and the other says DVI-D.

Does it matter which monitor is plugged into which? If yes, how does it affect the grand scheme of things?
June 13, 2013 8:37:14 PM

Why do you have 144 Hz? I've never heard of that; the usual range is 60 to 75 Hz. DVI-D (the one with no pins around the flat blade) is digital only, while DVI-I (with the extra pins) can carry both DVI and VGA signals. I don't think there is a difference, but your refresh rate makes no sense (to me).
June 13, 2013 8:39:39 PM

Plug the 144 Hz monitor into the dual-link DVI port. Single-link DVI ports have a column of pins missing in the big cluster in the middle; dual-link ones don't. Dual link roughly doubles the transfer rate, which you need for 144 Hz as opposed to the standard 60 Hz.

Hope this helps! :) 
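If you want to sanity-check the bandwidth argument yourself, here's a rough back-of-the-envelope sketch. The 165 MHz single-link pixel clock limit comes from the DVI spec; the ~6% blanking overhead is my own assumption for a reduced-blanking timing, so treat the exact numbers as approximate:

```python
# Rough check of whether a resolution/refresh combo fits in single-link DVI.
# Single-link DVI tops out at a 165 MHz pixel clock; dual link doubles that.

SINGLE_LINK_LIMIT_MHZ = 165          # per the DVI 1.0 spec
DUAL_LINK_LIMIT_MHZ = 2 * SINGLE_LINK_LIMIT_MHZ

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.06):
    """Approximate pixel clock in MHz, including an assumed blanking overhead."""
    return width * height * refresh_hz * blanking_overhead / 1e6

clk_144 = required_pixel_clock_mhz(1920, 1080, 144)
clk_60 = required_pixel_clock_mhz(1920, 1080, 60)

print(f"1080p @ 144 Hz needs ~{clk_144:.0f} MHz "
      f"(single link ok: {clk_144 <= SINGLE_LINK_LIMIT_MHZ}, "
      f"dual link ok: {clk_144 <= DUAL_LINK_LIMIT_MHZ})")
print(f"1080p @ 60 Hz needs ~{clk_60:.0f} MHz "
      f"(single link ok: {clk_60 <= SINGLE_LINK_LIMIT_MHZ})")
```

So 1080p at 144 Hz blows past the single-link limit but fits in dual link, while 60 Hz fits comfortably in single link, which is why the dual-link port matters for the 144 Hz monitor.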
June 13, 2013 9:12:00 PM

I was always wondering about those missing pins. Good info. Also, the DVI-I is the one that carries VGA. Just saying.
June 13, 2013 10:53:35 PM

Munchbot said:
Plug the 144Hz monitor into the Dual-Link DVI port. Single link DVI ports have a column of pins missing in the big cluster in the middle, dual link ones don't. The dual link ones pretty much allow for double the transfer rate, you need that for 144Hz as opposed to the standard 60Hz.

Hope this helps! :) 


I am still a little confused, because I swapped both plugs to test it out and I'm still getting 144 Hz on that monitor regardless of which port it's in.

I also don't really understand your description of the DVI ports.

Here is a picture of the back of my video card.