Difference in SLI graphics cards' DVI ports.

Skandinaavlane

Reputable
Aug 16, 2014
19
0
4,520
Does it make any difference which card my monitor is connected to? Let's say I have my monitor connected via DVI to the second card; would there be any reduced performance (frame delay, for example)?
 
Solution
There is no performance or visible difference regarding which card it is plugged into. Just know not all cards will have outputs.
True. Which cards are they? Some games don't scale well or hit that 80%. Some hit only 25% or 50%, but if the game is coded correctly and reaches the best theoretical scaling (~80%; I've never heard of a hundred-percent boost), then yeah, you'll get insane fps, like 190 fps in BF4.
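To put very rough numbers on that scaling, here's a toy Python calculation; the scaling factors and the ~105 fps single-card baseline are illustrative assumptions, not measurements:

# Rough estimate of two-card SLI fps from a scaling factor.
# The scaling values and the 105 fps baseline are assumptions for illustration only.
def sli_fps(single_card_fps, scaling):
    """Estimated fps when the second card adds `scaling` of one card's output."""
    return single_card_fps * (1 + scaling)

for scale in (0.25, 0.50, 0.80):
    print(f"105 fps on one card, {scale:.0%} scaling -> ~{sli_fps(105, scale):.0f} fps")
# 80% scaling on a ~105 fps baseline lands around the 190 fps figure mentioned above.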
 

mapesdhs

Distinguished
One small caveat to this: a small number of motherboards have a BIOS restriction where
the primary GPU must be in a particular slot. I've come across this a couple of times with
certain ASUS boards for example. Consult the user guide; it's best to stick to the outputs
from the GPU in the slot referred to as slot 1, PCIE_1, or whatever the manual calls it.

If your motherboard has no such restriction though then yes indeed it doesn't matter
which outputs one uses.

Ian.

 

Skandinaavlane

Reputable
Aug 16, 2014
19
0
4,520
Guys! That's not what I asked at all. I'm asking if connecting my monitor to the second graphics card and NOT the first one would cause any performance issues. Like, would there be a delay for motion on the monitor?

EDIT: Thanks mapesdhs and Gam3r01 for your answers.
 

mapesdhs

Distinguished
Re fps: if the minimum fps is always above the monitor's refresh rate, turn double buffering on to prevent
screen tearing; that's as good as it gets with the monitor in question. If the fps is often (but not always)
below the monitor refresh, you can use adaptive buffering so it only uses single-buffer mode when the fps
drops below the refresh rate (best of both worlds).
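As a loose sketch of that adaptive behaviour in Python (this just models the decision being described, not how the driver actually implements it; the 60 Hz refresh is an assumed example value):

# Toy model of adaptive sync: sync to the refresh rate only when the game keeps up,
# otherwise present without waiting so fps isn't dragged down further.
REFRESH_HZ = 60  # assumed monitor refresh rate for the example

def buffering_mode(current_fps, refresh_hz=REFRESH_HZ):
    """Return the mode an adaptive scheme would pick for this frame."""
    if current_fps >= refresh_hz:
        return "double buffered / synced (no tearing, capped at refresh)"
    return "unsynced single buffer (no extra fps penalty below refresh)"

for fps in (90, 60, 45):
    print(fps, "fps ->", buffering_mode(fps))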

In theory there shouldn't be any performance issues (assuming the multi-GPU config on your mbd even
allows that setup in the 1st place), but unless there's some special reason to do otherwise, as Gam3r01
says you should connect the monitor to the primary card as described in the mbd user guide.

Ian.

 


The second card backs up the first one, allowing the frames per second to increase. Or you can use higher game settings and/or a higher resolution and drop the fps back down again.

 


Yeah, you need to have all monitors plugged into the same card. The second card will process half the data and the first will do the other half; then the "slave" card transfers its data to the top card, which sends out the full image. The transfer of the data is quick, so no loss in performance should be seen.
 

mapesdhs

Distinguished
Side tracking a bit, but there are various ways the cards can split the processing; by scanline is just one
method, though the most common (and I think the default). Others include tiling, round-robin frame
allocation, pixel rolling, etc. NVIDIA has a setting to change the SLI mode, but usually the default is the
best option for most games. I experimented recently with the SLI modes using three GTX 580 3GB cards and
found that although in some cases an alternative mode gave higher benchmark scores (such as 3DMark),
they often made stuttering worse. In other cases the alternative SLI modes killed performance completely.
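For a rough picture of how two of those splits divide the work, here's a simplified Python sketch; it's not NVIDIA's actual scheduler, and the GPU indices and 1080-line frame are made-up example values:

# Simplified sketch of two common multi-GPU work splits.
def alternate_frame_rendering(num_frames, num_gpus=2):
    """AFR: whole frames are handed out round-robin, one GPU per frame."""
    return {f"frame {i}": f"GPU {i % num_gpus}" for i in range(num_frames)}

def split_frame_rendering(height, num_gpus=2):
    """SFR: each frame is cut into horizontal bands, one band per GPU; the secondary
    card sends its finished band to the primary card, which scans out the full image."""
    band = height // num_gpus
    return {f"GPU {g}": (g * band, (g + 1) * band - 1) for g in range(num_gpus)}

print(alternate_frame_rendering(4))   # frames 0-3 alternate between the two cards
print(split_frame_rendering(1080))    # each card renders roughly 540 rows of the frame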

Just curious btw Skandinaavlane, what SLI combo are you considering? Do you have just one card atm and
are thinking of buying another? Or do you already have the 2 cards? If so, which models? Performance scaling
varies greatly by game, resolution, detail level, CPU bottleneck issues, etc. I've been testing the extremes so
that people can see where their older system might reside on the potential 'bottleneck scale', as it were. And
believe me, testing fast cards with a 1-core P4 is painful. :D Sooo looking forward to testing a QX9650 on the
same board instead, and meanwhile many others such as i3 550, i5 760, i7 870, i7 990X, i5 2500K, i7 3930K,
i7 4820K, etc. Likewise, testing from an Athlon64 3400+ to a Ph2 1090T with many in between, and a range of GPUs.

Ian.

 

Skandinaavlane

Reputable
Aug 16, 2014
19
0
4,520
Hey, thanks for your wise words.

I already have two cards; they're both Gigabyte GTX 760 2GB Windforce X3. I suppose there wouldn't be any issues with plugging the screen into the second card.

EDIT: Just as a note, I have an i5 4670K @ 4.2 GHz.
 

mapesdhs

Distinguished


(most welcome!)

I guess my question would be, though: why do you want to? Is there some special reason
for not connecting the output to the top-most primary card? To avoid any possible issues, it's best
to stick with the recommended defaults for this sort of thing.

Ian.