Solved

Difference in SLI graphics cards DVI ports.

Tags:
  • DVI
  • Cable
  • SLI
  • Graphics
  • Graphics Cards
  • Monitors
Last response: in Graphics & Displays
September 11, 2014 7:25:19 PM

Does it make any difference which card my monitor is connected to? Let's say I have my monitor connected via DVI to the second card; would there be any reduced performance (a frame delay, for example)?


September 11, 2014 7:26:59 PM

No, not really; in SLI they work together.
September 11, 2014 7:29:12 PM

No difference, except faster fps; about an 80% increase.
September 11, 2014 7:31:19 PM

Uh, what do you mean by "faster fps"? Am I missing out on something?
September 11, 2014 7:31:48 PM

True. What cards are they? Some don't scale well or hit that 80%. Some hit 25% or 50%, and if the game is coded correctly and has the best theoretical scaling (~80%; I've never heard of a hundred percent boost), then yeah, you'll get insane fps, like 190 fps in BF4.
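To make the scaling percentages above concrete, here's a tiny Python sketch. The numbers are illustrative, not measured, and `sli_fps` is a hypothetical helper, not a real benchmark:

```python
# Hypothetical SLI scaling calculator: given a single-card frame rate and a
# scaling factor (e.g. 0.80 for the ~80% boost mentioned above), estimate
# the two-card frame rate.

def sli_fps(single_card_fps: float, scaling: float) -> float:
    """Estimated fps when a second card adds `scaling` of the first card's output."""
    return single_card_fps * (1.0 + scaling)

# Example: a 60 fps baseline at the commonly quoted scaling levels.
for pct in (0.25, 0.50, 0.80):
    print(f"{pct:.0%} scaling: {sli_fps(60, pct):.0f} fps")
```

So a game that scales at 80% would take a 60 fps baseline to roughly 108 fps; the real figure depends on the game, resolution, and CPU.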
September 11, 2014 7:32:19 PM

This is actually incorrect; you must attach the monitors to the primary card, which in most SLI setups is the top card. Any others disable their outputs and act as "slave" cards to the first, simply processing data.
http://www.nvidia.in/object/sli-technology-multimonitor...
September 11, 2014 7:33:06 PM

How fast the frames come to the screen, basically. 60 fps means 60 frames per second, so 100 frames per second is faster, then 110, 120...
September 11, 2014 7:33:16 PM


One small caveat to this: a small number of motherboards have a BIOS restriction where
the primary GPU must be in a particular slot. I've come across this a couple of times with
certain ASUS boards, for example. Consult the user guide; it's best to stick to the outputs
from the GPU in the slot referred to as slot 1, PCIE_1, or whatever the manual says.

If your motherboard has no such restriction though then yes indeed it doesn't matter
which outputs one uses.

Ian.

September 11, 2014 7:34:58 PM

Guys! That's not what I asked at all. I'm asking if connecting my monitor with the second graphics card and NOT the first one would cause any performance issues. Like would there be a delay for motion on the monitor?

EDIT: Thanks mapesdhs and Gam3r01 for your answers.
September 11, 2014 7:35:07 PM

Re fps: if the minimum fps is always more than the monitor refresh, set double buffering on to prevent
screen tearing, and that's as good as it can be with the monitor in question. If the fps rate is often less
than the monitor refresh (but not always), you can use adaptive buffering so it only uses single buffer
mode when the fps drops below the monitor refresh rate (best of both worlds).
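The buffering decision just described can be sketched roughly like this (a simplification of what the driver does internally; `buffer_mode` and the fixed refresh value are hypothetical names for illustration):

```python
# Sketch of the adaptive-buffering idea above: keep double buffering while
# the fps stays at or above the monitor refresh (no tearing), and fall back
# to single-buffer output when fps drops below it (no added latency).
# Purely illustrative; real drivers implement this internally.

REFRESH_HZ = 60  # assumed monitor refresh rate

def buffer_mode(current_fps: float, refresh_hz: int = REFRESH_HZ) -> str:
    if current_fps >= refresh_hz:
        return "double-buffered (tear-free)"
    return "single-buffered (lower latency)"

print(buffer_mode(90))   # comfortably above refresh
print(buffer_mode(45))   # dipped below refresh
```

The point is simply that the mode switches per-frame based on where the fps sits relative to the refresh rate, which is why it gets you the best of both worlds.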

In theory there shouldn't be any performance issues (assuming the multi-GPU config on your mbd even
allows that setup in the 1st place), but unless there's some special reason to do otherwise, as Gam3r01
says you should connect the monitor to the primary card as described in the mbd user guide.

Ian.


Best solution

September 11, 2014 7:35:45 PM

There is no performance or visible difference regarding which card it is plugged into. Just know that not all cards will have active outputs.
September 11, 2014 7:37:25 PM

Skandinaavlane said:
Uh, what do you mean by "faster fps"? Am I missing out on something?


The second card backs up the first one, allowing the frames per second to increase. Or you can use higher game settings and/or a higher resolution and let the fps drop back again.

September 11, 2014 7:37:54 PM

Skandinaavlane said:
Guys! That's not what I asked at all. I'm asking if connecting my monitor with the second graphics card and NOT the first one would cause any performance issues. Like would there be a delay for motion on the monitor?

EDIT: Thanks mapesdhs for the answer.


Yeah, you need to have all monitors plugged into the same card. The second card will process half the data and the first will do the other half; then the "slave" card will transfer its data to the top card, which will send out a full image. The transfer of the data is quick, so no loss in performance should be seen.
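That split-and-merge idea can be pictured with a toy Python sketch, where two "cards" each render half the scanlines of a frame and the halves are stitched back together. Everything here (`shade`, `render_rows`, the frame size) is a made-up illustration; real SLI does this in the driver and hardware:

```python
# Toy illustration of split-frame rendering: each "card" shades half the
# rows, then the slave's half is merged with the primary's to form one frame.

HEIGHT, WIDTH = 4, 8

def shade(row: int, col: int) -> int:
    return (row * WIDTH + col) % 256  # placeholder pixel value

def render_rows(rows: range) -> list[list[int]]:
    return [[shade(r, c) for c in range(WIDTH)] for r in rows]

top_half = render_rows(range(0, HEIGHT // 2))          # primary card's share
bottom_half = render_rows(range(HEIGHT // 2, HEIGHT))  # second card's share
frame = top_half + bottom_half  # second card's half is sent to the primary

assert len(frame) == HEIGHT and all(len(r) == WIDTH for r in frame)
```

The merge step is just a copy, which is why the transfer adds essentially no visible cost.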
September 11, 2014 7:51:50 PM

Side-tracking a bit, but there are various ways the cards can split the processing; by scanline is just one
method, though it's the most common (and I think the default). Others include tiling, round-robin frame
allocation, pixel rolling, etc. NVIDIA has a setting to change the SLI mode, but usually the default is the
best option for most games. I experimented recently with the SLI modes using three GTX 580 3GB cards and
found that although in some cases an alternative mode gave higher benchmark scores (such as in 3DMark),
they often made stuttering worse. In other cases the alternative SLI modes killed performance completely.

Just curious btw Skandinaavlane, what SLI combo are you considering? Do you have just one card atm and
are thinking of buying another? Or do you already have the 2 cards? If so, which models? Performance scaling
varies greatly by game, resolution, detail level, CPU bottleneck issues, etc. I've been testing the extremes so
that people can see where their older system might reside on the potential 'bottleneck scale', as it were. And
believe me, testing fast cards with a 1-core P4 is painful. :D  Sooo looking forward to testing a QX9650 on the
same board instead, and meanwhile many others such as i3 550, i5 760, i7 870, i7 990X, i5 2500K, i7 3930K,
i7 4820K, etc. Likewise, testing from an Athlon64 3400+ to a Ph2 1090T with many in between, and a range of GPUs.

Ian.

September 11, 2014 8:02:23 PM

Hey, thanks for your wise words.

I already have two cards; they're both Gigabyte GTX 760 2GB WindForce X3 cards. I suppose there wouldn't be any issues with plugging the screen into the second card.

EDIT: Just for a note, I have an i5 4670K @ 4.2GHz.
September 11, 2014 8:06:36 PM

Skandinaavlane said:
... I suppose there wouldn't be any issues with plugging the screen into the second card.


(most welcome!)

I guess my question would be, though: why do you want to? Is there some special reason
for not connecting the output to the top-most primary card? To avoid any possible issues, it's best
to stick with the recommended defaults for this sort of thing.

Ian.
