Solved

Does SLI/Crossfire add one frame of input lag?

March 6, 2012 7:02:58 PM

I can't seem to find a straight answer about this anywhere. Does SLI/Crossfire add one frame of input lag, and if so how noticeable is that, more so on a 120hz screen vs a 60hz screen.
March 6, 2012 7:30:33 PM

Is the lag perceivable? Not really. Does CF/SLI cause microstutter? Yes. Can you see microstutter, or does it affect you? That depends on your perception. Microstutter doesn't affect me, but it does bother some people. Then again, I may just be lucky: some games don't have microstutter problems, and I may not play the ones that have a big issue.

To better answer the question: adding anything that affects gameplay (a second stick of RAM, a second CPU, a second GPU) could technically introduce more input lag. More often than not it's a negligible amount.
March 6, 2012 8:38:54 PM

Yes, I understand about microstutter, but I'm more interested in input lag on dual-GPU systems. I don't want to buy a new motherboard/PSU and reinstall everything only to find out I notice a tiny bit of input lag because of SLI.
I've heard that dual-card SLI/Crossfire introduces one frame of input lag, so at 60 Hz that's about 16.7 ms and at 120 Hz about 8.3 ms. Is that correct?
I guess I don't understand how a second stick of RAM or a second CPU would add input lag in a game? I was really just referring to the known issue of the one frame of input lag added by SLI/Crossfire. Is that one frame always there, or does it only show up when the fps drops below 60 on a 60 Hz monitor or 120 on a 120 Hz one?
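(For anyone checking the arithmetic here, it falls straight out of one frame = 1000 ms / refresh rate; a quick sketch:)

```python
# One frame of lag is simply one refresh interval: 1000 / refresh rate.

def frame_time_ms(refresh_hz: float) -> float:
    """Duration of a single frame at the given refresh rate, in ms."""
    return 1000.0 / refresh_hz

print(f"60 Hz:  {frame_time_ms(60):.1f} ms per frame")   # 16.7 ms
print(f"120 Hz: {frame_time_ms(120):.1f} ms per frame")  # 8.3 ms
```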
March 6, 2012 9:43:45 PM

lmulder said:
[...] Is that one frame always there, or does it only show up when the fps drops below 60 on a 60 Hz monitor or 120 on a 120 Hz one?


If you look at input lag literally, it means the time from clicking the mouse to an action appearing on the screen. Adding more components that affect gameplay will slightly increase input lag, but it's usually almost non-existent by human standards.

From the sound of it, what you're describing isn't input lag so much as lag tied to the refresh rate.
My suggestion: if you noticed the input lag when going from a CRT to an LCD monitor, then you might notice the lag from adding a GPU for CF/SLI.
It largely doesn't affect most people, though, and the biggest change in input lag will almost always come from the monitor.

Edit: I think I know what you're referring to now. Nvidia does introduce a frame of lag in SLI as a buffer to combat microstutter. It helps SLI slightly in preventing microstutter, but not much, as most cards are prone to it. AMD, however, does not use a buffer to try to smooth out microstutter. (Not sure about the 7000/600-series cards, as PCIe 3.0's extra bandwidth and lower latency versus PCIe 2.0 should offset this a bit.)
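To illustrate the trade-off being described, here's a toy Python sketch of frame pacing (a hypothetical model for illustration only, not Nvidia's actual driver logic): frames that finish at uneven times are held back to even presentation slots, which smooths pacing at the cost of some added display latency.

```python
# Toy frame-pacing model (hypothetical, not Nvidia's real algorithm):
# hold uneven AFR frames until an evenly spaced presentation slot.

completion_ms = [0.0, 5.0, 16.7, 21.7, 33.3, 38.3]  # classic microstutter pattern

# Target an even presentation interval (the average frame time).
interval = (completion_ms[-1] - completion_ms[0]) / (len(completion_ms) - 1)

next_slot = completion_ms[0]
for done in completion_ms:
    shown = max(done, next_slot)  # a frame can't be shown before it's rendered
    print(f"rendered {done:5.1f} ms, shown {shown:5.1f} ms (+{shown - done:4.1f} ms lag)")
    next_slot = shown + interval
```

In this toy run the 5/11.7 ms gaps come out as roughly even ~7.7-9.0 ms gaps, with a couple of milliseconds of added lag on the early frames; a real one-frame buffer would cost up to a full frame time.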
March 7, 2012 1:32:00 AM

Yes, exactly, that is what input lag is :p

I guess I should've elaborated on my particular setup. I'm running a single GTX 570 on a BenQ XL2410T 120 Hz monitor. I love it for the smooth 120 Hz, which I absolutely notice and would never give up for 60 Hz, and as far as I can tell this is the most input-lag-free monitor I have ever had. The difference between this and my old 60 Hz BenQ is night and day. I play Battlefield obsessively, so every ounce of reaction time I can muster helps. There's a reason I won't run a 60 Hz monitor anymore, or use an IPS LCD: reaction time is the most important factor.

That one frame for the Nvidia buffer, I didn't even know existed. I was talking about the one frame of input lag added when a second GPU is used with Alternate Frame Rendering. That's great info! I didn't even know Nvidia did this; glad I asked, thank you! This definitely explains why Crossfire microstutters more than SLI, from what I've been researching.

I've been thinking of trying SLI, but I'd be pretty upset if I bought a new motherboard/PSU, went through the whole headache of reinstalling everything, spent $1,000 on some nice new Kepler GPUs (or just another 570), and then noticed the one frame of lag from AFR plus this other frame from the Nvidia buffer.

I'm wondering if anybody actually notices it with their SLI setup. Maybe they don't normally, but would if they compared going from one GPU to two, back and forth.

The way I figure it, it's even less noticeable on a 120 Hz monitor: running at 120 fps or higher, that one frame is about 8.3 ms, half as much input lag as on a 60 Hz monitor, where it'd be about 16.7 ms. Does that sound right?

I found a good article that tested quad SLI/Crossfire:
http://www.pcgameshardware.com/aid,675353/GPU-benchmark...

"Quad-Crossfire gegen Quad-SLI: Input lag
In addition to the mentioned problems there also is an increased input lag, which is related to the behavior of the Alternate Frame Renderings (AFR). Quad GPU systems calculate 7 frames in advance before they are displayed - but the limit of Vista is 4 frames thus 3 frames have to "wait”. Therefore the input lag becomes more obvious when the frame rate is dropping. Because of the extreme frame distribution there is an additional lag since during the 80 milliseconds pause (in case of Stalker Clear Sky) mouse movements are not put into action.

Compared to a single GPU card three or two coupled graphics processor might have a grubby frame distribution and a slight input lag but it depends on you if you are annoyed by them. Quad SLI/CF on the other hand disqualifies itself in everyday gaming; the extreme micro stuttering and the heavy input lag are absolutely inacceptable given the acquisition costs of such a system. "


It seems like on AFR's cons list, each GPU added adds one frame of input lag. So, going with what you said about SLI adding one frame to combat microstutter, do we now add two frames for a dual-GPU SLI setup: one frame as a buffer and another from the AFR limitation?

That review states that quad SLI suffers from 4 frames of input lag. I would definitely notice that and would consider my first-person shooters unplayable.
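Taking the one-frame-per-queued-frame reading of the article at face value (an assumption for illustration, not a measured result), the scaling works out like this:

```python
# Back-of-the-envelope model (an assumption based on the article's
# description, not a measurement): each frame queued ahead under AFR
# adds roughly one frame time of input lag.

def afr_added_lag_ms(frames_ahead: int, fps: float) -> float:
    """Approximate extra input lag from AFR queueing, in milliseconds."""
    return frames_ahead * 1000.0 / fps

for frames_ahead in (1, 2, 4):        # dual GPU, dual + buffer, quad SLI
    for fps in (60.0, 120.0):
        lag = afr_added_lag_ms(frames_ahead, fps)
        print(f"{frames_ahead} frame(s) ahead at {fps:.0f} fps: {lag:5.1f} ms")

# At 60 fps, 4 frames ahead is ~66.7 ms -- the same order as the ~80 ms
# pauses the article describes in Stalker: Clear Sky.
```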

Best solution

March 7, 2012 2:17:35 AM

lmulder said:
[...] do we now add two frames for a dual-GPU SLI setup: one frame as a buffer and another from the AFR limitation?


I've never personally noticed any lag with CF (aside from a CPU bottleneck). If it really is something that worries you and you're looking for more performance, I'd suggest moving to a 680 or a 7970. If you look up the benchmarks when the 680 is available and it isn't acceptable, then your only option is a dual or triple config (I've heard tri-fire is less prone to microstutter). If you can, I'd try a friend's setup before drawing a conclusion, because for the wide majority one or two frames of input lag isn't a big deal unless you play shooters competitively or for money. I personally play for fun, so my opinion may differ from others'.
March 7, 2012 2:29:06 AM

Yeah, cool. Good info on the SLI Nvidia buffer frame.

I think I'm going to wait for the 680; I'm really interested in that card. However, there are no performance numbers yet, so we'll see.

I think that's why there isn't much info on this: one frame isn't a big deal to the majority of players, and most probably can't notice it. I've heard that competitive players who travel for tournaments stay away from SLI, and I guess I can see why.

Well, if you say it's not noticeable, and there don't seem to be many people out there who say it is, it might be worth a shot if the 680 isn't up to par.

Battlefield 3 at 120 Hz with my settings would be amazing. I can get 120+ fps in BC2 and Left 4 Dead 2 with great settings, and I'd love to conquer BF3 too. It currently gets 80-100+, with slight dips to 60 in heavy areas.
March 16, 2012 11:55:41 PM

Best answer selected by lmulder.