I know this topic has been discussed ad nauseam on the internet, but I cannot seem to find a clear answer to this specific question:
1) SLI/CF gives roughly 2x the FPS
2) Micro-stutter is usually only pronounced when FPS is < 60 (monitor refresh rate?)
Does it not then follow that micro-stutter is only an issue in games where a single GPU of the same type would give 30 FPS or worse?
For instance, if a single 570 gives me 40 FPS in "Game A" but two 570s in SLI give 80 FPS, micro-stutter would not occur? But if "Game A: Part II" runs at only 25 FPS on a single 570 and 50 FPS with SLI 570s, then micro-stutter may occur?
In a nutshell, is micro-stutter only an issue when the FPS on a single GPU of the same type would be so low that the framerate would be troublesome anyway?
I ask because I'm trying to decide between SLI 570s/CF 6970s and a single 580, and whether I should spec for a possible SLI 580 upgrade in the future, which affects motherboard and PSU selection.
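For what it's worth, the catch is in premise 2): micro-stutter is a property of frame-time *spacing*, not of average FPS, so doubling the average doesn't automatically double the smoothness. A minimal sketch with made-up numbers (the 5 ms/20 ms split is a hypothetical alternate-frame-rendering pattern, not measured data) shows how two setups with the same average FPS can feel very different:

```python
# Made-up illustration: why average FPS can hide micro-stutter.
# Two GPUs rendering alternate frames can deliver them in uneven pairs.
even = [12.5] * 8          # single GPU: steady 12.5 ms frames -> 80 FPS
uneven = [5.0, 20.0] * 4   # hypothetical SLI/CF pairing: same 80 FPS average

def avg_fps(frame_times_ms):
    """Average FPS over the whole sequence."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def worst_case_fps(frame_times_ms):
    """Smoothness is bounded by the longest gap between frames, not the average."""
    return 1000.0 / max(frame_times_ms)

print(avg_fps(even), worst_case_fps(even))      # 80.0 80.0
print(avg_fps(uneven), worst_case_fps(uneven))  # 80.0 50.0
```

Both sequences average 80 FPS, but the uneven one never feels smoother than 50 FPS, because perceived smoothness tracks the longest gaps. That's why "SLI gets me above 60 FPS" doesn't by itself rule micro-stutter out.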
Micro-stutter is a myth. I have had two different SLI setups (560 Tis and 580s) and I have never noticed any stuttering in games. What I did notice was much higher performance than single-card solutions...
I don't believe it's a myth. I think there is reasonable evidence that it's real and that it affects everyone with SLI/CF to some degree. It's good that you have not experienced it, to your knowledge, but I was hoping for more than anecdotal evidence. That's why I felt my question had not really been answered after much searching.
I'm just trying to get a handle on when it might be a problem, and under what circumstances, to help me decide what to buy.
I don't mean to discount your knowledge. But maybe the dual cards that stomp the games you play today will, a year from now, be so taxed that your FPS drops low enough to make micro-stuttering evident?
Although this thread is about a month old, I figured I should still reply, because I'm thinking about getting either 560 Ti SLI or 570 SLI myself and have done quite a bit of research on the issue of micro-stuttering. It seems a fair number of people are complaining about it, but I think it's exaggerated.
Just look at the videos yourself: http://www.youtube.com/watch?v=vY8uppkjwL0
With this one, I think you would have to be quite blind not to notice the micro-stutter (which, by the way, obviously does not occur only at low framerates - many people say low framerates actually hide it...).
But that is quad SLI... another thing people have noticed: the more cards, the more stuttering.
Now take a look at this (unfortunately, I found no professionally recorded gameplay video of 560 SLI or 570 SLI, so one has to put up with camcorder quality... but short of professional video capture, filming the screen is obviously the best way to get an unaltered impression; anything recorded on the same machine with Fraps is worthless for framerate/stutter analysis): http://www.youtube.com/watch?v=xH74CSXvVbQ
If you still see stuttering there (obviously aside from the framerate sometimes dropping a little because of general performance), then you either have better eyes than me or you're imagining things.
Yes - here one can still see stuttering, but it's nowhere near as bad as with quad SLI, and the Metro gameplay demonstrates that it's not noticeable in-game, because the camera won't move smoothly along a path while you play. And during cinematics, it's bound to be so slight that I, at least, probably wouldn't be bothered by it. I still have to think about this myself, but at any rate, as I said at the beginning, it seems like a vastly exaggerated issue.
Thinking a bit about this... it is really unfortunate how weak today's graphics cards are, especially considering the price. If you want to run, e.g., Metro 2033 or The Witcher 2 maxed out, you have to have an SLI setup (the 590 obviously counts as SLI).
Sure, one can argue "well, just turn off XYZ, it doesn't make much of a difference anyway" - but then why complain at the same time about consoles holding PC games back, if you have to turn down detail anyway? Granted, that's just the case with a couple of games... there should be many more... Still, I find it very disappointing that there is simply no way to play the current most demanding games maxed out without even a slight stutter, and that even when you shell out a couple of hundred bucks, the only absolutely stutter-free experience you can get is a 580 with details turned down quite a bit.
That disappointment is only reinforced when I think about a graph of the development of graphics power I recently saw, where all the indicators, like fill rate, polygon throughput, etc., rose at roughly a quadratic rate until about three years ago, and since then the increase has been merely roughly linear.
And of course also by the delay of Kepler. I don't want to wait until the middle of 2012 to play Metro 2033 and The Witcher 2 "the way it's meant to be played".
I think it's pretty sad when a single GTX 580 won't even max out BF3. Some people are going to dump tons of cash into a new rig just to play that game. Myself, I'm probably going to give the new CoD MW3 a go, seeing as I would rather play a game on an older engine and be able to play it smoothly than worry about turning everything down to low just to get by.