First impressions of SLI: pretty crappy

Last response: in Graphics & Displays
March 5, 2012 11:54:28 PM

I just got my second GTX 560 TI in today. It was pretty easy to install and showed up immediately.

First thing I did was open up Crysis. Yep, 2007's Crysis. I thought, "Let's max this bitch out."

Nerp. Didn't happen. I had to lower the settings to high with no AA just to get the standard 60fps. Now I know it's a pretty demanding game, that's why I tested it, but the improvement in performance wasn't a great leap over the single card. The game's 5 years old.

I opened up BF3. Started out in ultra, and bumped the settings down to a mix of med, high and ultra. Even then the fps were sporadic at best and would sit at 60, then on a quick turn they'd drop to somewhere in the 30's. Ended up only having slightly higher results than with only 1 card.

Oh yeah, and the GPUs are twice as loud now... and noticeable during games. I've been running a few benchmarks, 3DMark11 and Furmark. Both show almost a 2x increase in performance (as far as numbers go) but the temps were pretty nuts. I've never seen my single card go over 84° C. As soon as I launched Furmark I heard a cricket-like noise that lasted during the entire 15 minute test. By the end, GPU 1 touched 96° C and 2 sat in the 80's. Global V-sync's on so I highly doubt the sound was coming from the PSU.

So yeah, not very impressed. I installed this thing like it was Christmas, but now it's just... whatever. I dropped another $250 on this one, and I refuse to be one of those people who RMA's something when there's nothing wrong with it.

So if I could do it all again would I just get a 580? Definitely.
March 6, 2012 12:05:41 AM

Sounds like you need some more case ventilation.

Also, disable vsync... dunno why you're surprised at random frame drops into the 30s when you have vsync enabled.
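
The vsync point is worth spelling out. A toy model (Python; an idealized sketch of double-buffered vsync, not any vendor's exact implementation): on a 60 Hz display, a frame that misses the ~16.7 ms refresh deadline waits for the next refresh, so the displayed rate snaps down to 60/n fps.

```python
import math

# Idealized double-buffered vsync on a 60 Hz display: a frame that misses
# the ~16.7 ms refresh deadline waits for the next refresh, so the displayed
# rate snaps down to 60, 30, 20, 15... fps (60/n for whole n).
REFRESH_HZ = 60

def vsync_fps(render_fps):
    """Displayed fps for a given raw render rate under vsync (toy model)."""
    refreshes_per_frame = math.ceil(REFRESH_HZ / render_fps - 1e-9)
    return REFRESH_HZ / refreshes_per_frame

for raw in (75, 60, 59, 45, 31):
    print(raw, "->", vsync_fps(raw))   # 59 and 45 raw fps both display as 30
```

Which is why a rig that renders 59fps raw shows up as a "drop into the 30s" the moment vsync is on.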

Annnnnnnnnd yeah I dunno if there's anything that gets a rock solid 60fps in Crysis at 1080p with 4xAA or higher... maybe GTX 580 SLI or 7950/7970 crossfire would do it, but 560ti SLI is nowhere near enough GPU muscle to achieve minimum framerates of 60 in that game... it's one of the worst optimized games in existence on top of being very demanding.
March 6, 2012 12:10:16 AM

BigMack70 said:
Sounds like you need some more case ventilation.

Also, disable vsync... dunno why you're surprised at random frame drops into the 30s when you have vsync enabled.

Annnnnnnnnd yeah I dunno if there's anything that gets a rock solid 60fps in Crysis at 1080p with 4xAA or higher... maybe GTX 580 SLI or 7950/7970 crossfire would do it, but 560ti SLI is nowhere near enough GPU muscle to achieve minimum framerates of 60 in that game... it's one of the worst optimized games in existence on top of being very demanding.



i5 2500k. Can't see it bottlenecking with that.

It's a mATX board, so I can see that being a problem with the ventilation. However, my case does have a mesh front and side panel, and I've counted up a total of 13 fans including the GPUs and PSU.
March 6, 2012 12:15:33 AM

Yeah, I checked your config after posting - just wanted to make sure you weren't running SLI on something like a 3 GHz Athlon II or something.

Anyways, your expectations were way too high. To get a 60fps minimum, average fps needs to be 80-120 depending on the game, and 560ti SLI isn't enough to do that in the most demanding games. And Crysis is optimized like such garbage that I don't know if we'll ever see a setup that can make it through the whole game with a 60fps minimum.

Do you have 120mm front intake, side intake, rear exhaust, and top exhaust case fans?
March 6, 2012 12:20:01 AM

Can ya match this:

http://www.guru3d.com/article/geforce-gtx-560-ti-sli-re...

If not, sounds like problem is specific to your box.

My son is running twin factory overclocked (900MHz) 560 Ti's (2600k @ 4.8GHz, CAS 7 DDR3-1600) that we OC'd to 1020 MHz and then later dropped down to 980 Mhz to keep temps below 80C w/ all case fans at slowest possible speed. Having machine run silent was a higher priority than those last few fps. Case is a DF-85 w/ CP-850

Ya didn't say which 560 Ti's ya used, but given the $250 price, I hope they were factory overclocked jobs with HUGE coolers and beefed up VRMs.... 900MHz ones w/ 7 phase VRMs are generally about $205 - $230 and they will run a lot quieter and cooler than designs based upon the reference PCB.

He plays BF3 on Ultra settings in single player mode, runs smooth. Multi-player mode he drops to High
March 6, 2012 12:24:02 AM

Rather than saying Crysis was optimized like garbage or horribly optimized, you could just as easily say the game was never meant to be played with every setting maxed. The game just put everything available at the time in there and let us adjust the settings to self-optimize the game.

I realize that a lot of people feel that all games are meant to be maxed, but that's not true. One thing I've seen more of lately is games that have all the settings that Crysis has, if not more, which in no way shape or form could be maxed, but they have predefined setups that will say low, med, high, ultra (Rift is a good example). This seems to satisfy the people who feel all games should be maxed, or it's garbage, yet still allow people to tweak into the unreasonable.
March 6, 2012 12:24:51 AM

For Crysis, did you run GPU-Z to check whether the GPU usage was actually hitting 100%, or if one of the CPU cores was acting as a bottleneck?


With benchmarks, modern cards will show an 80-90% performance boost from SLI, but in real world gaming it goes down to around a 60-70% boost, with a few heavily optimized games getting over 80% extra performance.
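
Those scaling percentages translate into frame rates like this (a back-of-envelope sketch; the 40fps single-card baseline is a made-up example, not a measured figure):

```python
# Back-of-envelope SLI scaling: synthetic benchmarks often show ~80-90%
# gains from a second card, real-world games more like 60-70%. The
# single-card baseline below is a made-up example.
def sli_fps(single_fps, scaling):
    """Expected fps with a second card at the given scaling fraction."""
    return single_fps * (1.0 + scaling)

single = 40.0                                # hypothetical single-card average
print(round(sli_fps(single, 0.85), 1))       # benchmark-style scaling: 74.0
print(round(sli_fps(single, 0.65), 1))       # typical in-game scaling: 66.0
```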

PS for SLI, you need a front case fan with airflow aimed directly at the video cards. You need a rush of air blowing between the cards, as the backs of the video cards also get very hot and will cause the GPU whose vent sits very close to the other card to suck in pre-heated air.

To fix this, make sure you have a case fan blowing air directly at/between the cards. (If you have a vent for a side panel fan and it is in a position to blow on both cards and get some air in between them, then that will also be good.)


PS the i5 2500k does bottleneck if not overclocked

PS With case fans it is very important to avoid air streams fighting each other, eg 2 perpendicular fans will have greatly reduced performance past the area where the streams intersect.

Which is why with many cases, people try to have a single overall direction of airflow, eg most of the air coming in to the front and leaving from the back or the top, or some of both, or most of the air coming in from the side and leaving through the top and back.
March 6, 2012 12:25:02 AM

Quote:
Can ya match this:

http://www.guru3d.com/article/gefo [...] i-review/7

If not, sounds like problem is specific to your box.


exactly... maxed out, 560ti SLI doesn't even get you 60fps average in maxed out 1080p Crysis

It's playable and pretty smooth but nowhere near being able to just vsync it and sit at 60fps for the whole game.
March 6, 2012 12:26:00 AM

Hey bro, guess what? I have the same issue, except I am using an AMD FX 6100. And the funny thing is, BF3 is the only game I've tried, but I get the same scenario. Shoot, I have even set everything to low, but it still takes a dump right into the 30's, even with 1 GTX 560 Ti. I won't be doing SLI. Obviously it has some issues with that new driver. I am gonna get a Kepler card when they are released, but my buddy told me it has something to do with the Texture Fill Rate.
March 6, 2012 12:26:23 AM

bystander said:
Rather than saying Crysis was optimized like garbage or horribly optimized, you could just as easily say the game was never meant to be played with every setting maxed. The game just put everything available at the time in there and let us adjust the settings to self-optimize the game.

I realize that a lot of people feel that all games are meant to be maxed, but that's not true. One thing I've seen more of lately is games that have all the settings that Crysis has, if not more, which in no way shape or form could be maxed, but they have predefined setups that will say low, med, high, ultra (Rift is a good example). This seems to satisfy the people who feel all games should be maxed, or it's garbage, yet still allow people to tweak into the unreasonable.


Crysis has an insane memory leak during the last level/boss where my FPS would nosedive from 50 to about 4 and then back, constantly. It's not a well optimized game, at all. Why bother defending it? This is pretty common knowledge...
March 6, 2012 12:31:29 AM

flossbandit said:
I just got my second GTX 560 TI in today. It was pretty easy to install and showed up immediately.

First thing I did was open up Crysis. Yep, 2007's Crysis. I thought, "Let's max this bitch out."

Nerp. Didn't happen. I had to lower the settings to high with no AA just to get the standard 60fps. Now I know it's a pretty demanding game, that's why I tested it, but the improvement in performance wasn't a great leap over the single card. The game's 5 years old.

I opened up BF3. Started out in ultra, and bumped the settings down to a mix of med, high and ultra. Even then the fps were sporadic at best and would sit at 60, then on a quick turn they'd drop to somewhere in the 30's. Ended up only having slightly higher results than with only 1 card.

Oh yeah, and the GPUs are twice as loud now... and noticeable during games. I've been running a few benchmarks, 3DMark11 and Furmark. Both show almost a 2x increase in performance (as far as numbers go) but the temps were pretty nuts. I've never seen my single card go over 84° C. As soon as I launched Furmark I heard a cricket-like noise that lasted during the entire 15 minute test. By the end, GPU 1 touched 96° C and 2 sat in the 80's. Global V-sync's on so I highly doubt the sound was coming from the PSU.

So yeah, not very impressed. I installed this thing like it was Christmas, but now it's just... whatever. I dropped another $250 on this one, and I refuse to be one of those people who RMA's something when there's nothing wrong with it.

So if I could do it all again would I just get a 580? Definitely.



I agree with flossbandit. SLI isn't enough of a gain in performance to justify the crazy cost. I have an SLI setup and am seeing performance that could be achieved with a single card. My advice... just like he said... get the 580 and be done.
March 6, 2012 12:33:05 AM

BigMack70 said:
Crysis has an insane memory leak during the last level/boss where my FPS would nosedive from 50 to about 4 and then back, constantly. It's not a well optimized game, at all. Why bother defending it? This is pretty common knowledge...


Just because people say it's badly optimized, doesn't mean they are right. There are plenty of people who view Crysis as I do too. It really is all about how you view things. I prefer to have the settings available to make a game look better over the years rather than having a game which will look good when it's released, but will look awful in a few years.

You seem to view an optimized game as a game that removes all high end settings that hurt performance too much. Then again, I think it was great to be "optimized like garbage", because it has allowed us to keep tweaking settings higher and higher for better and better visuals years after it was released.

That said, I am aware of the aircraft carrier memory leak. Actually, I believe there is a small memory leak throughout the game, it's just that the aircraft carrier was horrible. I would call that a bug rather than an optimization issue.
March 6, 2012 12:43:39 AM

w/e... there are games that look better than Crysis and run like twice as smoothly and have no memory leaks

IMO that's about the textbook definition of a game optimized like crap, but if you wanna sugarcoat that fact then go right ahead.
March 6, 2012 12:50:36 AM

BigMack70 said:
Sounds like you need some more case ventilation.

Also, disable vsync... dunno why you're surprised at random frame drops into the 30s when you have vsync enabled.

Annnnnnnnnd yeah I dunno if there's anything that gets a rock solid 60fps in Crysis at 1080p with 4xAA or higher... maybe GTX 580 SLI or 7950/7970 crossfire would do it, but 560ti SLI is nowhere near enough GPU muscle to achieve minimum framerates of 60 in that game... it's one of the worst optimized games in existence on top of being very demanding.


Like previous people have said, just cause you can't max it doesn't mean it's poorly optimized. Is BF3 badly optimized because a 7970 barely maxes it at 1080p?

BigMack70 said:
Quote:
Can ya match this:

http://www.guru3d.com/article/gefo [...] i-review/7

If not, sounds like problem is specific to your box.


exactly... maxed out, 560ti SLI doesn't even get you 60fps average in maxed out 1080p Crysis

It's playable and pretty smooth but nowhere near being able to just vsync it and sit at 60fps for the whole game.


Those fps are relative to maxing out Crysis. They had lower than maxed settings in their tests.

Rockdpm said:
Hey bro, guess what?, i have the same issue, except i am using a AMD FX 6100. and the funny thing is. its only BF3 i have tried but i get same case senario. shoot i have even set everything to low. but not takes a tump right in the 30's even with 1 GTX 560 ti. I won't be doing SLI. obviously it has some issues with that new driver. I am gonna get a Kepler card when they are released but my buddy told me it has to do something with the Texture Fill Rate.


That's because your CPU can't keep up.

BigMack70 said:
Crysis has an insane memory leak during the last level/boss where my FPS would nosedive from 50 to about 4 and then back, constantly. It's not a well optimized game, at all. Why bother defending it? This is pretty common knowledge...


It's not common knowledge. Idk where you got that idea.

March 6, 2012 12:56:02 AM

cbrunnem said:
Like previous people have said, just cause you can't max it doesn't mean it's poorly optimized. Is BF3 badly optimized because a 7970 barely maxes it at 1080p?

Those fps are relative to maxing out Crysis. They had lower than maxed settings in their tests.

It's not common knowledge. Idk where you got that idea.


You are right... apparently there's enough people here who want to defend Crysis that it is not common knowledge. Lots of people apparently think that a 5 year old game that no longer looks state-of-the-art (though admittedly, still pretty close) and requires extremely high-end modern GPU setups to max out smoothly AND has nasty memory leaks is well optimized.

Anyways, I believe that Enthusiast is the highest setting available in Crysis Warhead, and they did test that (with 2xAA) down at the bottom. You'll note that this is what I was referring to when I said the 560ti SLI doesn't average 60fps: the only test where it fails to average at least that is the Enthusiast-settings one down at the bottom.

If BF3 looked a little bit worse than it does, was released 5 years ago, and wasn't able to be maxed by a 7970 then YES - it would be badly optimized. However, BF3 looks better than Crysis, is new, and is maxable by a single 7970 at higher fps than Crysis is.
March 6, 2012 12:59:40 AM

JackNaylorPE said:
Can ya match this:

http://www.guru3d.com/article/geforce-gtx-560-ti-sli-re...

If not, sounds like problem is specific to your box.

My son is running twin factory overclocked (900MHz) 560 Ti's (2600k @ 4.8GHz, CAS 7 DDR3-1600) that we OC'd to 1020 MHz and then later dropped down to 980 Mhz to keep temps below 80C w/ all case fans at slowest possible speed. Having machine run silent was a higher priority than those last few fps. Case is a DF-85 w/ CP-850

Ya didn't say which 560 Ti's ya used, but given the $250 price, I hope they were factory overclocked jobs with HUGE coolers and beefed up VRMs.... 900MHz ones w/ 7 phase VRMs are generally about $205 - $230 and they will run a lot quieter and cooler than designs based upon the reference PCB.

He plays BF3 on Ultra settings in single player mode, runs smooth. Multi-player mode he drops to High


two of these:

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

I made a thread before I actually ordered it, asking whether I should get the single fan version or another one with 2 like this. I was told it wouldn't make any difference, but now that I actually see/hear it, it may have been a better choice to use the single fan model as the top card because there was more ventilation out of the back rather than down into the case.

Here's the other one I was looking at:

http://www.newegg.com/Product/Product.aspx?Item=N82E168...
March 6, 2012 1:00:09 AM

BigMack70 said:
w/e... there are games that look better than Crysis and run like twice as smoothly and have no memory leaks

IMO that's about the textbook definition of a game optimized like crap, but if you wanna sugarcoat that fact then go right ahead.


Yes, there are, but it took several years before that happened, and you still see a lot of people who believe Crysis looks better than anything else (I don't necessarily agree, though). They were the first to do many of the things they did, and others took those ideas and improved them. I'm not going to tell the Wright brothers that their planes were crap; they brought a new technology to the forefront and people after them improved it.

I personally wish Crytek had done the same thing with Crysis 2 as they did with Far Cry and Crysis, but unfortunately too many people kept being upset that they couldn't max those games at release. Instead we got a good game out of Crysis 2, but it is not cutting edge; at least they released the DX11 patch, which is kind of a compromise.
March 6, 2012 1:03:45 AM

I'm not picking on Crysis for being un-maxable in 2007... that doesn't really bear on a discussion of optimization one way or another

I'm picking on Crysis for not running nearly as well as better looking games 5 years later in 2012
March 6, 2012 1:53:38 AM

BigMack70 said:
I'm not picking on Crysis for being un-maxable in 2007... that doesn't really bear on a discussion of optimization one way or another

I'm picking on Crysis for not running nearly as well as better looking games 5 years later in 2012


In many ways, Crysis is still the best looking game. I find it rather weird, cutting on a game for not performing/looking as good as games made 5 years after it. They had no idea what hardware could do today, and they also pioneered a lot of techniques which have since been improved on. Like I said earlier, it's like calling the Wright brothers' planes crap because today's are much better.
March 6, 2012 3:16:02 PM

BigMack70 said:
I'm not picking on Crysis for being un-maxable in 2007... that doesn't really bear on a discussion of optimization one way or another

I'm picking on Crysis for not running nearly as well as better looking games 5 years later in 2012


Just to throw this out there, but new cards (7000 series) can't play DX9 games worth a ***, so it might not just be the game.
March 6, 2012 3:44:51 PM

point taken but maxed Crysis is dx10 :p 
March 6, 2012 4:04:28 PM

BigMack70 said:
point taken but maxed Crysis is dx10 :p 


What's the difference between DX10 and 9....? As far as I know it's essentially the same thing but with some updates.

But the point still stands regardless, in my opinion.

Also, the BF3 developers could have made the objects and textures have twice the polys and pixels, and then no card on the market could max the game. At that point, is BF3 a badly optimized game? The point I guess I'm getting at is no one knows why it's hard to max. A lot think it's because of how high the highest graphics settings are, and others think it's because of poor optimization. I believe the first.
March 6, 2012 8:22:15 PM

The biggest thing that hurts Crysis performance, beyond all the new shadowing effects is the jungle. That game has a huge clipping range with thousands of trees to render. While new games often meet the appearance of Crysis, none have attempted to render so much at once. I believe the jungle is also why many people still think it's one of the best if not the best looking game today.
March 6, 2012 8:44:33 PM

FlossBandit, did you notice any perceivable input lag going from one card to SLI? Any delay in mouse/keyboard input, even ever so slightly?
March 6, 2012 9:06:14 PM

The best way to play BF3 with the 560ti SLI setup, for me, is to set everything to Ultra and turn off motion blur and AA; the game runs really well and I get good fps, and I'm running an older system than yours.
Have you got the latest drivers from Nvidia?
Have you set your gaming profile in the Nvidia Control Panel to BF3?
Are you running MSI Afterburner to adjust your fan speed to reduce the temps?
Can you fit any more fans to your case?
Turn off V-sync?
March 6, 2012 10:07:41 PM

bystander said:
The biggest thing that hurts Crysis performance, beyond all the new shadowing effects is the jungle. That game has a huge clipping range with thousands of trees to render. While new games often meet the appearance of Crysis, none have attempted to render so much at once. I believe the jungle is also why many people still think it's one of the best if not the best looking game today.


Actually on my playthrough it was NPC enemies that had the biggest effect on performance (outside of memory leaks)... just displaying the jungle resulted in 50-70fps but if there were 1-2 dozen enemies loaded in the area the fps could sit at times around 30
March 6, 2012 10:11:32 PM

I can't tell what is wrong with your system, but what I do know is that 2 560 Ti's are better than a 580.
March 6, 2012 10:12:44 PM

ihatehismap said:
I can't tell what is wrong with your system, but what I do know is that 2 560 Ti's are better than a 580.


nothing wrong with his system... that's exactly how 2 560ti's should perform (especially if he has vsync on)
March 6, 2012 10:36:24 PM

Crysis at max settings was never designed to be optimized. They didn't even bother to make it runnable, so it was just there for show. I'm not sure why people still think Crysis is optimized when it was known to be a messy game from the start. It looked good and did a lot of things with the engine, but the implementations were all just there for show.

Crytek basically did the same with Crysis 2, with an absurd amount of tessellation and an above-ground ocean to make the game harder to run without giving better quality.
March 6, 2012 11:23:13 PM

BigMack70 said:
Actually on my playthrough it was NPC enemies that had the biggest effect on performance (outside of memory leaks)... just displaying the jungle resulted in 50-70fps but if there were 1-2 dozen enemies loaded in the area the fps could sit at times around 30


For me, anytime I looked across large landscapes, my FPS would drop. Like when you first come up to the bay at the start, and look across the water, FPS always takes a nose dive.
March 7, 2012 11:38:02 AM

lmulder said:
FlossBandit, did you notice any perceivable input lag going from one card to sli? Any delay in mouse/keyboard input, even ever so slighty?


Nope. None whatsoever.

monsta said:
The best way to play BF3 with the 560ti SLI setup, for me, is to set everything to Ultra and turn off motion blur and AA; the game runs really well and I get good fps, and I'm running an older system than yours.
Have you got the latest drivers from Nvidia?
Have you set your gaming profile in the Nvidia Control Panel to BF3?
Are you running MSI Afterburner to adjust your fan speed to reduce the temps?
Can you fit any more fans to your case?
Turn off V-sync?


1) Yep.
2) Nope, it's all default, which shouldn't be any higher than the in-game settings.
3) Afterburner? No, I just let the fans automatically do their thing. Maybe I should look into this.
4) Perhaps. I'd have to mount it in unconventional ways though, and another one would be number 14-plus.
5) V-sync's on. I'll probably keep it that way unless I see some significant performance improvement or the temps drop a whole lot. I really don't understand why v-sync gets so much hate from a lot of gamers. It makes most games look way better on a quick turn: a lot less tearing and fewer vertical lines in the picture.
March 7, 2012 3:48:00 PM

Even with a single card in BF3, I notice that the game introduces input lag and just gets sloppy with both AA and AF enabled. Fairly unacceptable, in my opinion, for a top tier game. Turn those off and it's smooth as butter. The AA is just bad imo compared to other games where it runs great; same with AF, but not as bad as AA.
March 7, 2012 4:02:35 PM

I dunno if you ever noticed the article Tomshardware released all about microstuttering? http://www.tomshardware.com/gallery/1xHD6870X2,0101-300...

The article explains how two GPUs working together do essentially double the average frame rates. It also goes on to detail that this is accomplished by having extremely high FPS peaks, and extremely low FPS dips, constantly!

If you look at the graph below, it shows two cards working together on top, and one single card by itself on the bottom. As you can clearly see, the minimum FPS of two cards together is very close to the constant FPS produced by just one card. In essence the game is CONSTANTLY zooming between the speed of one single card running the game, and two cards running at double speed.

Have you ever been in a car where someone hits the brakes, then the gas, and then the brakes? Yeah it's like that, fast then slow then fast then slow, over and over the entire time you play the game. You might get used to it if it was just on fast OR slow, but it is going to constantly swing between fast and slow, making it as noticeable as possible. This is referred to as micro-stuttering.

It's the dirty little secret of SLI/crossfire, that you're only going to see a marginal improvement in the best case scenario in game, and potentially a big drop in perceived quality due to this constant fast/slow cycle. In many games it almost seems like something is rhythmically lagging a bit every couple seconds, but nothing is and you will go crazy trying to smooth it out.
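
That brake/gas pattern can be put into numbers. A toy model (Python; the frame times below are invented for illustration, not measurements from any card) shows how an fps counter averages the stutter away:

```python
# Toy model of micro-stuttering: frames alternate between a fast frame and
# a slow frame. The average fps looks healthy, but every other frame is
# paced like a much slower card. Frame times here are invented examples.
fast_ms, slow_ms = 8.0, 25.0          # alternating frame times in milliseconds
frames = [fast_ms, slow_ms] * 50      # 100 frames of fast/slow/fast/slow...

avg_frame_ms = sum(frames) / len(frames)
avg_fps = 1000.0 / avg_frame_ms       # what an fps counter reports
slow_fps = 1000.0 / slow_ms           # what the slow frames actually feel like

print(round(avg_fps, 1))   # 60.6 -- looks like a solid 60fps average
print(round(slow_fps, 1))  # 40.0 -- but half the frames arrive at 40fps pacing
```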

March 7, 2012 5:07:25 PM

Exactly, I can elaborate on that as well.

http://techreport.com/articles.x/21516/8

Tech Report includes 99th percentile frame times in milliseconds, which, unlike just an fps chart, really gives you an indication of how fast the game actually feels.

For SLI/Crossfire, fps charts aren't accurate for that exact reason. If a game is constantly hitting high peaks at say 160fps and low peaks at 120fps, the average will read 140fps, however the game will feel more like 120fps. You're losing a good number of fps off the already 70-80% scaling of a well scaling game.

Also, microstuttering may not feel as smooth as a single card.
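
For reference, that percentile metric can be sketched in a few lines (a simplified take on a 99th percentile calculation; the sample frame times are invented):

```python
# Simplified 99th-percentile frame time, in the spirit of the Tech Report
# metric: 99% of frames complete at or under this time, so one bad hitch
# per hundred frames shows up even when the average fps looks fine.
def percentile_frame_time(frame_times_ms, pct=99):
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

# Invented sample: 99 smooth ~10 ms frames plus a single 50 ms hitch.
samples = [10.0] * 99 + [50.0]
p99 = percentile_frame_time(samples)
print(p99, "ms ->", round(1000.0 / p99, 1), "fps equivalent")
```

The average of that sample is still close to 100fps, but the percentile exposes the 20fps-equivalent hitch.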
March 7, 2012 5:08:09 PM

Quote:
Been there, done that also...
I will never SLI again, better to buy a better card upfront.
SLI and Crossfire = microstuttering, noise, crappy late drivers and huge power bill.


Can you elaborate on what you didn't like about it mostly? Was the microstuttering too noticeable and not as smooth as a single card? Did you experience any drastic or noticeable input lag with sli?
March 7, 2012 5:27:15 PM

did you notice any added input lag with your sli compared to one card?
March 7, 2012 6:27:20 PM

I ran this test almost a year ago:


At 2xMSAA the game runs quite smoothly for me. And 560 Ti's are faster than my OCed 5850s.

Also with BF3, I avg usually around mid 70s in the original maps at Ultra 2xMSAA, but the B2K maps seem to hog more memory and I get that slight pause/stutter when turning around. Dropping to High textures mostly fixed that, at least fixed it enough for me.
March 8, 2012 1:06:52 AM

subcutaneous said:
I dunno if you ever noticed the article Tomshardware released all about microstuttering? http://www.tomshardware.com/gallery/1xHD6870X2,0101-300...

The article explains how two GPUs working together do essentially double the average frame rates. It also goes on to detail that this is accomplished by having extremely high FPS peaks, and extremely low FPS dips, constantly!

If you look at the graph below, it shows two cards working together on top, and one single card by itself on the bottom. As you can clearly see, the minimum FPS of two cards together is very close to the constant FPS produced by just one card. In essence the game is CONSTANTLY zooming between the speed of one single card running the game, and two cards running at double speed.

Have you ever been in a car where someone hits the brakes, then the gas, and then the brakes? Yeah it's like that, fast then slow then fast then slow, over and over the entire time you play the game. You might get used to it if it was just on fast OR slow, but it is going to constantly swing between fast and slow, making it as noticeable as possible. This is referred to as micro-stuttering.

It's the dirty little secret of SLI/crossfire, that you're only going to see a marginal improvement in the best case scenario in game, and potentially a big drop in perceived quality due to this constant fast/slow cycle. In many games it almost seems like something is rhythmically lagging a bit every couple seconds, but nothing is and you will go crazy trying to smooth it out.

http://media.bestofmicro.com/W/F/300543/original/1xHD6870X2.png


If you read further, it showed that the 6950 and 6990 in crossfire had far fewer issues with this. Also, oddly enough, 6870 trifire (which requires a special x2 card for it to work) also had far fewer issues.
March 8, 2012 3:33:16 AM

wolfram23 said:
I ran this test almost a year ago:
http://i746.photobucket.com/albums/xx103/Wolfram23/5850%20single%20vs%20CF/Crysis.jpg

At 2xMSAA the game runs quite smoothly for me. And 560 Ti's are faster than my OCed 5850s.

Also with BF3, I avg usually around mid 70s in the original maps at Ultra 2xMSAA, but the B2K maps seem to hog more memory and I get that slight pause/stutter when turning around. Dropping to High textures mostly fixed that, at least fixed it enough for me.


I hate to say it, but those min fps have horrible scaling. When you look at the average fps as well, going from 32 to 52... sure that's actually decent scaling for CF. But if there's microstutter in crysis, that 52 may just feel more like 42. The microstutter depending on the game engine can take what could be 80% scaling and bring it down to what actually plays more like 40% scaling, for example.
March 8, 2012 2:07:34 PM

lmulder said:
I hate to say it, but those min fps have horrible scaling. When you look at the average fps as well, going from 32 to 52... sure that's actually decent scaling for CF. But if there's microstutter in crysis, that 52 may just feel more like 42. The microstutter depending on the game engine can take what could be 80% scaling and bring it down to what actually plays more like 40% scaling, for example.


Could be CPU related; I didn't bother testing at different CPU speeds. Or that "min" might have occurred in the first 5 seconds of the run where it's still loading everything, I don't know. Hard to say, as those results are very old.

I've never had micro stutter with my setup though. Micro stutter is entirely different from a "laggy spot" in a game where there's an FPS drop. Stuttering is something that happens consistently regardless of framerate (though it's more noticeable at low FPS) due to a difference in rendering times per frame. I get lag when I exceed my VRAM, for example. Like in Skyrim if I install a bunch of HD mods, the game will run smooth if I run straight but turning around causes a short hiccup in framerate while the memory gets swapped.

Basically I'm totally willing to deal with "lag spots" here and there as long as it's mostly high fps.

I did a lot of benchmarks, by the way, and the scaling was pretty good. Min didn't scale as well as max and avg, but again, that could be down to factors outside of GPU horsepower (VRAM limitations, CPU, or other bandwidth issues).


Keep in mind this was on the 10.5 drivers, and there have been improvements in many games since then.
March 8, 2012 2:13:13 PM

lmulder said:
I hate to say it, but those min fps have horrible scaling. When you look at the average fps as well, going from 32 to 52... sure that's actually decent scaling for CF. But if there's microstutter in crysis, that 52 may just feel more like 42. The microstutter depending on the game engine can take what could be 80% scaling and bring it down to what actually plays more like 40% scaling, for example.


You do have to be wary of making general statements about crossfire based on a single review. The one review where they analyzed micro-stuttering found it was only an issue with the 6870s in crossfire; with the 6990, 590 and 6870 trifire, it was almost gone. They speculated the issue was due to lower end cards, but it could have been the 6800 chip design too (the 6990 and 6950s are a different design).
April 5, 2012 9:10:36 PM

Almost gone is all a matter of perception. Mathematically, the inconsistency is there just as bad as it ever was no matter how fast your GPUs are; it's just that microstuttering at extreme FPS is significantly less noticeable than at lower FPS (low end CF/SLI rigs).

Once you notice microstuttering the first time, you will never be able to ignore it again, and when it happens in a game you like, it will annoy you to no end. I have turned off one GPU in some games just to smooth out the gameplay; even though the average frames per second was about half, the actual experience of playing the game (immersion) was much better.

To me personally, immersion is the single most important part of the PC gaming experience, and nothing ruins it like a jumpy, inconsistent display. Maybe it will only be severe or annoying in some of the games I play, but if I put down the money, I would rather have my hardware provide immersion in any game than an extra 20 maximum FPS.
April 5, 2012 9:37:17 PM

You obviously didn't read the article. It wasn't a matter of stutter being less noticeable at higher FPS; rather, with the higher end cards, the time difference between frames was nearly the same, or at least much closer to being equal. Sadly, the study was incomplete, as it didn't use a wide enough variety of cards to pinpoint exactly what changed the results.

EDIT: I'm not going to dig for the article for you, but the link to the graph above allows you to scroll to most of the graphics in the article. As you can see by it, the 6990 and 590 are much smoother.
April 6, 2012 1:05:01 AM

I know that with crossfire you cannot play in windowed mode. Not sure if SLI operates the same way, but if they behave the same, that might account for some of it.

Additionally, someone stated that BF3 is bottlenecked by the CPU, which is incorrect; according to Tom's, it is more GPU bound.

http://www.tomshardware.com/reviews/gaming-fx-pentium-a...
April 6, 2012 2:46:20 AM

bystander said:
You obviously didn't read the article as it wasn't a matter of being less noticeable because it was higher FPS, but rather, with the higher end cards, the time difference between frames was nearly the same, or rather, much closer to being equal. Sadly, the study was incomplete as it didn't use a wide enough variety of cards to pin point exactly what changed the results.

EDIT: I'm not going to dig for the article for you, but the link to the graph above allows you to scroll to most of the graphics in the article. As you can see by it, the 6990 and 590 are much smoother.
http://media.bestofmicro.com/amd-crossfire-nvidia-sli,Y-X-300633-3.png


So I just re-read the article. When I made that post I had not read it recently, so when I read your response I assumed that maybe I had in fact been mistaken.

Unfortunately for you, the article did not mention a thing about higher end cards experiencing this phenomenon to a lesser degree. It did mention that the min/max framerates on Nvidia SLI solutions seemed to be better (less variable) than ATI's. Feel free to link me to another article that studies it more in depth than Tom's Hardware did.

It also mentioned that the results differed quite a bit depending on the drivers. Some versions had been optimized for less stutter (at a lower average FPS) and some for maximum FPS (with more aggressive stutter; most ATI drivers lean this way).

I can see plainly that the 590 has much less min/max range than any of the other dual GPU setups, but this could easily be attributed to good (ideal) drivers, and honestly I would support SLI/CF 100% if they made all of their drivers in this fashion. And they in no way said that micro stuttering was not happening in that instance; even when framerate minimums are not suffering from the multiple card driver foolishness, the frame time differential is still quite evident.

Here's a nice video example. Of course there's no way to "verify" this video, but it illustrates the problem either way. Clearly these setups are all producing more than adequate frame rates, yet the gameplay is noticeably different just based on the cards and drivers being used. http://www.youtube.com/watch?v=pw5QZ6NvkIU


Even if you are lucky and ordered cards that have great driver support and play nice and smooth, like the GTX 275 SLI, who knows when the next driver update is going to make your favorite game go stutter crazy? Of course you can roll back to older drivers, but it's just another headache to micromanage every aspect of your graphics setup into a working system. I prefer to be able to update my drivers carefree, as updated drivers usually offer benefits.
April 6, 2012 2:51:01 AM

If you wouldn't mind reposting the actual link, I'll find the place where it says it. But if you look at the graph I showed you in the previous post, you can see that the 6990, the 590 and the 6870 in trifire all have far less microstuttering. It is not, as you said previously, that "microstuttering at extreme FPS is significantly less noticeable than at a lower FPS (low end CF/SLI rigs)".
April 6, 2012 3:30:22 AM

If you actually look at the graph.....

The 6990 has just as much variability in its min/max FPS as the 6870 in CF. Maybe your screen is not adjusted for good contrast, but the line representing the 6990 is going f'in crazy.

Beyond that, the 590 is conspicuous in having very low variability in min/max, but it is the only dual GPU solution to show this quality.

Also, it is plain to see that any time differential is going to be less obvious if it happens at a much faster rate. (Can you tell how the screen is changing if it only changes twice per second? Up to a certain point you could count the individual frames, but eventually they happen so quickly that it becomes one smooth picture and you can't delineate one frame from the next.) In the same fashion, if your GPUs stutter every 20 frames, then at 20 frames per second you're going to notice the stutter every time it happens. Once you get to 160 frames in a second, even if it stutters 8 times in that second, they may happen so quickly that you never notice. So in that respect ANY higher end card is going to yield less "visible" stuttering, until it becomes outdated enough to have a low FPS in modern games.

The fact remains that stuttering could be taking place even on that GTX 590, as it does not depend on the min/max differential but on the number of times the solution "stutters" as a result of a timing mismatch when it tries to output the frames.
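To make that last point concrete, here's a quick sketch of why a per-second fps counter can read a healthy number while consecutive frames still mismatch. The frame times and the 50% jump threshold are arbitrary assumptions for illustration, not data from the article:

```python
# Made-up AFR-style frame times: short/long alternation in milliseconds.
frame_times = [10, 23, 10, 24, 11, 22, 10, 23]

# The fps counter only sees the average...
avg_fps = 1000 * len(frame_times) / sum(frame_times)

# ...but counting consecutive frame pairs whose times differ by more than
# 50% reveals the timing mismatch on nearly every frame.
jumps = sum(1 for a, b in zip(frame_times, frame_times[1:])
            if abs(a - b) > 0.5 * min(a, b))
print(f"{avg_fps:.0f} fps average, {jumps} big frame-to-frame jumps")
```

In this toy trace the counter reports a smooth-looking ~60 fps while 7 of the 7 frame-to-frame transitions are big jumps, which is exactly the kind of thing min/max numbers alone won't show.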
April 6, 2012 3:37:28 AM

subcutaneous said:
If you actually look at the graph.....

The 6990 has just as much variability in it's min/max FPS every bit as much as the 6870 in CF. Maybe your screen is not adjusted for good contrast but the line representing the 6990 is going f'in crazy.

Beyond that it isolates the 590 is conspicuous having very low variability in min/max but it is the only dual GPU solution to show this quality.

Also it is plain to see that any time differential is going to be less obvious if it happens at a much faster rate.(can you tell how the screen is changing if it only changes twice per second? Of course in fact I am sure you could count the number of frames per second up to a cerain point, but eventually they are happening so quickly it is one smooth picture and you can't dilineate one frame from the next) In the same fashion if you GPUs are stuttering every 20 frames, then at 20 frames per second your going to notice the stutter every time it happens. Once you get to 160 frames in a second, even if it stutters 8 times in that second they may be happening so quickly that you never even notice it happen. So in that aspect ANY higher end card is going to yield less "visible" stuttering, until it becomes outdated enough to have a low FPS in modern games.

The fact remains that stuttering could be taking place even on that GTX 590 as it does not depend on the min/max differential but on the number of times the solution "stutters" as a result of having a timing mis-match when it tries to output the frames.


I see I was associating the wrong card with the darker orange line. That said, you are basically calling their entire test into question, in which case the test means nothing at all.

I'm fairly certain they were testing each individual frame, not an average. The point was, when looking at the FPS counter every second, you don't see any microstuttering; it is when you look at the individual frames that it shows up.
April 6, 2012 4:13:27 AM

Yes, I see how the graph is meant to be interpreted now. I'll admit I didn't notice that the graphs were set up to show the time it took to render each frame when I looked over the article earlier today. So I agree it does show distinctly that the 590 is not suffering from significant stuttering (in contrast to the final sentence of my previous post). I'm too used to looking at GPU performance graphs showing average FPS.

The main reason I argue against multiple GPU setups is that the majority of people asking about them are looking at two low priced cards. Very few people are looking to go from a low end GPU straight into a multiple $500 GPU setup. So the vast majority of these gamers trying to get awesome performance from two cheap cards are going to end up sorely disappointed.

But I also agree that the article has almost no bearing on reality, because new updates are constantly being released and the situation is always changing in terms of what the games and the drivers support.

Chances are, if you are willing to do the research (buy cards that don't exhibit the problem as readily) and meddle with drivers, game settings, beta updates and so on, you could end up with a nice smooth SLI/CF setup, but it's not going to be an easy road. The average person thinking about trying SLI/CF for the first time is going to spend less money and less time, and is likely to get a sub-par outcome for the money spent.