
Does Crossfire hurt minimum FPS? :O

Last response: in Graphics & Displays
November 6, 2010 2:32:34 AM

Hello. From some benchmarks I've seen on Tom's, plus a post on this forum, it sounds like having CrossFire may reduce your minimum frame rates below what a single GPU would normally deliver. Is this true? Are there any numbers/proportions on how much lower it gets?

I ask this because when the 69xx comes out this month I'll either purchase 2x 6870 or one 69xx, and I won't be upgrading for about 1-2 years.

I have a gaming rig. I intend to play games at maximum settings at 1920x1080 with 2x AA (or even 4x) and all that delicious stuff.
November 6, 2010 3:02:40 AM

It does happen in some games. I personally think it's better to go with a single-card solution unless you can't get the performance you need from one card.

I'd wait for the 69xx.
November 6, 2010 3:12:48 AM

If the minimum frame rates are CPU-limited, CrossFire gives the CPU additional work and will therefore drop the minimum a bit lower, but the average is usually significantly higher thanks to the added card.
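A toy way to picture this (my own back-of-envelope model with made-up numbers, not from any benchmark): treat the delivered frame rate as the lower of what the CPU and GPU sides can each sustain, give the second card imperfect scaling, and charge a small CPU overhead for driving it.

```python
# Toy model (illustrative numbers, not measurements): the delivered frame
# rate is limited by whichever of the CPU or GPU side is slower.
def delivered_fps(cpu_fps, gpu_fps, num_gpus=1, scaling=0.8, overhead=0.1):
    """CrossFire multiplies GPU throughput (imperfectly) but adds CPU work."""
    if num_gpus > 1:
        cpu_fps *= 1.0 - overhead                   # driver overhead eats CPU headroom
        gpu_fps *= 1.0 + scaling * (num_gpus - 1)   # imperfect multi-GPU scaling
    return min(cpu_fps, gpu_fps)

# GPU-bound scene: the second card helps a lot.
print(delivered_fps(cpu_fps=90, gpu_fps=40))              # 40
print(delivered_fps(cpu_fps=90, gpu_fps=40, num_gpus=2))  # 72.0
# CPU-bound scene (typical of minimum-FPS moments): CrossFire is slightly worse.
print(delivered_fps(cpu_fps=45, gpu_fps=80))              # 45
print(delivered_fps(cpu_fps=45, gpu_fps=80, num_gpus=2))  # 40.5
```

The scaling and overhead figures are assumptions picked to make the effect visible; the point is only that the same overhead that barely dents a GPU-bound average can directly lower a CPU-bound minimum.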
November 6, 2010 4:38:59 AM

Thanks for all the replies. Generally I'm fine with 60 FPS (more is even better; yes, I do notice the difference). I play games like Metro 2033, which on my current card (a 5870) at max settings drops below the 30 FPS mark (which is annoying).

So I was just looking for the best option for future games (1-2 years from now) so I don't hit another <30 FPS issue like Metro 2033. I could always lower settings, but I really enjoy playing at max :) .

I suppose I'll just keep waiting for the 69xx and see what the benchmark results show.
November 6, 2010 4:51:03 AM

That post may well have been mine. I believe someone else later noted that the lower frame rates all occurred early in the benchmarks; after everything was loaded up things seemed to run smoothly. But by then the benchmark-related damage was done - the minimum frame rates had been registered.
November 6, 2010 4:56:06 AM

expensivecomputer said:
Thanks for all the replies. Generally I'm fine with 60 FPS (more is even better; yes, I do notice the difference). I play games like Metro 2033, which on my current card (a 5870) at max settings drops below the 30 FPS mark (which is annoying).

So I was just looking for the best option for future games (1-2 years from now) so I don't hit another <30 FPS issue like Metro 2033. I could always lower settings, but I really enjoy playing at max :) .

I suppose I'll just keep waiting for the 69xx and see what the benchmark results show.


If your monitor has a 60 Hz refresh rate (most do, though not all), anything over 60 FPS never shows on your screen, so you can't see it. The good part about averaging over 60 FPS is that it usually also means fewer moments drop below 60 FPS.
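The cap itself is simple to sketch (assuming plain whole-frame vsync, ignoring tearing with vsync off): what reaches the screen is the rendered rate clamped to the refresh rate.

```python
def displayed_fps(rendered_fps, refresh_hz=60):
    """With vsync, the screen can't show more frames than it refreshes."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(90))   # 60: the extra 30 FPS never reaches the screen
print(displayed_fps(45))   # 45: drops below the refresh rate are fully visible
```

So headroom above 60 FPS is invisible in itself; its value is as a buffer against the dips.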
November 6, 2010 5:01:21 AM

Twoboxer said:
That post may well have been mine. I believe someone else later noted that the lower frame rates all occurred early in the benchmarks; after everything was loaded up things seemed to run smoothly. But by then the benchmark-related damage was done - the minimum frame rates had been registered.


While that does happen some of the time, it's not always the case.

For example, here is a review of GTS 450's in SLI: http://www.xbitlabs.com/articles/video/display/geforce-...

"It may seem that the GeForce GTS 450 SLI is quite an appealing solution compared to the more advanced single-chip cards, but we must note a couple of disappointing facts about it. First, this SLI tandem had a lower bottom speed than its single-GPU opponents in some games. This made the gameplay uncomfortable in S.T.A.L.K.E.R.: Call of Pripyat, for example. Second, the GeForce GTS 450 SLI would generally slow down more than its opponents at the resolution of 2560x1600. And third, we had some incompatibility issues typical of multi-GPU solutions. We could not launch Lost Planet 2 at all using the latest version of the Nvidia GeForce driver, and there may be other such games."

I'm guessing there are times when SLI/CF doesn't help at all, and then your low-end FPS ends up coming from one weaker card, which is not as good as a single strong card.
November 6, 2010 5:13:57 AM

Yes - note I didn't say I totally believed the answer given was correct. My personal experience with two video cards has not been good.

When I played a FPS, often the "drivers weren't optimized" until after I finished playing the game :) 

Now I play MMOs almost exclusively, and I typically get lower frame rates in SLI/xfire than when I use one of the cards. When I don't get lower frame rates, the increase in FPS is often worthless.

So I went back to single-card play.

I've always attributed the lower SLI/xfire frame rates to MMOs putting a heavier demand on the CPU than FPS games do, being less threaded than FPS games, and then faltering under the added burden of drivers managing the work of two video cards.

That's my belief, and I'm sticking to it lol.
November 6, 2010 5:26:20 AM

@bystander:
If I understand the benchmarks you've linked, the issue is somewhat different.

I believe xbit is saying that although 2x450 got higher FPS than its *competitors*, it had lower minimum FPS than some of those *competitors*. IOW, 2x450 might deliver a higher average FPS than 1x470, but 1x470 might deliver higher minimum frame rates and therefore better playability.

The benchmark I was quoting (2x6850, 6870) showed that 2 x 6850 had lower minimum FPS than 1 x 6850. That's a bit different.

A poor guy running a 6850 who buys a second one isn't trying to hit 200 FPS. He's trying to make his games more playable, ie higher minimum FPS. And he gets exactly the opposite result.

That's a pissed off customer.
November 6, 2010 2:21:03 PM

Twoboxer said:
@bystander:
If I understand the benchmarks you've linked, the issue is somewhat different.

I believe xbit is saying that although 2x450 got higher FPS than its *competitors*, it had lower minimum FPS than some of those *competitors*. IOW, 2x450 might deliver a higher average FPS than 1x470, but 1x470 might deliver higher minimum frame rates and therefore better playability.

The benchmark I was quoting (2x6850, 6870) showed that 2 x 6850 had lower minimum FPS than 1 x 6850. That's a bit different.

A poor guy running a 6850 who buys a second one isn't trying to hit 200 FPS. He's trying to make his games more playable, ie higher minimum FPS. And he gets exactly the opposite result.

That's a pissed off customer.


I don't believe you are correct, as it specifically states this: "First, this SLI tandem had a lower bottom speed than its single-GPU opponents in some games."

And yeah, the reason I believe it may drop equal to, or slightly below, a single GPU is that occasionally the second card isn't giving any benefit, yet it's still taking on some overhead.

edit: after reading what you wrote again, I don't know what you're trying to disagree with. I posted an article that does show that going SLI/CF has a disadvantage, to support what the OP read, which was also a case where SLI had a disadvantage.
November 6, 2010 6:20:46 PM

Not disagreeing with you, mate. Just saying you're making an important but different point.

Xbit is saying 2x450 produces higher FPS than a 470, BUT the 470 has higher MINIMUM frame rates.

Tom's chart says that if you are dissatisfied with the minimum FPS of a 6850, you won't solve the problem by adding a second one; minimum FPS will go down.

This is not true of the 450; when you add a second 450, minimum FPS goes up.
November 6, 2010 6:53:37 PM

Are you sure about that? Granted, it wasn't often, but there were a couple of graphs in the post I gave where the 450s in SLI had lower FPS than a single 450, and a few where it was barely better.
November 6, 2010 7:26:58 PM

I'll go check lol. Maybe I relied too much on the word "opponent", found one result to confirm it, and ignored the rest. BRB.

Nope. Here are the results: in all benches where 450 SLI achieved more than 4 FPS (yes, four), the minimum FPS for 450 SLI was higher (significantly higher) than for a single 450. The outliers were:

Crysis Warhead: @2560, 2 vs 3 FPS
Metro 2033: @1680 4 vs 6, @1920 3 vs 5, @2560 2 vs 1

The fact that SLI beat a single card in Metro @2560 proves, if proof is needed, that these results are all so low that timing of events and other factors became an issue.

Compare those results to minimum FPS for 1x6850 and 2x6850 across the board. But then remember that some feel this occurs early in the benchmarks during texture loading or some such, and that after that things went "smoothly".

I remain a bit cautious, though, because "smoothly" is not a number.
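One way to see why an early loading stall wrecks the minimum while barely denting the average (a sketch with made-up frame times, not xbit's or Tom's data):

```python
# Made-up frame-time trace (milliseconds): a brief texture-loading stall at
# the start of a benchmark run, then steady ~60 FPS frames afterwards.
frame_times_ms = [250, 180, 120] + [16.7] * 600

def fps_stats(times_ms):
    """Average FPS over the whole run, and minimum instantaneous FPS
    (the reciprocal of the single slowest frame)."""
    total_s = sum(times_ms) / 1000.0
    avg = len(times_ms) / total_s
    minimum = 1000.0 / max(times_ms)   # slowest single frame sets the minimum
    return avg, minimum

avg, minimum = fps_stats(frame_times_ms)
print(f"average: {avg:.1f} FPS, minimum: {minimum:.1f} FPS")
# average: 57.0 FPS, minimum: 4.0 FPS
```

Three stalled frames out of six hundred set the "minimum FPS" for the whole run even though play felt smooth afterwards, which is exactly why "smoothly" is not a number.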