onearmedscissorb

Distinguished
Apr 25, 2008
38
0
18,530
Assuming this is for playing video games, it's kind of a moot issue IMO. There's a better than 99% chance that the game itself is going to effectively "bottleneck" both the CPU and the graphics cards with a setup like that.
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780

I doubt it, but if it does, there's not a whole lot you can do about it. Avoiding it would take a processor substantially faster than the fastest one money can buy.

In any case, even if it does, I don't think you're going to have any framerate issues.
 

Rubenov

Distinguished
Jun 18, 2008
39
0
18,530
That setup will give you 40 - 50 fps in Crysis DX10 @ Very High details with 8x AA at 1920 x 1200. (Note that Crysis is almost entirely GPU bottlenecked; a faster processor won't get you better results.)

As long as you play at 1920 x 1200, I doubt you will have a bottleneck problem with any game.

Even if you play at a lower resolution, it won't matter, since you will be pumping out hundreds of FPS like crazy... bottlenecked or not.
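
If you ever want to sanity-check which side is the limit, the usual rule of thumb is: run the same scene at a low resolution and at your native one and compare. Here's a minimal sketch of that rule with made-up numbers (none of these FPS figures are real benchmarks):

# Rule-of-thumb bottleneck check (illustrative numbers only): drop the
# resolution and see if the framerate climbs. If it barely moves, the
# CPU/engine is the limit; if it jumps, you were GPU-bound.

def bottleneck_guess(fps_low_res, fps_native_res, threshold=1.15):
    if fps_low_res > fps_native_res * threshold:
        return "GPU-bound at native resolution"
    return "CPU/engine-bound (more GPU power won't help much)"

print(bottleneck_guess(95, 45))   # hypothetical Crysis-style result: GPU-bound
print(bottleneck_guess(52, 50))   # hypothetical CPU-limited result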
 
If you have that setup, and will spend that kind of money, you had better have at least a 24" 1920x1200 monitor. In most cases, at that kind of res, you should be graphically limited, not CPU limited, especially if you crank up the AA.
 

onearmedscissorb

Distinguished
Apr 25, 2008
38
0
18,530
I believe he is asking if the processor itself is actually fast enough to push three of those monstrosities. Whether the game has crazy AI like Supreme Commander that calls for a powerful CPU or not isn't the issue. It's still going to take a processor of X speed to make use of Y video card(s). But the problem is that the game probably isn't going to make real use of a quad core processor to begin with, and on top of that, I highly doubt any game really scales well with that graphics card setup. And that was my point about the game itself being the real bottleneck.
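
To put that "X speed for Y cards" point in rough numbers, here's a toy model (my own simplification, not anything measured): each frame costs some CPU time and some GPU time, the slower of the two sets your framerate, and extra cards only shrink the GPU term, and only as far as the game's multi-GPU scaling allows.

# Toy bottleneck model with assumed numbers: 12 ms of CPU work per frame,
# 25 ms of GPU work per frame on one card, and each extra card adding
# only ~60% of a full card's worth of throughput.

def effective_fps(cpu_ms, gpu_ms, num_gpus, scaling=0.6):
    effective_gpus = 1 + (num_gpus - 1) * scaling
    return 1000.0 / max(cpu_ms, gpu_ms / effective_gpus)

for n in (1, 2, 3):
    print(n, "card(s):", round(effective_fps(12, 25, n), 1), "fps")
# 40 fps, 64 fps, then ~83 fps: the third card runs into the 12 ms
# CPU ceiling, so it buys almost nothing.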
 

Rubenov

Distinguished
Jun 18, 2008
39
0
18,530


He means that Tri-SLI in general does not scale very well. Honestly, two GTX 280s in SLI would perform almost the same in most games as three GTX 280s.

I would save the $450 and go with two instead of three... even if you have the money.

The only game where you would notice an improvement with three GTX 280s over two would be Crysis, and we are talking about a single-digit gain in FPS at Very High settings, 1920 x 1200, etc. Not worth the extra $450 in my opinion.
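
If you want to put dollar figures on it, here's a quick cost-per-frame sketch using hypothetical FPS numbers in the ballpark this thread has been throwing around:

# Cost-per-frame sanity check on the third card (assumed FPS figures):
price_of_third_card = 450   # the savings quoted above, USD
fps_two_cards       = 45    # assumed: Crysis Very High, 1920x1200
fps_three_cards     = 50    # assumed: single-digit gain from card #3

extra = fps_three_cards - fps_two_cards
print("third card: $%d for %d extra fps = $%.0f per fps" %
      (price_of_third_card, extra, price_of_third_card / extra))
# ~$90 per extra frame per second, versus roughly $20/fps for the
# first two cards ($900 for ~45 fps).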
 

macer1

Distinguished
Mar 6, 2006
424
0
18,810
Let me help you.

This is taken from that review I just posted.



" I have to honestly say that I thought SLI would be the savior of these cards. While I was testing them I was extremely happy with the results and they looked extremely impressive, but once you put them into your graphs and begin to get a better perspective of how it compares to the other setups, you don%u2019t realize just how disappointing the results are. For the most part the GTX 280 in SLI and GX2 in Quad SLI perform the same, bar WIC AA/AF 2560 tests and Vantage.

Sure it%u2019s nice to see NVIDIA condense the power of two cores into one and four cores into two, but this doesn%u2019t come at a price drop. Unfortunately I didn%u2019t have my %u201Cpower thingy%u201D handy while at the IBuyPower Australia bunker, but what I thought was a quality power supply (1000W Zalman) wasn%u2019t able to handle the strain three cards placed on the system. Considering the water cooling pump uses only 24Watt, you can%u2019t even say that%u2019s the cause of it.

Like I said in the review of the ZOTAC GTX 280, the cards clearly have potential as far as PhysX and CUDA go, but it doesn%u2019t seem like they%u2019re ever going to be the performance beasts we thought they would be. After months of having Tri SLI and Quad SLI drivers out you would have think that NVIDIA would have improved on the technology, but adding a third card to your system and pulling another $600+ USD out of your wallet is going to be an extremely painful process.

As much as I would like to think NVIDIA have some magical driver around the corner that adds an extra 40% performance to the cards, the hard fact is it%u2019s highly unlikely as it would be ready now. It%u2019s been a while since NVIDIA has made a misstep in the graphics card world, but it looks like this could be one. It will be interesting to see what%u2019s going on with the GTX 280 in a few more weeks time; we can%u2019t see people embracing it like the 7800 and 8800 series of cards from yesteryear, though. "
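
The PSU anecdote in there checks out if you run rough numbers. A ballpark sketch (the GTX 280's board power is commonly cited around 236W; the CPU and rest-of-rig figures are my assumptions):

# Rough sustained-load estimate for a tri-SLI GTX 280 box:
gpu_w  = 236   # GTX 280 TDP, commonly cited figure
cpu_w  = 150   # assumed: overclocked quad under load
rest_w = 100   # assumed: board, drives, fans, water pump, etc.

load = 3 * gpu_w + cpu_w + rest_w
print("estimated load: %dW" % load)   # ~958W
# That is right at the label rating of a 1000W unit before you account
# for 12V rail limits or load spikes, so the reviewer's trouble with a
# quality 1000W supply isn't surprising.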
 

SpinachEater

Distinguished
Oct 10, 2007
1,769
0
19,810
A CPU is, in a way, always a bottleneck. As you raise the clocks you improve your graphics performance; it isn't like you can go from 3.5 GHz to 3.7 GHz and not see improvements. It just seems like once you hit around 3.2 GHz on a quad, the returns start to diminish.

I just did some 3DMark06 benches last night and found that my overall score jumped the most when moving toward 3.2 GHz, but beyond that the gains drop off a bit.

Clock step (GHz)      3DMark06 score gain
2.66 -> 2.88          +599
2.88 -> 3.10          +618
3.10 -> 3.32          +377
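
Normalizing those same deltas to points per GHz makes the drop-off easier to see (each step is 0.22 GHz, so this is just the table above rescaled):

# Same data as the table above, in points per extra GHz:
steps = [(2.66, 2.88, 599), (2.88, 3.10, 618), (3.10, 3.32, 377)]
for lo, hi, gain in steps:
    print("%.2f -> %.2f GHz: +%d pts = %.0f pts/GHz" % (lo, hi, gain, gain / (hi - lo)))
# ~2720 and ~2810 pts/GHz for the first two steps, then ~1710 pts/GHz
# past 3.1 GHz: the diminishing returns described above.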


The bottleneck in a tri-SLI setup will be the drivers.
 

Try some gaming benchmarks though. From what I've found, the CPU makes far more difference in 3DMark than in most games.
 

SpinachEater

Distinguished
Oct 10, 2007
1,769
0
19,810


Yeah, that is what I tend to find too. 3DMark is so CPU sensitive that it is hard to compare results across different systems. I want to try the Devil May Cry 4 benchmark and the Crysis benchmark next to see how the different CPU speeds change the FPS in actual games. I was inspired to do the 3DMark test since there are so many people popping up who have low scores with old CPUs and think their new GPUs aren't working.




I dunno... the gains in both SM2 and SM3 peter out at close to the same rate. These numbers are the margin of increase that came with each CPU OC. Granted, I didn't get up to 3.5 and above to continue the pattern, but across the board, 3.2-3.3 GHz seems to be the sweet spot for the quads. The CPU gains really drop off in the 3.1-3.32 range. I think it is a little strange that going from 2.88 to 3.10 has better CPU score gains than going from 2.66 to 2.88.

[attached chart: deltaee1.jpg]


I want to go back and break the OC steps down in half (0.11 GHz jumps) to see exactly where the gains drop off.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
Unless you plan to do serious cooling, it's generally advisable to avoid 3-way SLI with the GTX 280.

2-way SLI GTX 280 is tons of graphics power even on a 30" display, and the 3-way drivers are still glitchy. Even in the best-case scenario, the scaling will be disappointing for your investment.