SLI With GeForce GTX 280 Superclocked
If you thought the GeForce GTX 280 would suffer the same temperature problems in SLI as the GTX 260, you'd be mistaken. However, the powerful two-card solution runs into other problems. For instance, maximum power consumption is 540 watts, although both overclocked cards should have fallen between 640 and 710 watts with the test system. The lower power consumption in SLI mode means that the temperature and noise levels are lower than for a single card.
Thermal throttling of the graphics chip (as seen on the GTX 260) is not the reason why the GTX 280 in SLI only hits 85 degrees. A defect is also unlikely, as the frame rates are slightly higher than those of the GeForce GTX 260 in SLI, and both GTX 280 cards function normally when running on their own. The loss of performance can only be explained by a lack of CPU horsepower to facilitate scaling, which can be clearly seen in our overclocking results.
Although both GTX models are overclocked at the factory, the overall evaluation shows a loss of performance. Averaged across all the games in the benchmark suite, the overclocked GTX 280 in SLI saw a 1.1% drop in performance, whereas the single card gained 5.8%. In Mass Effect (UT3 engine), the single card at 1920x1200 with anti-aliasing achieved a frame rate increase of around 16%; in SLI mode, performance decreased by 0.8%.
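As a rough sketch of how such a suite-wide average could be computed: the per-game deltas below are invented illustrative numbers, not the article's actual per-game data.

```python
def average_delta(deltas):
    """Arithmetic mean of per-game performance changes, in percent."""
    return sum(deltas) / len(deltas)

# Hypothetical per-game percent changes for an overclocked SLI setup;
# these values are made up for illustration only.
example_deltas = [-0.8, 2.1, -3.5, 0.5, -3.8]
suite_average = average_delta(example_deltas)  # about -1.1 for this made-up set
```

A per-game win in one title can still be swallowed by losses elsewhere, which is why the suite-wide average matters more than any single benchmark.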
Here are some highlights: World in Conflict at 1920x1200 with 4xAA achieved 32.8 fps on a single card; with the GTX 280 in SLI it hit 45.6 fps (the MSI overclock produced 44.2 fps). Mass Effect at 1920x1200 with 8xAA reached 60.6 fps on a single card, and with the GTX 280 in SLI hit 74.6 fps (the MSI overclock was at 74.0 fps).
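Using the frame rates quoted above, the two-card speed-up can be expressed as a simple scaling factor; this is just a sketch of the arithmetic, where 2.0x would be perfect two-card scaling.

```python
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """SLI speed-up factor relative to a single card (2.0 = perfect scaling)."""
    return sli_fps / single_fps

# World in Conflict, 1920x1200, 4xAA: 32.8 fps single vs 45.6 fps SLI
wic_scaling = sli_scaling(32.8, 45.6)   # ~1.39x
# Mass Effect, 1920x1200, 8xAA: 60.6 fps single vs 74.6 fps SLI
me_scaling = sli_scaling(60.6, 74.6)    # ~1.23x
```

Both results sit well below the ideal 2.0x, consistent with the CPU-limited scaling described in this review.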
As you can see, SLI adds a worthwhile amount of extra power at the right resolutions, but without a platform to back that configuration up, you'll actually sacrifice performance. If you look at the individual benchmarks, the worst values come from low resolutions and badly optimized games, which react negatively to SLI if they react at all. The CPU is now also an important factor: with more CPU power, higher frame rates should be possible, and MSI's factory overclocking should also provide additional gains. But without a powerful processor, it is better to stick to a single card for 3D games, as the GTX 280 in SLI demands a little more in the way of system performance.
In 2D mode, the power consumption is 203 watts, while in 3D mode the pair draws 540 watts (measured at the wall). The GTX 260 in SLI drew 610 watts. According to the manufacturer's specifications, both 3D cards should lie between 640 and 710 watts with the test system. If you wish to operate the overclocked GTX 280 in SLI mode, you will need a brand-name power supply rated between 530 and 570 watts, with 44 to 48 A on the 12 V rail, for a standard system. If the entire system reaches the top value of 710 watts (at the wall), a brand-name power supply rated between 600 and 650 watts should be sufficient.
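A back-of-envelope check of those recommendations can be sketched as follows; the ~80% PSU efficiency figure is an assumption for illustration, since only wall-socket draw is quoted here.

```python
def dc_load_watts(wall_watts: float, efficiency: float = 0.80) -> float:
    """Estimate the DC-side load from a wall-socket measurement.
    The 80% efficiency is an assumed typical value, not a measured one."""
    return wall_watts * efficiency

def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Wattage delivered at a given current on the 12 V rail."""
    return amps * volts

# 540 W at the wall implies roughly 432 W of DC load at the assumed 80% efficiency.
sli_dc_load = dc_load_watts(540)            # ~432 W
# 44-48 A on the 12 V rail corresponds to 528-576 W, which lines up with
# the 530-570 W supply recommendation above.
low, high = rail_watts(44), rail_watts(48)  # 528.0 W, 576.0 W
```

This is why wall-socket figures always overstate what the power supply itself must deliver: the efficiency loss happens inside the PSU.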
Looks like the results for SLI and Crossfire were switched with the single card results...
Not a bad article, really comprehensive.
My one complaint? Why use that CPU when you know that the test cards are going to max it out? Why not a quad core OC'ed to 4 GHz? It'd give far more meaning to the SLI results. We don't want results that we can duplicate at home, we want results that show what these cards can do. It's a GPU comparison, not a story complaining about not having a powerful enough CPU.
Oh, and please get a native English speaker to give it the once-over for spelling and grammar errors, although this one had far fewer than many articles posted lately.
No 4870 X2 in CF, so it's the world's top-end Nvidia vs. ATI's mid to low end.
It'd be a good article if you'd used a powerful enough CPU and up-to-date Radeon drivers (considering we're up to 8.8 now). I mean, are those even the 'hotfix' 8.6s or just the vanilla drivers?
Version AMD Catalyst 8.6? Why not just say "I'm using ATI drivers with little to no optimization for the 4800s"? This is why the CF benchmarks tanked.
At 1280, all of the high-end cards were CPU limited. At that resolution, you need a 3.2-3.4 GHz C2D to feed a 3870... this article had so much potential, and yet... so much work, so much testing, all for nothing, because most of the results are very CPU limited (except 1920 with AA).
WTF, the HD 4850 SHOULD be a lot faster than the 9600 GT and 8800 GT, even though they have 1 GB of RAM.
No 4870 X2, and 1920x1200 was the max resolution tested. How about finishing the good start of an article with the rest of it...
I agree, the 4870 X2 should have been in there, and the updated drivers should have been used. Good article, but I think you fell short on finishing it.
@pulasky - Rage much? It's called driver issues, you dumbass. Some games are more optimised for multi-card setups than others, and even then some favour SLI over Crossfire. And if you actually READ the article rather than let your shrunken libido get the better of you, you'll find that Crossfire does indeed work in CoD4.
Remember, the more you know.