Last week, Nvidia called to let us know that Sandy Bridge-E really allows three-way SLI to shine in games like Battlefield 3. It showed performance results up to 20% higher than Intel’s prior-generation platform, but it didn’t say whether it used the campaign or a multi-player map for testing. I like the idea of benchmarking a 64-player Rush match, but I just can’t accept that those results would be repeatable. I even reached out to Johan Andersson at DICE for guidance on testing, and he admitted there aren’t any good deterministic sequences to profile.
So, I fell back to the same campaign sequence used in Battlefield 3 Performance: 30+ Graphics Cards, Benchmarked, hoping that it was at least graphics-bound enough at Ultra settings to show off what three-way SLI can do.
It turns out that this sequence does demonstrate scaling at all three resolutions. A trio of GeForce GTX 580s yields great performance from 1680x1050 to 2560x1600. It just doesn’t shine significantly brighter on Core i7-3960X.
So, here’s my interpretation of Nvidia’s findings. It’s not that Core i7-3960X allows three-way SLI to stretch its legs in any particularly unique way. In a purely graphics-bound scenario, three-way SLI scales almost as well on a $300ish Core i7-2600K as on a $1,000 Core i7-3960X. However, I suspect Nvidia did its benchmarking in a multi-player map, where processor performance is more influential. Less-powerful CPUs become bottlenecks with so much graphics muscle behind them, inhibiting scaling.
If anything, this serves as a reminder why gamers shouldn’t skimp on a processor and load up on GPUs. In a title like Battlefield 3, there are environments that tax graphics (the campaign) and others that exact a more demanding load on the CPU (multi-player). Balancing the two is critical. So, if you’re willing to splurge on three-way SLI, be prepared to also spend generously on a complementary platform. Today, Sandy Bridge-E, by virtue of its per-clock performance and six-core configuration, is unquestionably the best you can present to a trio of potent GTX 580s.
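The scaling argument above can be made concrete with a little arithmetic: divide the multi-GPU frame rate by the single-GPU frame rate times the number of GPUs to get a percentage of ideal linear scaling. A sketch, with purely hypothetical frame rates (not numbers from our testing), just to illustrate the calculation:

```python
def scaling_efficiency(single_gpu_fps: float, multi_gpu_fps: float, n_gpus: int) -> float:
    """Return the percentage of ideal linear scaling an n-GPU setup achieves.

    100% would mean n GPUs deliver exactly n times the single-GPU frame rate;
    CPU bottlenecks push this number down.
    """
    return multi_gpu_fps / (single_gpu_fps * n_gpus) * 100


# Hypothetical illustration: if one card averaged 60 FPS and three-way SLI
# averaged 150 FPS, the setup would reach 150 / 180 = ~83% of ideal scaling.
print(round(scaling_efficiency(60, 150, 3), 1))
```

In a graphics-bound campaign run, that percentage stays roughly constant across CPUs; in a CPU-bound multi-player map, it drops on slower processors, which is exactly the gap Nvidia's comparison would capture.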
- Say Hello To The PC Hardware Trophy Wife
- Quad-Channel Memory And PCI Express 3.0
- X79 Express: P67, Is That You?
- Cooling And Overclocking Core i7-3960X
- Test Setup And Benchmarks
- Benchmark Results: PCMark 7
- Benchmark Results: 3DMark 11
- Benchmark Results: Sandra 2011
- Benchmark Results: Content Creation
- Benchmark Results: Productivity
- Benchmark Results: Media Encoding
- Benchmark Results: Crysis 2
- Benchmark Results: DiRT 3
- Benchmark Results: World Of Warcraft
- Crysis 2 In SLI
- DiRT 3 In SLI
- World Of Warcraft In SLI
- Battlefield 3 In SLI
- Power Consumption
- Core i7-3960X Versus Core i7-990X
- Core i7-3960X Versus Core i7-2600K/Core i5-2500K
- Core i7-3960X Versus FX-8150
- A Symbolic King In A Crowd Full Of Value
The funny thing is that cores don't scale well. They do, but it's far from ideal, as the percentages from the 2600K show (and the FX-8150, but that's a different story).
But the takeaway:
- If you're playing games, the i5-2500K is the best purchase you can make, and it's enough for tri-GTX 580 SLI. Only WoW shows any difference, but most games ignore it.
- X79 is Intel being just plain lazy. No matter how you slice it, X79 should have been called X67 and left at that. It's also a wildcat platform that will only ever support at most six CPUs that aren't terribly crippled.
- A Phenom II 955BE (or unlocked 960T, or a 1090T/1100T) is still a fine CPU to have unless you're gaming with dual graphics cards or doing time-intensive tasks.
What we have today is simply a platform for bragging rights, not a serious successor to the X38/X48/X58 family.
I would LOVE to see them pick up their game and provide me with a worthy upgrade over my 4 GHz i7-2600 (non-K). I would snap it up.
Look, BD had 4 modules with two "cores" each, and each module is roughly equivalent to a Sandy Bridge core.
They should just combine those two cores into a single core, so we get 4 threads.
Then create 4-, 6-, and 8-core versions of those CPUs.
Think about it: the FX-8150 is more of a 4-core CPU where the resources are pretty much halved so you get two threads per core. It would have been MUCH MUCH better if they had just kept 4 strong cores.
Not sure why, but I always seem to start an AMD-related comment :\
The labels on the graphs on this page are wrong: shouldn't the last two read DDR3-2133?
The only use for the 3820 really seems to be as a cheap placeholder processor if you need a new PC now but want to wait for a likely full 8c/16t version to come out around the time Ivy Bridge is released. The 3930K should prove to be a very good high-end gaming / mid-range workstation part, though, for people who invest close to $1K in graphics cards.
Yessir! Working on it now!
Yes, it's expensive. In other news, the Earth orbits the Sun. I wish I had enough money that the cost of this CPU was inconsequential to me.