What a great day to be a gamer. And what an enjoyable day to be running benchmarks in a lab full of hardware.
Intel’s X58 Express was the first chipset to support both ATI’s CrossFire and Nvidia’s SLI multi-card rendering technologies (I’m purposely ignoring the D5400XS—a $600 board all on its own), giving enthusiasts more choice in deploying graphics horsepower than ever. But even the first X58-based platforms were priced between $300 and $400.
Now you can find the absolute cheapest X58 boards for about $170. However, motherboard vendors often have to cut features heavily to hit prices that low, and the least-expensive CPU the socket will accept is still Intel’s $280 Core i7-920. That's $450, minimum, before factoring in a triple-channel memory kit.
Cost concerns aside, we’ve had no reservations about recommending reasonably-priced X58-based motherboards, 6GB kits, and overclocked Core i7 CPUs up until now—in fact, that’s what I’m running in my own workstation. But it’s a new day, met with a new platform, socket, and processor lineup. Core i5 and Core i7 for Intel’s LGA 1156 interface are here, bringing with them the same CrossFire and SLI support in a significantly more affordable package.
Does Your Platform Even Make A Difference?
All this talk about high-end motherboards and CPUs. But do those components even make a difference when it comes to gaming performance? Last year, with the launch of Core i7, we took Intel’s flagship, added two Radeon HD 4870 X2s, three GeForce GTX 280 cards, and tested seven different games in an effort to answer that question.
Our conclusion was that the platform complementing your graphics armada of choice absolutely does matter to enthusiasts looking for the best gaming performance. Even at 2560x1600—the most graphically-taxing resolution we’re able to run—the difference between a Core i7-965 and Core 2 Extreme QX9770 was easily quantifiable.
But what if you were able to save $100-$200 on a CPU, motherboard, and memory kit, then turn around and put that savings into a beefier GPU? Maybe even a second GPU? With a much more mature infrastructure of optimized drivers in place, can you get away with a $300 Core i5/P55 combo, or do you need to spend more to get great performance?
With the launch of P55 and Intel’s three LGA 1156-based CPUs, we wanted to revisit the topic of gaming performance. However, we couldn’t do it in quite the same way. You see, whereas X58 serves up to 36 lanes of PCI Express 2.0 connectivity, P55 only boasts eight—and those eight aren’t even meant to accommodate graphics cards, since the chipset communicates with its host processor via Intel’s Direct Media Interface (DMI). The company won't say how fast this generation of DMI is running, but it's not fast enough for serious gaming.
Thus, we only have the CPU’s 16 lanes of on-board PCI Express 2.0, which can be used as a single x16 link, split up into two x8 links, or multiplexed via something like Nvidia’s NF200 bridge into additional x8 links. That third option is a little extreme for a platform aimed at a mainstream audience though, so we’re going to stick with single- and dual-card configurations here.
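The practical difference between these lane configurations is bandwidth per card. As a rough sketch (assuming PCI Express 2.0's roughly 500 MB/s of usable bandwidth per lane, per direction, after 8b/10b encoding overhead; real-world throughput runs lower), the math works out as follows:

```python
# Rough per-card PCI Express 2.0 bandwidth for the lane configurations
# discussed above. Assumes ~500 MB/s usable per lane, per direction
# (5 GT/s signaling with 8b/10b encoding); actual throughput is lower.

PCIE2_MBPS_PER_LANE = 500  # MB/s per lane, per direction (approximation)

def per_card_bandwidth(total_lanes: int, cards: int) -> int:
    """Return MB/s available to each card when lanes split evenly."""
    lanes_per_card = total_lanes // cards
    return lanes_per_card * PCIE2_MBPS_PER_LANE

# Lynnfield's 16 on-die lanes: one card on x16 vs. two cards on x8/x8.
print(per_card_bandwidth(16, 1))  # 8000 MB/s per card
print(per_card_bandwidth(16, 2))  # 4000 MB/s per card

# X58's 36 chipset lanes can instead feed two full x16 links.
print(per_card_bandwidth(32, 2))  # 8000 MB/s per card
```

In other words, going dual-card on P55 halves each card's theoretical bandwidth, while X58 keeps both cards at the full x16 rate, which is exactly the architectural difference our testing puts under the microscope.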
The FU Edition, only from BFG.
ATI’s Radeon HD 4870 X2 remains the company’s fastest board (almost a year later, even), but Nvidia now offers the GeForce GTX 285 and GeForce GTX 295. While we don’t have a second GTX 295 in-house, we got our hands on the next best thing: a pair of BFG’s GeForce GTX 285 OCFU Edition cards, which work out well because they cost just about as much as the dual-GPU Radeon.
So Many Variables!
Comparing performance results in a story like this, and then identifying the causes underlying frame rate dips and spikes, can be a challenge, as we’re testing a range of different CPU architectures and platforms that each handle PCI Express differently.
There are a lot of miles on these puppies...
Fortunately, we spent some extra time normalizing the clock speed of today’s most popular enthusiast architectures and comparing the effect of integrated PCI Express 2.0, a pair of chipset-based links at full x16 signaling, and chipset-based links at x8 signaling rates in our story In Theory: How Does Lynnfield’s On-Die PCI Express Affect Gaming?
The general conclusion there was that we do see theoretical advantages to running one graphics card on an integrated x16 link. However, that benefit does not translate to real-world gaming. And when you add a second graphics card, halving the on-die link to create two x8 connections does hit performance a bit in environments that are platform-limited. Of course, as you get closer to 2560x1600 with anti-aliasing enabled, the benefits of CrossFire and SLI are sufficient to mask that architectural compromise.
At the end of the day, you end up with what amounts to a wash at high resolutions with one card installed. With two, even at 2560x1600, the pair of dedicated x16 links turns out slightly better results. Is that enough to warrant a pricey X58 motherboard and Core i7 processor? With everything running at a constant 2.8 GHz, not really.
But let’s shift away from our theoretical look at performance and get all of these processors running at their retail speeds for a more real-world look at gaming using ATI’s CrossFire and Nvidia’s SLI in Intel’s new P55.
| Test Hardware | |
|---|---|
| Processors | Intel Core i7-920 (Bloomfield) 2.66 GHz, LGA 1366, 4.8 GT/s QPI, 8 MB L3, Power-savings enabled |
| Intel Core i7-870 (Lynnfield) 2.93 GHz, LGA 1156, 8 MB L3, Power-savings enabled | |
| Intel Core i5-750 (Lynnfield) 2.66 GHz, LGA 1156, 8 MB L3, Power-savings enabled | |
| Intel Core 2 Quad Q9550S (Yorkfield) 2.83 GHz, LGA 775, 1,333 MHz FSB, 12 MB L2, Power-savings enabled | |
| AMD Phenom II X4 965 BE (Deneb) 3.4 GHz, Socket AM3, 4 GT/s HyperTransport, 6 MB L3, Power-savings enabled | |
| Motherboards | Asus P6T (LGA 1366) X58/ICH10R, BIOS 0707 |
| Gigabyte P55-UD6 (LGA 1156) P55, BIOS F3 | |
| Intel DX48BT2 (LGA 775) X48/ICH10R, BIOS 1902 | |
| Asus M4A79T Deluxe (AM3) 790FX/SB750, BIOS 1103 | |
| Memory | Corsair 4 GB (2 x 2 GB) DDR3-1600 7-7-7-20 @ DDR3-1333 |
| Corsair 6 GB (3 x 2 GB) DDR3-1600 7-7-7-20 @ DDR3-1333 | |
| Hard Drive | Intel SSDSA2M160G2GC 160 GB SATA 3 Gb/s |
| Graphics | Sapphire Radeon HD 4870 X2 2GB x2 |
| BFG GeForce GTX 285 OCFU Edition 1GB x2 | |
| Power Supply | Cooler Master UCP 1100W |
| System Software And Drivers | |
| Operating System | Windows 7 x64 RTM |
| DirectX | DirectX 11 |
| Platform Driver | Intel INF Chipset Update Utility 9.1.1.1015 |
| Graphics Driver | Catalyst 9.8 |
| GeForce 190.62 | |
A few notes about our setup here:
Most notably, we've shifted from Windows Vista to Windows 7 for testing. Let us know what you think about this in the comments section, but it was pretty clear that Vista was never a favorite, so we're hoping Windows 7 is a more popular environment in which to test. We can confirm that all of the games benchmarked here work in Windows 7. FS X did have occasional hiccups when loading a map, but restarting the app fixed the problem every time.
It should also be noted that the GeForce GTX 285 OCFU Edition cards are clocked too aggressively for an SLI configuration; we had to manually turn the fan speed up in Nvidia's System Tools utility in order to keep them stable through testing. To be fair, we also had trouble with the Radeon HD 4870 X2s in CrossFire on Asus' P6T, where the top card doesn't get enough fresh air due to spacing issues.
All power-saving options were left enabled so that Turbo Boost, EIST, and Cool'n'Quiet would function as intended in a production environment.
| Benchmark Configuration | |
|---|---|
| Benchmark | Settings |
| S.T.A.L.K.E.R.: Clear Sky | High Quality Settings, No AA / No AF, vsync off, 1680x1050 / 2560x1600, DirectX 10 / DirectX 10.1 |
| High Quality Settings, 4x AA / No AF, vsync off, 1680x1050 / 2560x1600, DirectX 10 / DirectX 10.1 | |
| Resident Evil 5 | Fixed Benchmark, No AA / No AF, vsync off, 1680x1050 / 2560x1600, DirectX 10 |
| Fixed Benchmark, 4x AA / No AF, vsync off, 1680x1050 / 2560x1600, DirectX 10 | |
| Far Cry 2 | Ultra High Quality Settings, No AA / No AF, vsync off, 1680x1050 / 2560x1600, Steam Version, DirectX 10 |
| Ultra High Quality Settings, 4x AA / No AF, vsync off, 1680x1050 / 2560x1600, Steam Version, DirectX 10 | |
| Left 4 Dead | High Quality Settings, No AA / No AF, vsync off, 1680x1050 / 2560x1600, Tomshardware Demo, Steam Version |
| High Quality Settings, 4x AA / 8x AF, vsync off, 1680x1050 / 2560x1600, Tomshardware Demo, Steam Version | |
| Grand Theft Auto 4 | Very High Quality Settings, No AA / Max AF, vsync off, 1680x1050 / 2560x1600, In-Game Benchmark |
| Crysis | Very High Quality Settings, No AA / No AF, vsync off, 1680x1050 / 2560x1600, DirectX 10, Patch 1.2.1, 64-bit executable |
| Very High Quality Settings, 4x AA / No AF, vsync off, 1680x1050 / 2560x1600, DirectX 10, Patch 1.2.1, 64-bit executable | |
| Flight Simulator X | Ultra High Quality Settings, No AA / Trilinear Filtering, 20 fps target, 1680x1050 / 2560x1600, DirectX 10 |
| Ultra High Quality Settings, AA Enabled / AF Enabled, 20 fps target, 1680x1050 / 2560x1600, DirectX 10 | |


We’re kicking things off with S.T.A.L.K.E.R., one of the most graphically-challenging benchmarks in our suite (though at 1680x1050, you wouldn’t really know it, since all of our tested platforms achieve playable frame rates, and the addition of a second card doesn’t really do much for our five contenders).
Cranking the resolution up to 2560x1600 does demonstrate a more palpable reason to go for CrossFire or SLI, though.
The real kicker is that, from Core i7 to Core i5 to Phenom II, there’s literally zero difference between these configurations if you’re using a single GeForce GTX 285. One Radeon HD 4870 X2 favors the Nehalem architecture ever so slightly, but the two frames separating the Core 2 Quad from the Core i7-870 are hardly worth making a buying decision on.


What should be popping out at you, however, is how much of a difference CrossFire and SLI support make as we shift into our anti-aliased numbers. For all intents and purposes, one ultra-powerful GeForce GTX 285 will turn back the same results in all five of these configurations. The same goes for the Radeon HD 4870 X2 compared down the line. Go for the cheapest motherboard and processor if gaming is your only concern, we say. Put the money saved toward another graphics card.
If that is, in fact, the route you take, note that at 1680x1050, two Radeon HD 4870 X2s and GeForce GTX 285s perform similarly. Only when you step up to 2560x1600 is there a bit of distance put between competing graphics architectures—in this case favoring ATI by a few frames.
As of this writing, a week before the P55 launch, the least-expensive Radeon HD 4870 X2 costs $369 on Newegg. The GeForce GTX 285 OCFU is a $389 board. Given ATI’s advantage in S.T.A.L.K.E.R., the Radeon is looking like a better buy at the high-end. Too bad it seems to be suffering limited availability (actually, that’s a good sign—next-gen, we’re looking forward to you!).


You’ll need to take our Resident Evil 5 numbers with a grain of salt if you can’t help comparing ATI to Nvidia. After all, the game doesn’t launch in North America until mid-September, and was only released as a benchmarkable demo by Nvidia to showcase its 3D Vision technology.
With that said, it’s a pretty game, and our talks with Capcom have indicated that the demo is indicative of the title’s performance. More than likely, ATI simply needs more time in order to make its own driver optimizations.
Update: After a bit of waiting, we were plugged in to the game's developers in Japan, who let us know that there is zero difference between the DirectX 9 and DirectX 10 code paths available in the demo. Thus, if you run the demo on ATI hardware, be sure to use the DirectX 9 path. For the purpose of these tests, again, don't bother comparing performance, as the Radeon HD 4870 X2s should be significantly quicker. If you're running Nvidia hardware, it's also safe to go with DirectX 9. The DirectX 10 option is the way to go for playing through GeForce 3D Vision shades.
Update 2: There's a hotfix that ATI says adds CrossFireX support to Resident Evil 5, available here. As it turns out, though, this was not the problem we were experiencing. With the hotfix applied, our results in the DirectX 10 version of the demo were still at or under 40 frames per second. The real issue is, in fact, the game's DirectX 10 mode. Re-running 2560x1600 with 4xAA in DirectX 9 results in a score of 104.6 frames per second (higher than a pair of GeForce GTX 285s on the Core i5 platform). If you're testing your ATI-based GPU in this one, make sure you use DX9!
Until then, we see the lowest-res test favoring the Core i5 and Core i7 LGA 1156-based platforms, suggesting that this title isn’t well-optimized for threading and is instead seeing processor-specific gains due to Turbo Boost kicking in. Transitioning to 2560x1600 negates that benefit though, as graphics performance is more acutely emphasized—a claim substantiated by SLI positively affecting performance where it didn’t before. In both resolutions, CrossFire actually hurts the performance of ATI’s cards, suggesting that there is simply no profile yet available for Resident Evil 5.


We see the same favoritism toward the new LGA 1156 CPUs paired with Nvidia’s fastest single-GPU card, and the gain grows with a pair installed. This contradicts the results of our academic look at integrated PCI Express, which showed that, at comparable clocks, X58’s dedicated x16 links enabled better performance. However, Turbo Boost accelerating CPU performance in a poorly-threaded game gives these graphics processors more room to breathe, resulting in a less CPU-limited environment.
The same advantage persists at 2560x1600, though it isn’t as pronounced, since the graphics cards are decidedly more taxed here. The scaling with SLI is truly impressive, even if it’s the result of significant optimization prior to the game’s launch. At the end of the day, this is still good news for gamers with Nvidia cards.
ATI owners will need to wait until the company’s driver team gets its hands on the game, optimizes the Catalyst suite, and adds a CrossFire profile. With the demo now available, we can only hope it’ll wrap this into its next driver release around the time Resident Evil 5 ships to retail.


Using Ultra Quality settings, Far Cry 2 crosses the threshold from CPU- to GPU-bound. With one Radeon HD 4870 X2 installed, you get very similar performance across all five of our test platforms at 1680x1050. But adding a second in CrossFire shows where the Core 2 Quad and Phenom II get choked up, and where the Core i7 and Core i5 stretch their legs. The same is true at 2560x1600—you get amazing performance from a $199 Core i5 and two $400 Radeon HD 4870 X2s. Sounds disgustingly imbalanced, right? Nevertheless, Intel’s entry-level Core i5 delivers the goods.
A single GeForce GTX 285 actually favors AMD’s Phenom II X4 965 Black Edition, ironically enough, at 1680x1050, but the results even out at 2560x1600. SLI does help Nvidia here, but not nearly as much as ATI’s CrossFire. And because we didn’t test on the 790i or 980a chipsets—less common platforms than 790GX or P45, we say—there’s no way to tell how an Nvidia-based solution would fare against the three Nehalem-based builds.


We see a similar story with the introduction of anti-aliasing. The numbers aren’t as high, of course, but a single Radeon HD 4870 X2 is still constrained by our benchmarked platforms. Meanwhile, a pair of the flagship cards takes off when backed by either of the Core i7s or Intel’s new Core i5. AMD’s Phenom II X4 965 edges out the Core 2 Quad, but both platforms still trail.
In contrast, Nvidia’s GeForce GTX 285 favors the Core 2 Quad and Phenom II with only a single board installed. The SLI-capable X58 and P55 configurations demonstrate significant gains with a second card available, but again we’re left to wonder if SLI-equipped Core 2 Quad and Phenom II platforms would outperform the newer Core i7 and Core i5 chips if more prevalent motherboards were available for them.


At 1680x1050, Left 4 Dead is entirely CPU-bound. Adding CrossFire or SLI only results in lower frame rates. We do get a great sense for how clock speed affects this game, though—at least between the three Core i5/i7 CPUs. The trio is favored, to be sure. And although the Core i5-750 features a more aggressive Turbo Boost implementation than the Core i7-920, it isn’t able to usurp the X58-based platform. Interesting also is that the ATI and Nvidia cards score identically. The bottleneck couldn’t get any more pronounced.
The competition opens up a little bit at 2560x1600. With a single Radeon HD 4870 X2 installed, AMD’s Phenom II X4 965 actually takes a first place finish, followed by the three Nehalem-based chips. With a GeForce GTX 285, all five platforms perform almost the same, notably slower than ATI’s flagship.
Drop in a second GeForce GTX 285, though, and Nvidia overtakes ATI, if only by a sliver. The Core i7-870, with its 2.93 GHz base clock, proves to be the fastest. Of course, even the lowest result in this chart is ridiculously quick. There’s no reason to leave anti-aliasing and anisotropic filtering disabled in Left 4 Dead.


There’s a slight benefit to adding CrossFire or SLI at 1680x1050 with more visual detail applied, but certainly not enough to warrant buying a second graphics card. Again, the Core i7-870 takes a first place finish in this one.
At 2560x1600, with 4xAA and 8xAF enabled, all five platforms turn back the same results with a single Radeon HD 4870 X2 installed. The same happens when you sub-in a GeForce GTX 285, though the Nvidia card is quite a bit slower. Nvidia takes off with the addition of SLI though, sailing past a pair of Radeon HD 4870 X2s. ATI’s cards pick up performance too, but they don’t scale nearly as well.
Even though the cards on our P55-based platforms only get eight lanes of PCI Express connectivity each, the Core i5 and Core i7 systems still manage to out-perform the Core 2 Quad and Phenom II machines under the influence of CrossFire.


We’ve fielded a lot of feedback email asking for Grand Theft Auto 4 results. The game often proves to be fairly processor-dependent, but performance generally only ranges between 40 and 60 frames per second.
At 1680x1050, we can see that this one prefers Nvidia’s GeForce GTX 285—especially on the Core 2 Quad platform, where ATI falls behind. The Core i5/Core i7s and AMD’s Phenom II X4 deliver similar performance otherwise. SLI is the only multi-card technology that exhibits a gain of any sort; CrossFire actually hurts performance.
Swap over to a single Radeon HD 4870 X2 and the three Nehalem-based CPUs take a lead, even over the Phenom II.
When you up the resolution, a single Radeon HD 4870 X2 delivers even results across our five test beds. Again, CrossFire does nothing positive for performance. A single GeForce GTX 285, on the other hand, does gain a fair bit of performance from a second card. With just one installed, you really won’t be able to tell the difference between Core i7, Core i5, or Phenom II, though.


Though we still haven't seen the sort of hardware that makes Crysis playable with Ultra Quality settings, it’s high time we upped the ante a bit and used Very High options. Also, in the past, we’ve noticed very I/O-limited scores, which did a fairly poor job reflecting performance due to constant hammering of our reference system's VelociRaptor. This time, we’ve switched to Intel’s second-gen SSD.
Right off the bat, we see a single Radeon HD 4870 X2 outperforming the GeForce GTX 285. But perhaps more interesting is that, with one ATI card, the Phenom II scores first place at 1680x1050, followed by the Core 2 Quad. The Core i7s and Core i5 follow after. One Nvidia single-GPU flagship yields fairly similar results across the board.
Adding SLI to the equation again shoots Nvidia to the top of the pile, as ATI simply can’t get as much scaling from a pair of Radeon HD 4870 X2s. Even more bizarre is that ATI sees zero benefit from CrossFire on the two fastest systems versus a single card installed.
Shifting over to 2560x1600 sees a single GeForce GTX 285 dip under 20 fps across all five systems, and one Radeon HD 4870 X2 sits just above that 20 fps mark. Fortunately, CrossFire and SLI both boost performance substantially, getting all of our dual-card setups up around 30 fps.
The most interesting result here is the Core i7-920, which establishes an advantage most likely attributable to its twin 16-lane PCI Express 2.0 links. If you refer back to our analysis of PCI Express connectivity, you’ll see that the results map over almost perfectly, despite the fact that we were running High Quality settings there. Notice also that the AMD platform isn’t getting hammered as hard here, almost certainly a result of our switch to an SSD, which doesn’t penalize AMD as severely for the performance of its storage controller.


Let’s get the easy one out of the way first: at 1680x1050, a single GeForce GTX 285 delivers comparable performance across all five platforms. The same holds true at 2560x1600, with the exception of Intel’s Core 2 Quad-based platform, where the Nvidia card falters.
Adding SLI helps Nvidia catapult into the lead from a fairly sizable deficit at both tested resolutions and on all three compatible platforms. But while 1680x1050 becomes playable, 2560x1600 almost certainly remains out of reach, even with almost $800 worth of GPU muscle under the hood.
Our single-card tests all favor ATI’s Radeon HD 4870 X2, though again the Core 2 Quad and Phenom II machines out-score the trio of Nehalem-based configurations.
CrossFire does help the Core i7-920, but it does less for the Core i5-750 or Core i7-870 at either resolution. Beyond that, though, ATI's technology scales very poorly compared to SLI here. This wouldn’t be as disconcerting in an older title if it weren’t a trend we’ve observed in every game thus far, save S.T.A.L.K.E.R. Fortunately, even at 1680x1050 with 4xAA, you’re still looking at fairly-playable performance.


Although I’ve lamented the gradual downfall of the flight simulator with a number of readers via email, culminating earlier this year with the news that Microsoft laid off the entire Flight Simulator team, I continue to receive requests for the three-year-old Flight Simulator X.
With Service Packs 1 and 2 installed (and DirectX 10 enabled), we set out to give the notoriously CPU-hungry test one more showing. Alas, with the frame rate cap disabled, the FRAPS results from a straight flight at the same time/date kept coming back with inconsistent scores. Therefore, we set the game to run at its Ultra High Quality pre-configured settings, which include a frame rate target bump from 15 to 20 fps. This is the way the game would be played, and it’s going to illustrate a very important point that we’ll circle back to in the conclusion.
For the most part, all of these configurations deliver excellent baseline performance in FS X. When a platform falls short, the addition of CrossFire or SLI easily brings it back up to 20 frames per second. The GeForce GTX 285 is the only exception, on the Phenom II platform, as that platform doesn’t support SLI. This is a flight simulator, though. For most of its pre-defined configurations, Microsoft specifies a cap of 15 fps. The fact that we’re able to achieve the Ultra High cap across this wide range of configurations should help assure the sim fans out there that any of these modern setups are ample for the aging title.
Here’s the real deal: we can turn the settings down below 1680x1050 and show you 200+ frame-per-second results that make one processor look like a champ while another “languishes” along at 175 frames. But where’s the value in that? Running at 1680x1050 represents a solid baseline for mainstream gamers, while 2560x1600 serves as today’s Holy Grail. Add or subtract anti-aliasing and anisotropic filtering anywhere in there for the best balance between performance and quality.
In games like S.T.A.L.K.E.R. and Far Cry 2, you see a lot of the same results, regardless of the platform on which you’re running. Those are the titles where frame rates drop perilously low—they’re limited by the GPU power plugged into their PCI Express slots. Ideally, when you add a second board and turn on CrossFire or SLI, that situation changes, performance jumps, and you get closer to approaching the CPU’s limit instead.
Other games, like Left 4 Dead and to a lesser extent Grand Theft Auto 4 (we’ve seen World in Conflict fall into this category, too), demonstrate more variance, even with one card installed. Frame rates are usually already playable, yielding less benefit when a second card is installed. These are the games that tend to be CPU-limited in some way—most playable, right up until a graphics bottleneck kicks in.
Thus, the conclusion here is pretty simple. When gaming is your top priority, buy “just enough” CPU and reallocate the rest of your budget toward graphics. In one test after another, we saw situations where a single ATI Radeon HD 4870 X2 or Nvidia GeForce GTX 285 wasn’t powerful enough to expose any benefit from one host processor over another. Only after adding a second card in CrossFire or SLI do you start seeing some benefit to a quicker CPU. And those are $400 graphics cards. Unless you’re planning on spending twice that on an upgrade, the point at which you’ll see GPU performance limit frame rates will come even sooner—long before integrated PCI Express or x8 links play any sort of role.
How does that apply to Intel’s new CPUs? Gamers planning on a single-card graphics subsystem will get plenty of mileage out of the $199 Core i5-750 and a $100 motherboard. Because this falls below where the Core 2 Quad Q9550 or Phenom II X4 965 BE are currently priced, we’ll have to see how Intel and AMD adjust post-launch. However, a 2.66 GHz quad-core chip capable of scaling up to 3.2 GHz in single-threaded applications is good for more than just gaming, and as a result, it looks a heckuva lot better than the two architectures it undercuts today.
One more thing: SLI versus CrossFire. Oy. In certain games, ATI simply kicks butt. Its performance with one Radeon HD 4870 X2 walks all over Nvidia’s GeForce GTX 285, despite the fact that the two models we used are priced similarly. But add a second, and in some cases SLI gets close to doubling performance, while ATI not only fails to scale well, but outright loses its lead. Left 4 Dead, Grand Theft Auto 4, and Crysis are three examples. ATI still wins out in S.T.A.L.K.E.R., but SLI buys more performance for Nvidia. ATI simply dominates Far Cry 2, no matter which way you cut it. Still, we'd like to see ATI match the scaling Nvidia is getting from SLI. At least then our point that gamers are better off with a second graphics card versus a pricey CPU would be easier to drive home.
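A simple way to quantify the scaling gap described above is multi-GPU efficiency: the dual-card frame rate divided by twice the single-card frame rate. Here's a minimal sketch; the frame rates below are hypothetical numbers chosen purely for illustration, not our measured results:

```python
def scaling_efficiency(single_fps: float, dual_fps: float) -> float:
    """Fraction of ideal (2x) scaling a second card actually delivers."""
    return dual_fps / (2 * single_fps)

# Hypothetical figures for illustration only:
# an SLI setup that nearly doubles performance with a second card...
print(round(scaling_efficiency(40.0, 75.0), 2))  # 0.94 -> ~94% of ideal
# ...versus a CrossFire setup that barely scales at all.
print(round(scaling_efficiency(55.0, 62.0), 2))  # 0.56 -> ~56% of ideal
```

An efficiency near 1.0 means the second card is pulling close to its full weight; anything much below 0.7 suggests a missing or immature driver profile, which is the pattern we kept running into with CrossFire in this suite.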

