Benchmark Results: StarCraft II
An extremely popular Blizzard title, StarCraft II is a great place to start our performance comparison. This is generally a CPU-limited game, so the differences we see in frame rates should be mostly attributable to the overhead required by stereoscopic 3D rendering:
Without anti-aliasing enabled, the Radeons compare very well to Nvidia's GeForce cards; only the GeForce GTX 550 Ti is unplayable at 1920x1080. Nevertheless, the GeForce GTX 580 SLI configuration delivers the best overall performance by a small margin.
How can this be? Two GeForce GTX 580s are significantly more powerful than a single Radeon HD 6970, right? Absolutely. The thing to remember is that stereo mode caps performance at 60 FPS: the display output has to be synchronized to the active shutter glasses, and a 120 Hz display alternating between left- and right-eye frames presents 60 frames per second to each eye. Thus, in a processor-bound game, the most expensive GPUs may deliver more performance than these stereoscopic technologies can exploit.
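The arithmetic behind that cap can be sketched in a few lines. This is an illustrative example, not anything from the drivers themselves; the 120 Hz refresh rate and the sample GPU frame rates are assumptions for the sake of the demonstration:

```python
# Illustrative sketch: why active shutter 3D caps per-eye frame rates.
# The 120 Hz display refresh and the GPU frame rates below are assumed
# values, not measured results from this article.

def per_eye_fps(display_refresh_hz: float) -> float:
    """Shutter glasses alternate left/right frames, so each eye
    sees only half of the display's refresh rate."""
    return display_refresh_hz / 2.0

def effective_fps(gpu_fps: float, display_refresh_hz: float) -> float:
    """In stereo mode, output is synchronized to the glasses, so the
    GPU cannot deliver more frames per eye than the display shows."""
    return min(gpu_fps, per_eye_fps(display_refresh_hz))

print(per_eye_fps(120))          # a 120 Hz panel yields 60 FPS per eye
print(effective_fps(150, 120))   # a fast SLI setup is still capped at 60
print(effective_fps(45, 120))    # a slower card is unaffected by the cap
```

This is why a GeForce GTX 580 SLI configuration only "wins by a small margin" here: once a CPU-bound game pushes past 60 FPS, the extra GPU horsepower has nowhere to go.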
Now let’s add anti-aliasing to the mix. There’s no in-game anti-aliasing available in StarCraft II, so we have to force it through the driver. Multi-sampling does not work with the TriDef driver in this game, so AMD's Radeon cards have to be tested with morphological anti-aliasing (MLAA), a post-processing filter that hunts down jagged edges. Unfortunately, this doesn’t result in an apples-to-apples comparison. It's still relevant, though, because it demonstrates the realistic options available to each solution if you want to game with AA turned on.
With 4x anti-aliasing enabled, the GeForce GTX 580 SLI configuration still isn't hamstrung, suggesting those two cards have performance in reserve. AMD's Radeon HD 6970 finishes second, running closer to its limit, and even manages to beat the GeForce GTX 570 in normal 3D mode. The Radeon HD 6790 fails to achieve 30 FPS at 1080p.