Our Second Round With StarCraft II
When Gigabyte suggested that we review the performance of StarCraft II on an all-Gigabyte graphics card lineup, we were delighted. We wanted an excuse to revisit the game, even though we had performed a thorough performance analysis of the StarCraft II beta a few months ago.
While the game engine hasn’t changed much between our beta review and the final release, we weren’t especially satisfied with the benchmarking method we were forced to use at the time. The only consistent way to benchmark the beta was to play back a saved game, essentially watching a movie of game play that had already occurred. While this test did stress the graphics engine, it wasn’t ideal for measuring real-world performance: in an actual game, the system has to calculate variables in real time, and playing back a saved game with a predetermined outcome doesn’t generate the same processing load.
The release of the full title allows us to create a more realistic simulation. The bundled StarCraft II Map Editor gives us the ability to build a map pre-populated with multiple simultaneous battles involving all three StarCraft races. Now that the computer has to perform all of the necessary AI calculations, instead of simply playing back a movie with a predetermined outcome, we can run a worst-case stress test of the game’s ability to push PC hardware to its limit.
In addition, AMD released the Catalyst 10.7 beta driver, which adds anti-aliasing support in StarCraft II, so we can see how Radeon and GeForce cards compare with this graphical enhancement enabled. Of course, between then and now, AMD made its Catalyst 10.8 package available as well, rolling in the improvements introduced in the hotfix driver.
With all of these considerations in mind, it's a good time to revisit StarCraft II, post-release. Let’s start by looking at the hardware we're using to benchmark this game.