Unlike the original, StarCraft II has a boatload of graphics settings with 13 different performance options to tweak, including texture quality and resolution.
For simplicity's sake, a master graphics quality setting controls all of the individual options (except texture quality) through one of four presets: low, medium, high, and ultra. Rather than test all 13 options individually, we'll compare the output of the master quality presets. Here is an animated GIF showing the differences:
At low detail, the game has a very flat look, as no shadows are rendered and the lighting is very simple. Medium detail offers a huge increase in graphical fidelity, with shadows and a more complex lighting model that allows for specular highlights and bump mapping. At high detail, glows are added to the scenes' lights, and individual units get their own light sources. Perhaps more importantly, the shadows have softer edges. Ultra detail adds a number of subtle shader and lighting improvements, but the most notable differences are translucent shadows (notice the colored shadows cast by the crystals), even smoother shadow edges, and extra scene detail, such as foliage.
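To make the progression concrete, here is a toy sketch of how a master preset might fan out to the individual rendering features described above. The names and structure are purely illustrative, based only on this article's observations; they are not StarCraft II's actual settings or internals.

```python
# Toy model: map a master quality preset to the rendering features
# observed at each level in the comparison above. Feature names are
# illustrative assumptions, not StarCraft II's real option names.

PRESETS = ["low", "medium", "high", "ultra"]

def options_for(preset):
    """Return the rendering features enabled at a given master preset."""
    level = PRESETS.index(preset)  # raises ValueError for an unknown preset
    return {
        "shadows": level >= 1,              # medium adds shadow rendering
        "bump_mapping": level >= 1,         # medium adds a richer lighting model
        "light_glows": level >= 2,          # high adds glows to scene lights
        "per_unit_lights": level >= 2,      # high gives units their own light sources
        "soft_shadow_edges": level >= 2,    # high softens shadow edges
        "translucent_shadows": level >= 3,  # ultra adds colored/transparent shadows
        "foliage": level >= 3,              # ultra adds extra scene detail
    }

# Example: medium enables shadows, but edges stay hard until high.
opts = options_for("medium")
print(opts["shadows"], opts["soft_shadow_edges"])  # True False
```

Each preset is a strict superset of the one below it, which matches how the visual differences stack from low through ultra.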
The bottom line is that medium detail is the minimum setting you'd want to use to enjoy StarCraft II, so it will serve as our baseline in the benchmarks. We will also benchmark the ultra preset to see what kind of performance hit we take when amping up the fidelity.