The Benchmark Suite
An important factor in the selection of games is covering a range of 3D graphics engines and a mix of genres. For example, Assassin’s Creed runs on a very fast, high-quality DirectX 10 engine, and more titles using this engine will appear in the future. Call of Duty is a regularly patched first-person shooter and an excellent speed test for SLI or CrossFire, while Crysis is the yardstick for evaluating shader and raw graphics performance. For DirectX 10 effects, only the Very High mode is of interest, as it is only at this quality level that the water reflections and the HDR rendering of the sun really become visible. The 3D performance of current graphics cards is still not quite good enough for Very High mode combined with high resolutions, anti-aliasing and fluid animation.
Enemy Territory: Quake Wars runs on the id engine under OpenGL, which we have already seen in Prey, Doom, and Quake. Half Life 2: Episode 2 uses Valve’s improved Source engine—the same one used in Counter-Strike: Source (CSS) and the complete Half Life 2 series. Mass Effect is the most recent game using the UT3 engine, which also gives a modern 3D look to Rainbow Six Vegas 1 and 2, Stranglehold, Bioshock, UT3, Blacksite and Frontlines: Fuel of War. Since many other titles will use the UT3 engine, Mass Effect and its results form one of the most important benchmarks. Flight Simulator 10 from Microsoft (FSX) should be familiar territory for private pilots and fans of flight simulators. And finally, the graphically intricate World in Conflict represents the real-time strategy genre.
Test resolutions are 1280x1024 (4:3 aspect ratio), 1680x1050 (16:10) and 1920x1200 pixels (16:10), each run both without filtering and with AA/AF. Mass Effect was tested with 4xAA and 8xAA, which we forced via the graphics driver. The percentage of DirectX 10 titles has increased somewhat in recent months, but older graphics cards with Shader Model 3.0 can switch these games over to DirectX 9 with no trouble at all.
Caution! Since older cards switch down into DirectX 9 mode, they may run faster than their DirectX 10 successors, as the latter must render the more intricate DX10 effects.
| Game | API | Without filtering | With filtering | Engine |
|------|-----|-------------------|----------------|--------|
| Assassin's Creed v1.02 | DX10 | 0xAA + 0xAF | AA + AF | Scimitar Engine |
| Call of Duty 4 v1.6 | DX9 | 0xAA + 0xAF | 4xAA + 8xAF | Call of Duty |
| Crysis v1.21 High Quality | DX9/10 | 0xAA + 0xAF | 4xAA + 8xAF | Crysis Engine |
| Crysis v1.21 Very High Quality | DX10 | 0xAA + 0xAF | 4xAA + 8xAF | Crysis Engine |
| Enemy Territory: Quake Wars v1.4 | OpenGL | 0xAA + 0xAF | 4xAA + 8xAF | id |
| Half Life 2: Episode 2 | DX9 | 0xAA + 0xAF | 4xAA + 8xAF | Source Engine |
| Mass Effect | DX10 | 0xAA + 0xAF | 4xAA + AF / 8xAA + AF | UT3 Engine |
| MS Flight Simulator X SP2 | DX10 | 0xAA + 0xAF | AA + AF | FSX |
| World in Conflict v1.05 | DX10 | 0xAA + 0xAF | 4xAA + 4xAF | MassTech |
Assassin's Creed
The Radeon X800 XT only supports Shader Model 2.0 and won’t run this game at all; cards with Shader Model 3 switch over to DirectX 9. If anti-aliasing is enabled on the GeForce 7, Assassin’s Creed appears to deactivate HDR rendering: the anti-aliasing becomes visible and the frame rates increase. The GeForce 7 cannot combine HDR rendering (Shader 3) with anti-aliasing, as we have already seen in Oblivion and 3DMark06. The X1300 and X1300 Pro display the Assassin’s Creed menu correctly, but produce anti-aliasing graphics errors during gameplay. The old GeForce 7 and Radeon X1000 series can only run the 1280x1024 resolution at a refresh rate of 70 or 75 Hz; the 60 Hz setting is not available.
Call of Duty 4
Dual-chip cards like the X2 or GX2 run the test scene more slowly than two individual cards in SLI or CrossFire mode. In less complex sections the frame rates increase; there is no obvious reason for this dual-chip limitation. The Radeon HD 3850 CrossFire with Catalyst 8.6 occasionally crashed the game and the test system.
Crysis
A minimum of 512 MB of memory and a 256-bit bus is barely sufficient to handle anti-aliasing and Very High quality in Crysis at 1680x1050 pixels. The GeForce 9800 GTX SLI, 9800 GX2, Radeon HD 3850 and HD 3850 CrossFire exhibit visible dips in performance, while the 1024 MB models of the GeForce 8800 GT and 9600 GT handle the high settings better thanks to their larger frame buffers. The GeForce 8800 GTS 320 has its familiar memory problem and, even as an SLI duo, struggles to run Crysis with anti-aliasing, producing low frame rates. With the 7950 GX2 and Quad-SLI, 7800 GT SLI, 7800 GTX SLI, 7900 GS SLI, 7900 GT SLI and 7950 SLI, textures tend to flicker. The special GeForce 7800 GTX 512 model performs worse than a normal 7800 GTX with 256 MB due to a lack of driver optimizations. No DirectX 9 graphics chip can switch into Very High mode.
Enemy Territory: Quake Wars
The older AMD DirectX 9 cards do not support soft particles in Quake Wars; the function is missing completely from the graphics menu, and as a result these cards achieve higher frame rates. The older GeForce 6 does support soft particles and therefore appears slower in a direct comparison. Why the function is missing is unclear, since the GeForce 6 only handles OpenGL 1.5, whereas ATI’s X1000 series is specified for OpenGL 2.0. With the Radeon HD 3650, HD 2600, X1600 XT and X1600 Pro, 3D objects flicker in CrossFire mode. At lower resolutions without anti-aliasing, the graphics cards hit the CPU limit, where the frame rate fluctuates badly; differences of up to 5 fps in the average result are possible.
Half Life 2 Episode 2
The GeForce 6 and 7 require the game to be restarted when switching to anti-aliasing and when changing the resolution. Without a restart, the Half Life 2: Episode 2 timedemo runs at only 25% of the card's actual 3D performance. This problem does not occur with any of the other tested cards.
Mass Effect
Although the frame rate stays above 30 fps and texture loading was taken into account, the test scene is very jerky on many single cards. Mass Effect handles multiple graphics chips very well: 3-way SLI runs beautifully with a trio of GeForce 8800 Ultras, at a frame rate just below that of the 8800 Ultra SLI. Quad CrossFire with two HD 3870 X2s also provides good results; the game scales across the four GPUs well. To get anti-aliasing working on the Radeon X1000, HD 2000 or HD 3000, you need to rename MassEffect.exe to Bioshock.exe. This can lead to graphics errors, although the test system and benchmark scene showed no signs of any. The new Radeon HD 4000 doesn’t need this trick; anti-aliasing works with no problems.
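The rename trick can be scripted so the original executable stays untouched; a minimal sketch (the helper name and copy-instead-of-rename approach are my own, the file names come from the workaround described above—only the install path on your system will differ):

```python
import os
import shutil

def apply_aa_rename(game_dir, src="MassEffect.exe", dst="Bioshock.exe"):
    # Copy the game executable under the name the driver's AA profile
    # matches, keeping the original so patches and launchers still work.
    src_path = os.path.join(game_dir, src)
    dst_path = os.path.join(game_dir, dst)
    shutil.copy2(src_path, dst_path)  # copy2 preserves timestamps/metadata
    return dst_path
```

Launch the copied Bioshock.exe afterward and force 4xAA or 8xAA in the driver, as described for the benchmark settings.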
Starting a saved game with the GeForce 8600 GT SLI or 8600 GTS SLI at 1920x1200 pixels with anti-aliasing is a matter of luck. When it does start, you either get just 2 fps or the test computer crashes shortly afterward. All DirectX 9 cards switch the UT3 engine from DirectX 10 (DX10) down to DirectX 9 (DX9). Mass Effect does not start at all on the Radeon X800 XT (Shader 2.0).
Microsoft Flight Simulator X SP2
With FSX SP2 it was possible to achieve up to 70 fps with Nvidia cards in DirectX 10 mode, but once Vista was updated with Service Pack 1 (SP1), all frame rates dropped back to the old levels of between 15 and 30 fps. The memory problems of the GeForce 8800 GTS 320 surface again, resulting in 3 fps at 1920x1200 pixels with anti-aliasing. Three-way SLI with the GeForce 8800 Ultra has problems when the desktop is set to 1680x1050 or 1920x1200 pixels and the game runs at the same resolution. The Radeon X1800 to X1900, X1650 XT and X800 do not support in-game anti-aliasing at 1680x1050 and 1920x1200; 4xAA must be activated via the graphics driver. The same applies to the Radeon X1650 Pro, X1600 XT, X1300 XT and X1300 Pro at 1920x1200: anti-aliasing cannot be enabled in the game, so 4xAA must be forced via the driver. On DirectX 9 cards the DirectX 10 preview is not available, and as a result older cards have a speed advantage over their DX10 successors.
World in Conflict
CrossFire with three or four graphics chips shows a visible increase in frame rates at 1920x1200, but the GeForce 6800 GT and 6800 Ultra run very unevenly and poorly. The Radeon X1900 series and the X1950 XTX often have problems switching to 1680x1050 pixels: the screen turns black, and you have to locate the resolution-change confirmation dialog blindly and click it. All DirectX 9 graphics cards switch the game from DirectX 10 to DirectX 9, which can lead to increased frame rates on older graphics chips.
3DMark06
The synthetic DirectX 9 benchmark is ideal for testing graphics card and multi-core CPU optimizations. The default test resolution is 1280x1024 pixels with default quality settings and no filters. The CPU score with the Intel X38 chipset lies between 2521 and 2555; with the nForce 780i chipset, 3DMark06 achieved a CPU score of between 2372 and 2427.
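Taking the midpoints of the two reported score ranges, the X38 holds roughly a 6% CPU-score advantage over the nForce 780i; a quick back-of-the-envelope check (the numbers come straight from the results above):

```python
# Midpoints of the reported 3DMark06 CPU score ranges.
x38 = (2521 + 2555) / 2      # Intel X38 chipset: 2538.0
nf780i = (2372 + 2427) / 2   # nForce 780i chipset: 2399.5

# Relative advantage of the X38 over the 780i, in percent.
advantage = (x38 - nf780i) / nf780i * 100
print(f"X38 CPU score advantage: {advantage:.1f}%")  # about 5.8%
```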
If only you tested F@H PPD...
As mentioned in the charts introduction, these numbers take hours upon hours to compile, which means setting cut-offs for the product submissions and drivers. Unfortunately, the X2 didn't make it. However, there are results for a pair of 4870s in CrossFire, which is a roughly equivalent configuration. You'll also notice that there are no 4600-series Radeons. Again, same issue.
The charts also show the difference between games that are coded well and ones that aren't. Compare Crysis (Very High quality) and HL2 Ep2 at 1920x1200 with 4xAA/8xAF: in Crysis even the best hardware is left begging for mercy at around 24 fps, yet HL2 Ep2, which IMO looks just as good graphically, will run at 30 fps on an 8600 GTS (yes, an 8600 GTS!) or an X1800 XL. A small portion of the difference can be put down to DX10, but not all of it.