DirectX 9 Performance - An Important Factor
When performance mainly depended on a card's pixel fill rate and memory bandwidth, it was relatively easy to estimate how a card would perform by concentrating on those two factors. With DirectX 8, and even more so with DirectX 9, the shader performance of a GPU becomes a more important factor. DX9 cards that performed quite well in DirectX 8 games can suffer badly when it comes to DX9 shader code, resulting in unplayable frame rates.
So far, only very few DirectX 9 games have hit the market, so judging the performance of a card under DX9 is not an easy task right now. One trend is already visible, however: NVIDIA's FX cards are in trouble. The reasons are manifold. The code that comes out of Microsoft's HLSL (High Level Shader Language) compiler does not suit NVIDIA's CineFX architecture very well, and the cards do not support 24-bit floating-point precision. They have to use the much more resource-hungry 32-bit mode instead or fall back to 16 bit, which results in a loss of image quality in most situations. NVIDIA tries to compensate for these disadvantages by asking game developers to use its own shader language, Cg, which can produce code better suited to CineFX cards. The alternative is massive driver optimizations, but NVIDIA has drawn criticism for controversial optimization attempts in the past months.
Screenshot from our AquaMark3 article demonstrating driver bugs caused by optimization efforts.
ATI's Radeon 9500/9600/9700/9800 cards run the code that comes out of the MS HLSL compiler - the so-called DX9 standard code - very well, and they also strike the best compromise in floating-point quality by using the DX9 minimum spec of 24 bit. It seems that ATI's Radeon DX9 chips were designed from the ground up to fit HLSL code.
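The image-quality cost of falling back from 24- or 32-bit to 16-bit floating point is a general numerical effect, not specific to any driver. As a rough illustration (a sketch, not actual shader code), this Python snippet uses the standard library's half-precision format as a stand-in for a 16-bit shader register and shows how rounding error appears immediately and compounds over repeated operations:

```python
import struct

def to_half(x: float) -> float:
    """Round a Python float to 16-bit half precision and back."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

value = 0.2                      # a typical fractional color value
half = to_half(value)            # same value stored in 16 bits
print(f"full precision: {value!r}")
print(f"16-bit value  : {half!r}")
print(f"rounding error: {abs(value - half):.8f}")

# Errors compound across repeated shader-style operations (blending,
# scaling), which is where visible artifacts like color banding come from.
x_full, x_half = 0.2, to_half(0.2)
for _ in range(100):
    x_full = x_full * 1.01
    x_half = to_half(x_half * to_half(1.01))
print(f"drift after 100 multiplies: {abs(x_full - x_half):.6f}")
```

With only 10 mantissa bits, most fractional values cannot be stored exactly in 16 bits, while the 24-bit minimum required by the DX9 spec keeps such errors well below what is visible in an 8-bit-per-channel frame buffer.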
If you intend to buy a new graphics card, the picture is clear: ATI's DX9 cards run fast and without trouble in current DirectX 9 games, while NVIDIA's FX cards rely on optimized code paths or driver optimizations - a trend that NVIDIA does not deny and promises to address with future drivers. It's now up to NVIDIA to convince potential buyers that there's no reason to worry. The FX cards could not impress in the DX9 games or benchmarks we've seen so far (3DMark 2003, Tomb Raider: Angel Of Darkness, Halo and AquaMark 3). The recent debate initiated by Valve and the endless discussions about anisotropic filtering experiments to achieve better performance don't help NVIDIA's case a whole lot either.