NVIDIA GeForceFX: Brute Force Attack Against the King

Serious Sam 2 Z-Buffer Problems

Just a few months ago, there was some discussion about a Z-Buffer problem with GeForce4 Ti cards in the game Serious Sam. With these cards, the game runs at 32 bit color depth with a Z-Buffer of only 16 bit instead of 24 bit. The cause appears to lie in the "ChoosePixelFormat" command, which applications use at startup to request the size of the Z-Buffer. The game requests a 32 bit Z-Buffer, which the card cannot deliver because 8 bits are already reserved for the stencil buffer. The driver then sets the GeForce4 and GeForceFX to 16 bit instead of falling back to 24 bit (without the 8 bit stencil). The GeForce3, by contrast, is automatically set to the correct 24 bit. The error also does not occur with the ATI Radeon 9700 PRO, which sets itself to 24 bit Z-Buffer plus 8 bit stencil. This points to a problem in the GeForce4 and FX driver - Croteam, at least, blames an error in the NVIDIA driver. In numerous discussions of this topic on the Internet, programmers have also confirmed incorrect Z-Buffer assignment in their own OpenGL applications when using the "ChoosePixelFormat" command.
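For illustration, here is a minimal sketch in C (not Croteam's actual code) of the "ChoosePixelFormat" pattern described above: the application fills in a PIXELFORMATDESCRIPTOR requesting 32 depth bits, and DescribePixelFormat then shows what the driver actually granted - on the affected cards only 16 bits, unless the request is repeated with an explicit 24 bit value. The SetupDepthBuffer function name and the retry logic are assumptions made for this example.

```c
#include <windows.h>

/* Sketch only: request a depth buffer via ChoosePixelFormat and check what
   the driver really hands back. Asking for 32 depth bits can silently yield
   a 16 bit Z-Buffer on GeForce4 Ti/FX, since 24 bit depth + 8 bit stencil
   is the maximum these cards support. */
int SetupDepthBuffer(HDC hdc, BYTE requestedDepthBits)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;
    pfd.cDepthBits   = requestedDepthBits;  /* e.g. 32 - the driver may fall back to 16 */
    pfd.cStencilBits = 8;

    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0)
        return 0;

    /* Ask the driver what the chosen format actually contains. */
    PIXELFORMATDESCRIPTOR actual = {0};
    DescribePixelFormat(hdc, format, sizeof(actual), &actual);

    /* actual.cDepthBits reveals the real Z-Buffer precision: 24 on
       GeForce3/Radeon 9700, but only 16 on GeForce4 Ti/FX when 32 was asked for. */
    if (actual.cDepthBits < 24)
    {
        /* Retry with an explicit 24 bit request, which the cards can deliver. */
        pfd.cDepthBits = 24;
        format = ChoosePixelFormat(hdc, &pfd);
    }

    SetPixelFormat(hdc, format, &pfd);
    return format;
}
```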

The reduced precision results in visible Z-errors. Here are a few screenshots from Serious Sam 2 running on the GeForceFX:

The errors are noticeable as flickering on the surface of the portal as it moves.

Unlike ATI, NVIDIA does not offer any driver setting for the Z-Buffer depth, so the problem can only be worked around if the application itself provides such an option. Serious Sam 2 does: with the console command /gap_iDepthBits=24 , you can force a 24 bit Z-Buffer that the GeForce4 Ti and GeForceFX can display without problems. This is an easy way to get rid of the display errors in SS2.
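To verify that such a workaround actually took effect, an OpenGL application can query the active context with the standard glGetIntegerv call. The following helper is a hypothetical sketch (not part of Serious Sam) that simply prints the depth and stencil precision the driver allocated:

```c
#include <stdio.h>
#include <GL/gl.h>

/* Hypothetical helper: with a GL context current, report how many depth
   bits the driver actually allocated, so the effect of a setting such as
   /gap_iDepthBits=24 can be checked. */
void PrintDepthBufferPrecision(void)
{
    GLint depthBits = 0, stencilBits = 0;
    glGetIntegerv(GL_DEPTH_BITS, &depthBits);     /* 16 with the faulty fallback, 24 after the fix */
    glGetIntegerv(GL_STENCIL_BITS, &stencilBits); /* 8 when the 24+8 format is in use */
    printf("Z-Buffer: %d bit, stencil: %d bit\n", depthBits, stencilBits);
}
```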

It has been presumed that NVIDIA cards gain a performance advantage from the smaller Z-Buffer - this can be confirmed only in part. The benchmark results of the GeForceFX with a 16 bit and a 24 bit Z-Buffer show only a very slight difference, and the display errors remain a drawback of the 16 bit setting. The GeForce4 Ti cards, by comparison, do chalk up noticeably better scores with the 16 bit Z-Buffer than with 24 bit.