Remember the first Radeon cards (the Radeon DDR)? They had really good 32-bit (color depth) performance, but in 16-bit their frame rates were almost the same as in 32-bit, so the GeForce 2 GTS really outperformed the Radeon DDR there (while in 32-bit it was a close fight).
Now my question: is this fixed in the Radeon 7500 & 8500?
Do they run much faster in 16-bit than in 32-bit?
OK, I just found a couple of benchmarks (in both 16- and 32-bit) of the Radeon 8500 and the GeForce 3 Ti500/Ti200,
and I noticed the difference between 16-bit and 32-bit frame rates is very small on both the ATI and NVIDIA cards.
So I guess it's an old thing that cards run much, much faster in 16-bit (like the good old GeForce 2 GTS).
Thanks for your help, AMD_MAN.