ATI Radeon 8500 - 16-bit vs 32-bit

April 28, 2002 7:14:02 AM

Remember the first Radeon cards that appeared (the Radeon DDR)? Remember how they had really good 32-bit (color depth) performance, but in 16-bit the framerates were almost the same as in 32-bit? So the GeForce 2 GTS really outperformed the Radeon DDR there (while in 32-bit it was a close fight).

Now my question: is this fixed in the Radeon 7500 & 8500?
Do they run much faster in 16-bit than in 32-bit?
April 28, 2002 11:14:55 AM

No, it's a feature, not an issue! nVidia does this too! 16-bit colour is obsolete for the most part now.
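
For what it's worth, the old 16-bit speedup was mostly a memory bandwidth story: a 32-bit framebuffer moves twice the colour data per pixel, and older cards were starved for bandwidth. Here's a rough back-of-envelope sketch in Python; the resolution, overdraw, and framerate figures are made-up assumptions, and z-buffer traffic is ignored:

# Back-of-envelope colour framebuffer traffic (illustrative numbers only)
WIDTH, HEIGHT = 1024, 768   # assumed benchmark resolution
OVERDRAW = 3.0              # assumed average overdraw factor
FPS = 100                   # assumed framerate

for bits in (16, 32):
    bytes_per_pixel = bits // 8
    # Counts colour writes only; z-buffer reads/writes would add more,
    # so real traffic is higher in both modes.
    per_frame = WIDTH * HEIGHT * bytes_per_pixel * OVERDRAW
    per_second = per_frame * FPS
    print(f"{bits}-bit: {per_frame / 2**20:.1f} MiB/frame, "
          f"{per_second / 2**30:.2f} GiB/s at {FPS} fps")

Once cards like the Radeon 8500 and GeForce 3 got enough raw bandwidth (plus bandwidth-saving tricks like HyperZ), that doubling stopped being the bottleneck, so the 16/32-bit gap shrank.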

:wink: "A penny saved is a penny earned!" :wink:
April 28, 2002 12:14:09 PM

OK, I just found a couple of benchmarks (in both 16-bit and 32-bit) of the Radeon 8500 and the GeForce 3 Ti500/Ti200, and I noticed the difference between the 16-bit and 32-bit framerates is very small for both ATI and NVIDIA.
So I guess it's an old thing that a card runs much, much faster in 16-bit (like the good old GeForce 2 GTS) :)
Thanks for your help, AMD_MAN.
Greetings.