Ati Radeon 8500 - 16bit vs 32bit

Remember the first Radeon cards that appeared (Radeon DDR)? Remember how they had really good 32-bit colour, but in 16-bit the framerates were almost the same as in 32-bit, so the GeForce 2 GTS really outperformed the Radeon DDR there (while in 32-bit it was a real fight)?

Now my question: is this fixed in the Radeon 7500 & 8500?
Do they run much faster in 16-bit than in 32-bit?
  1. No, it's a feature, not an issue! nVidia does this too. 16-bit colour is obsolete for the most part now.

    :wink: <b><i>"A penny saved is a penny earned!"</i></b> :wink:
  2. OK. I just found a couple of benchmarks (in both 16-bit & 32-bit) of the Radeon 8500 and the GeForce 3 Ti500/Ti200,
    and I noticed the difference between 16-bit and 32-bit framerates is very small on both the ATI and the NVIDIA cards.
    So I guess it's an old thing that a card runs much, much faster in 16-bit (like the good old GeForce 2 GTS) :)
    Thanks for your help AMD_MAN.
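As an aside on why 16-bit used to be so much faster: a 16-bit framebuffer (RGB565) needs half the memory bandwidth of a 32-bit one, which mattered on bandwidth-limited cards of that era. Here is a minimal sketch (not from the thread; the function names are my own) of the RGB565 packing and the storage difference:

```python
def rgb888_to_rgb565(r, g, b):
    """Pack 8-bit-per-channel colour into a 16-bit RGB565 word."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rgb565_to_rgb888(pixel):
    """Expand RGB565 back to 8-bit channels (the low bits are lost)."""
    r = (pixel >> 11) & 0x1F
    g = (pixel >> 5) & 0x3F
    b = pixel & 0x1F
    # Replicate the high bits into the low bits so values span 0-255.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# A 1024x768 framebuffer: 16-bit uses half the bytes of 32-bit.
pixels = 1024 * 768
print(pixels * 2, "bytes at 16-bit vs", pixels * 4, "bytes at 32-bit")

# Pure white survives the round trip; most colours lose their low bits.
print(rgb565_to_rgb888(rgb888_to_rgb565(255, 255, 255)))  # (255, 255, 255)
```

The lost low bits are what caused the banding that made 16-bit look worse, which is why later cards stopped optimising for it.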