Why Is GeForce Called 'GeForce 256'?
Well, it took me some time to really understand that as well. First of all, it isn't the price: Creative Labs is supposed to ship its card for $249, but if you're in the right state with low tax it may still add up to $256. It should also not really be the memory interface, because that is only 128 bits wide. Some think that the usage of DDR ('double data rate') memory excuses the use of '256' for the memory interface, but in my humble opinion that's not quite right. GeForce cards with SDR-RAM would not deserve the '256' then anyway, and the fact that data is transferred on the rising as well as the falling edge of the memory clock still does not make the interface wider than 128 bits.

The memory interface is my critique point number one anyway, because it leaves boards equipped with SDR-RAM with less memory bandwidth than TNT2-Ultra boards. GeForce's memory is currently clocked at 166 MHz, while TNT2-Ultra runs its memory at 183+ MHz, and both chips have the same memory bus width of 128 bits. NVIDIA did not tell us the memory clock of the DDR-RAM card in our test, but I guess it's 166 MHz too, so that card would have at least 81% more memory bandwidth than TNT2-Ultra.
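The bandwidth comparison above is easy to check yourself. Here is a minimal sketch; note that the 166 MHz clock of the DDR card is my own guess from above, not a confirmed spec:

```python
def bandwidth_mb_s(bus_bits, clock_mhz, transfers_per_clock=1):
    """Peak memory bandwidth in MB/s: bus width in bytes x clock x transfers per clock."""
    return bus_bits / 8 * clock_mhz * transfers_per_clock

tnt2_ultra  = bandwidth_mb_s(128, 183)       # SDR: one transfer per clock
geforce_sdr = bandwidth_mb_s(128, 166)       # SDR GeForce: slower than TNT2-Ultra
geforce_ddr = bandwidth_mb_s(128, 166, 2)    # DDR: data on both clock edges (166 MHz assumed)

print(f"TNT2-Ultra : {tnt2_ultra:.0f} MB/s")   # 2928 MB/s
print(f"GeForce SDR: {geforce_sdr:.0f} MB/s")  # 2656 MB/s
print(f"GeForce DDR: {geforce_ddr:.0f} MB/s")  # 5312 MB/s
print(f"DDR advantage over TNT2-Ultra: {geforce_ddr / tnt2_ultra - 1:.0%}")  # 81%
```

You can see both points at once: the SDR GeForce actually comes in below TNT2-Ultra, while the DDR card lands roughly 81% above it.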
But let's get back to the magic '256'. I could hardly believe my ears when I was finally told what the '256' stands for. NVIDIA adds the 32-bit color, the 24-bit Z-buffer and the 8-bit stencil buffer of each rendering pipeline and multiplies the sum by 4, one for each pipeline, which indeed adds up to 256. So much for the fantasy of marketing people; they are a very special breed indeed.
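For the skeptical, the marketing arithmetic from the paragraph above can be written out:

```python
# NVIDIA's sum behind the name: per-pipeline buffer depths times pipeline count.
color_bits   = 32   # 32-bit color
z_bits       = 24   # 24-bit Z-buffer
stencil_bits = 8    # 8-bit stencil buffer
pipelines    = 4    # four rendering pipelines

bits_per_pipeline = color_bits + z_bits + stencil_bits   # 64
total = bits_per_pipeline * pipelines
print(total)  # 256
```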
NVIDIA has finally got the message as well: GeForce includes an HDTV processor with HDTV motion compensation. ATi already introduced support for this upcoming technology in the 'good old' Rage128, and although GeForce does something to catch up, it is still missing an iDCT unit for best MPEG2 decoding/encoding as found in Rage128 and Rage128 Pro.