NVIDIA Launches Titanium Series

The Secret Behind 'Titanium'

What little metallurgy I know comes from my mountain bike, which I tuned with plenty of carbon as well as titanium parts to save weight. The metal titanium is lighter than iron or steel while, under many conditions, at least as strong or durable, and it is also more elastic than steel. I suppose NVIDIA is going for the 'titanium is lighter, but just as strong' angle.

The new chips that carry the 'Titanium' name tag are produced (by TSMC in Taiwan, as you certainly know) using an enhanced process technology, which allows higher core clock frequencies while offering better yields. Let me use my favorite phrase these days: make no mistake, this new process technology neither makes those chips faster nor adds any new features to the GeForce2 and GeForce3 chips. The real catch of the enhanced 0.15 micron process used for the 'Titanium' chips is simply that the chips are cheaper to produce, which in turn makes the cards less expensive as well.

Beyond that, the chips are just the same as before: GeForce3 is still GeForce3 and GeForce2 remains GeForce2. Only the chip of the GeForce3 Titanium 500 gained a bit of clock speed. While the usual GeForce3 cards came with a core clock of 200 MHz, the 'Ti500' runs at 240 MHz and is thus 20% faster. Too bad that the memory bandwidth of the 'Ti500' hasn't been improved just as much, because THAT could really have boosted performance.
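The arithmetic behind these figures can be sketched as follows. The core clocks are the ones quoted above; the memory figures (DDR at 230 vs. 250 MHz on a 128-bit bus) are assumptions based on the commonly published specs of the period, not numbers from this article.

```python
# Sketch of the clock and bandwidth arithmetic discussed above.
# Core clocks (200 vs. 240 MHz) are from the article; the memory
# clocks and 128-bit bus width are assumptions, not article data.

def uplift(new, old):
    """Relative speed-up of `new` over `old`, as a percentage."""
    return (new - old) / old * 100

def ddr_bandwidth_gb_s(mem_clock_mhz, bus_width_bits=128):
    """Peak DDR bandwidth: two transfers per clock times bus width in bytes."""
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits // 8) / 1e9

core_gain = uplift(240, 200)  # GeForce3 Ti500 vs. plain GeForce3 core clock
mem_gain = uplift(ddr_bandwidth_gb_s(250), ddr_bandwidth_gb_s(230))

print(f"core clock gain: {core_gain:.0f}%")          # 20%
print(f"memory bandwidth gain: {mem_gain:.1f}%")     # considerably less
```

On those assumed memory clocks, the bandwidth gain comes out under 10%, which is the gap the article complains about.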

The new 'Ti' cards are also based on a new printed circuit board (PCB) with 8 layers and improved power supply circuitry.

A Practical Coincidence - DetonatorXP

A couple of months ago, when I first heard about NVIDIA's 'Titanium' plans, I was told that the GeForce3 Ti200 would be equivalent to the previous GeForce3, and that the GeForce2 Ti would perform just as well as previous GeForce2 Ultra cards, while both 'Ti' cards would be much less expensive. When you now look at the above table and compare the performance numbers (memory bandwidth, fill rate), you will have a hard time accepting that. The reason NVIDIA is actually able to get away with this claim is the recent introduction of the 'DetonatorXP' driver set. Those drivers have been shown to improve performance so much that even a card with less fill rate and less memory bandwidth can beat a previous product, as long as that product's performance is measured with the OLD drivers.
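To see why the on-paper comparison is hard to accept, the theoretical numbers can be worked out from the usual formulas: pixel fill rate is core clock times pixel pipelines, and peak bandwidth follows from the memory clock and bus width. All the specs below are assumptions drawn from widely published 2001 spec sheets, standing in for the table referenced above.

```python
# Paper-spec comparison. Every number here is an assumption taken from
# commonly published 2001 spec sheets, not from this article's table.

SPECS = {
    # name: (core MHz, pixel pipelines, memory clock MHz (DDR), bus bits)
    "GeForce3":       (200, 4, 230, 128),
    "GeForce3 Ti200": (175, 4, 200, 128),
    "GeForce2 Ultra": (250, 4, 230, 128),
    "GeForce2 Ti":    (250, 4, 200, 128),
}

def fill_rate_mpix(core_mhz, pipes):
    """Theoretical pixel fill rate in megapixels per second."""
    return core_mhz * pipes

def bandwidth_gb_s(mem_mhz, bus_bits):
    """Peak DDR bandwidth: two transfers per clock times bus width in bytes."""
    return mem_mhz * 2 * (bus_bits // 8) / 1000

for name, (core, pipes, mem, bus) in SPECS.items():
    print(f"{name:16s} {fill_rate_mpix(core, pipes):5d} Mpix/s "
          f"{bandwidth_gb_s(mem, bus):5.2f} GB/s")
```

Under these assumed specs, the Ti200 trails the plain GeForce3 in both fill rate and bandwidth, and the GeForce2 Ti trails the Ultra in bandwidth, which is exactly why the equivalence claim only holds against old-driver numbers.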

This is not completely fair, of course, since the 'old' products run just fine, and faster, with the new driver set. Therefore an 'old' GeForce3 card will beat the GeForce3 Ti200, and an 'old' GeForce2 Ultra card will beat the GeForce2 Ti, as long as all of them run with the same drivers.

DetonatorXP was also finally able to take care of some rather old GeForce3 issues. When NVIDIA's new 3D chip was ready for its introduction in March 2001, some white papers contained feature lists that included 'volumetric' or '3D' textures as well as 'shadow buffers', while other white papers didn't. After researching this confusing situation, I was told that GeForce3 did NOT support those features. In reality, the hardware support had been there all along, but the driver support didn't quite work. This has changed since 'DetonatorXP' was released: now all GeForce3 cards are able to use 3D textures and shadow buffers.

NVIDIA hadn't been able to market those features so far, so it seemed practical to make it look as if the new GeForce3 Titanium 500/200 cards were the first to offer them. This is not correct. Here's the original comment of an NVIDIA spokesperson on the issue: "We are marketing Shadow Buffers and 3D Textures as new features because they are newly enabled in the software drivers. Honestly, these features are available on the original GeForce3 as long as one uses the Detonator XP driver, but we don't spend time marketing last seasons products.....we market products that we are selling now." I'd say that this comment speaks for itself.
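Since these features are enabled by the driver rather than tied to a particular board, an application would detect them at run time by checking the OpenGL extension string. A minimal sketch: in a real program the string comes from glGetString(GL_EXTENSIONS); here a sample string stands in for it, and the specific extension names shown are my assumptions about what the drivers expose, not taken from this article.

```python
# Sketch of run-time feature detection via the GL extension string.
# A real application obtains the string from glGetString(GL_EXTENSIONS);
# the sample string and extension names below are assumptions for
# illustration, not values reported by an actual DetonatorXP driver.

def has_extension(extensions: str, name: str) -> bool:
    """Exact-match lookup in a space-separated GL extension string.

    Splitting avoids false positives from substring matches
    (e.g. 'GL_EXT_texture3D' inside a longer extension name).
    """
    return name in extensions.split()

# Hypothetical extension string as a driver might report it.
sample = ("GL_ARB_multitexture GL_EXT_texture3D "
          "GL_SGIX_shadow GL_SGIX_depth_texture")

print(has_extension(sample, "GL_EXT_texture3D"))  # True -> 3D textures usable
print(has_extension(sample, "GL_SGIX_shadow"))    # True -> shadow buffers usable
```

The point of the substring-safe lookup is that the same check returns False on a driver that does not list the extension, regardless of which chip is on the board, which matches the article's observation that the driver, not the silicon, gates these features.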