For almost two years now, NVIDIA has ruled the 3D graphics chip scene in terms of 3D performance. From a technological point of view, NVIDIA's chips haven't been shabby either. In particular, the integration of a transform and lighting engine into a mainstream 3D chip, done for the first time with NVIDIA's GeForce256 in late 1999, and the methodical development of this technology since then, can be seen as NVIDIA's biggest claim to fame. However, while NVIDIA's GeForce2 Ultra chip may be the unchallenged leader in 3D performance today, it is ATi's Radeon, released in July 2000, that comes with the most sophisticated features right now. Radeon's technology includes keyframe interpolation, four-matrix skinning, environment bump mapping, (proprietary) 3D textures and HyperZ, all of which GeForce2 owners have to do without. When Radeon was released, ATi was proud to announce that its latest chip supported several features of Microsoft's upcoming DirectX 8 specification. NVIDIA admitted that GeForce2 only supports DirectX 7 features, but made a strong case that Radeon was far from a full DirectX 8 implementation. Now NVIDIA's mysterious successor to the GeForce2, well known under its code name 'NV20', is finally ready for release, and NVIDIA is once more proud to announce that this new 'GeForce3' chip fully supports Microsoft's latest DirectX version, which happens to be DirectX 8.
(Almost) No Benchmarks ... Yet
This article is meant to introduce the vast array of GeForce3's new features. It tries to show you the benefits of the chip's DirectX 8 implementation, but it also checks for possible flaws. You won't find any benchmark results in this first piece about GeForce3, because those are not supposed to be published before GeForce3 has a bullet-proof set of drivers. Right now, however, those 'bar graphs' may not be the biggest catch anyway. The really impressive part of GeForce3 is its new technology. The name 'GeForce3' may make it look like a mere successor to NVIDIA's two previous chips, GeForce256 and GeForce2, and while it does still include quite a bit of its predecessors' technology, its new features certainly mark a quantum leap for what will be possible in future 3D applications.
There are actually a few benchmark results hidden in the article. Good luck finding them!
GeForce3 And DirectX8 - Do We Really Need A New Set Of 3D Features ... Again?
Yet another NVIDIA chip and yet another set of features. Does it really have to be this way? There are plenty of people out there happily using graphics chips that don't even come with integrated T&L. Now NVIDIA teaches us that integrated T&L is already old news, and that what it takes are programmable vertex and pixel processors. Thankfully, you will find that GeForce3 has some features you can benefit from right out of the box, because the other new DirectX 8 gimmicks won't find their way into 3D games for quite a while. That is certainly a shame, because GeForce3 is an impressive product, but should you really pay for it right now?
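To make the distinction concrete, here is a minimal sketch, in Python rather than real DirectX 8 code, of the difference between fixed-function T&L and a programmable vertex processor. All function names here are hypothetical illustrations, not actual API calls: the point is simply that fixed-function hardware applies one hardwired transform to every vertex, while a programmable unit runs an arbitrary per-vertex program supplied by the application.

```python
import math

def fixed_function_tnl(vertex, mvp):
    """Fixed-function T&L: every vertex goes through the same hardwired
    math, here a 4x4 matrix multiply. The application can only feed in
    a matrix; it cannot change the computation itself."""
    x, y, z = vertex
    v = (x, y, z, 1.0)
    return tuple(sum(mvp[row][col] * v[col] for col in range(4))
                 for row in range(4))

def run_vertex_program(vertices, shader):
    """Programmable vertex processing: the application supplies `shader`,
    an arbitrary per-vertex routine, instead of being limited to a
    fixed transform-and-light formula."""
    return [shader(v) for v in vertices]

# Example 'vertex shader': a sine-wave displacement, the kind of effect
# fixed-function hardware could not do without falling back to the CPU.
def wave_shader(vertex):
    x, y, z = vertex
    return (x, y + 0.5 * math.sin(x), z)
```

With fixed-function T&L, varying the effect means varying the matrix; with a vertex program, the math itself is up to the developer, which is exactly what makes the new approach a different class of hardware.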
What we have to keep in mind is that GeForce3 is the first 3D platform to come with a feature list very similar to that of Microsoft's upcoming Xbox. Bringing those features to market now might prove very beneficial for the Xbox.
You will have to decide for yourself whether you consider GeForce3 worth the $599 it will probably cost once it's out. It is certainly the most feature-rich product I have ever reviewed. Get ready for the longest article in the 5-year history of Tom's Hardware Guide. It will be a high-tech roller coaster ride.