Performance Leap: NVIDIA GeForce 6800 Ultra
Introduction
12,000 points in 3DMark 2003. A score of over 60,000 in AquaMark 3. Over 60 fps in Halo at 1600x1200 and more than 50 fps in Far Cry with high FSAA and 4-tap anisotropic filtering at 1024x768 - these are numbers that will bring tears of joy to PC enthusiasts everywhere.
You'd have to go back quite a bit in the history of graphics cards to find a performance leap of similar magnitude. Maybe the transition from 3dfx's Voodoo 1 to the Voodoo 2 comes close, or the jump from NVIDIA's TNT2 to the GeForce 256 DDR, or perhaps the step from ATi's Radeon 8500 to the 9700 Pro... These might come close, if we leave aside for the moment the technological quantum leaps of the past. But let's start at the beginning.
With the introduction of the GeForce 6800, the first member of the new NV4x family of graphics processors to see the light of day, NVIDIA is launching a product that, it hopes, will make everyone forget last year ever happened. As you may remember, a year ago NVIDIA launched what was to become its new flagship model, the GeForce FX 5800 Ultra, codenamed NV30. Compared to its rivals, the card failed to impress in 3D gaming performance and was plagued by heat problems that necessitated an extravagant and very loud cooling solution, earning it the nickname "dustbuster". On top of that came discussions about poor image quality, NVIDIA's use of questionable driver optimizations to achieve higher performance, and the reduced shader precision the card used. Since the entire product line was based on this design, the smaller models suffered from the same problems.

The NV30's architecture and design were by no means bad; on paper, the chip fully met and in some respects even surpassed the requirements of Microsoft's DirectX 9 specification. However, its design made writing code that took advantage of these features much harder for game developers, and the performance of standard shader code compiled with Microsoft's HLSL compiler was sub-optimal.
Old and new generations: the NV40 (left) compared to the RIVA TNT, a.k.a. NV4 - or in other words, 222 million transistors on the left, 7 million on the right.
Realizing there was a problem, NVIDIA launched a replacement only a few months later. While the new card, the FX 5900 (NV35), addressed many of the weaknesses that had plagued the FX 5800, even this replacement faced strong competition from ATi's Radeon 9800 series. It seemed that for every card NVIDIA had up its sleeve, ATi was able to pull another improved version of the R300 design out of its hat and steal NVIDIA's thunder.
Which brings us back to the present. Today, the competitors in the high-end sector are the FX 5950 Ultra (NV38) for team NVIDIA and the Radeon 9800 XT (R360) for team ATi. Where performance is concerned, NVIDIA has been able to exploit its card's full potential thanks to frequent driver releases. ATi still has the upper hand in the image quality department, though, as its cards use a better FSAA implementation and usually offer higher visual quality when shaders are used in games.
The new NV4x design is supposed to make all these weaknesses a thing of the past. The first card based on this design that will come to market is the GeForce 6800 Ultra. This card outshines its predecessor, the FX 5950 Ultra, in practically every category - and not just on paper!