Only a few days ago, everything seemed perfect in the best of all possible worlds for Nvidia. The GPU maker had just launched its GeForce GTX 260 and 280, which, despite a six-month delay, pushed the unified architecture introduced with the GeForce 8 to the limits of what a 65 nm process and a gigantic transistor count could offer. The performance gain over the previous generation wasn't overwhelming (59% on average over a 9800 GTX), but the arrival of CUDA applications was an interesting development, and Nvidia had no competition. Meanwhile, AMD's graphics division seemed to be sinking ever deeper into the red, avowedly incapable of competing in the high-end segment as it once had, with its existing high-end cards aging quickly performance-wise. Then came the hush-hush release of the Radeon HD 4850, before anybody even had time to test it, at an astoundingly low price of $199.
Yet, in the AMD camp, a miracle happened. The Radeon HD 4850's performance surprised everybody, including Nvidia. Despite the last-minute launch of the GeForce 9800 GTX+, which won't reach retail channels until mid-July, Nvidia simply can't match the explosive price/performance ratio of the Radeon card, as we demonstrated in our recent test. The familiar marketing pitch about optimizing efficiency and architectural yield, which had always sounded like fluff, suddenly took on new meaning in light of the Radeon HD 4850's test results, and even awakened hopes of better performance to come. Having managed, to its own surprise, to increase the number of stream processors from 320 to 800 with only a 43% increase in transistor count on the same process node, AMD doesn't want to settle for playing in the minors, and for good reason. A Radeon HD 4870, based on the same architecture but with higher performance (and of course a higher price), has been announced and is slowly becoming available, though there's still some uncertainty on that last point. On paper, at least, it could compete directly with Nvidia's new high-end cards at a significantly lower price. But what about in practice?