ATi's New Radeon - Smart Technology Meets Brute Force
Apology
I'd like to apologize to all of you who are missing results for a GeForce2 GTS 64 MB card. Unfortunately my 3D specialist Silvino, who helped me with the benchmarks, did not have a 64 MB GeForce2 GTS card in his lab at the time of the tests. Those cards happen to be a bit faster than the 32 MB versions at resolutions above 1024x768, and they are able to run 1600x1200x32 in D3D as well as 640x480x32 and 800x600x32 in 4X FSAA. Still, I can assure you that my own spot checks have shown that Radeon keeps a continuous lead over GeForce2 GTS in 32-bit color at resolutions above 1024x768, even when the NVIDIA chip is equipped with 64 MB and even when it runs with its memory overclocked to 366 MHz. I will supply you with updated results as soon as I have slept a bit. I will also add the resolutions between 1024x768 and 1600x1200 for those of you who missed them in our benchmarks. Running those tests would have delayed this review by about a day.
Conclusions
You have read the endless list of 3D features and you have seen all the benchmark results. What do you say?
First of all, I'd like to note that Radeon's benchmark results at 16-bit color look worse than they are, simply because numbers can't give you a feel for game play. I understand that some of you might complain about Radeon's 16-bit performance, but the 16-bit scores of Radeon are only an issue if you happen to play at 1280x1024x16 or 1600x1200x16; in all the lower 16-bit color resolutions Radeon's scores are definitely good enough. In those two resolutions GeForce2 GTS is clearly and utterly beating the Radeon. This doesn't come as a surprise, because the NVIDIA chip does not suffer from memory bandwidth limitation as much in 16-bit color as it does in 32-bit color, allowing it to reach at least 70% of its claimed fill rate. Radeon's HyperZ is also not particularly effective in 16-bit color, which is why Radeon's scores are almost identical in 16-bit and 32-bit color.
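To put the bandwidth argument into rough numbers, here is a back-of-the-envelope sketch. The figures are illustrative assumptions on my part, not measurements: a claimed fill rate of 800 Mpixels/s and 166 MHz DDR memory on a 128-bit bus, roughly the GeForce2 GTS's published specs. The model counts only a color write plus a Z read and a Z write per pixel, ignoring texture fetches and overdraw:

    /* Back-of-the-envelope bandwidth model -- illustrative figures
       only, not measured data. Assumed specs (roughly the published
       GeForce2 GTS numbers): 800 Mpixels/s claimed fill rate and
       166 MHz DDR memory on a 128-bit bus (~5.3 GB/s peak). Per
       rendered pixel we count one color write, one Z read and one
       Z write; texture fetches and overdraw are ignored. */
    #include <stdio.h>

    int main(void)
    {
        const double fill_rate = 800e6;          /* pixels/s, claimed       */
        const double peak_bw   = 166e6 * 2 * 16; /* DDR, 128-bit -> bytes/s */

        for (int depth = 16; depth <= 32; depth += 16) {
            double bytes_per_pixel = (depth / 8) * 3.0;     /* color + Z r/w */
            double needed_bw = fill_rate * bytes_per_pixel; /* bytes/s       */
            double sustainable = peak_bw < needed_bw ? peak_bw / needed_bw : 1.0;
            printf("%2d-bit color: needs %4.1f GB/s, have %3.1f GB/s "
                   "-> about %3.0f%% of claimed fill rate sustainable\n",
                   depth, needed_bw / 1e9, peak_bw / 1e9, 100.0 * sustainable);
        }
        return 0;
    }

In this crude model, 16-bit rendering fits within the memory budget, while 32-bit demands nearly twice the available bandwidth. That gap is exactly what HyperZ attacks.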
Things are a lot different when you look at the results in 32-bit color. You might be missing the scores at 1152x864 and 1280x1024, but believe me, as soon as the resolution exceeds 1024x768, Radeon is ahead of the rest, thanks to its HyperZ feature. The same goes for FSAA. Let's be honest: why should somebody who is so much into image quality that he uses FSAA settle for anything worse than 32-bit color?
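For readers curious how HyperZ actually saves bandwidth, here is a conceptual sketch of one of its ingredients, hierarchical Z-buffer rejection. Everything in it (the 8x8 tile size, the names, the data layout) is an illustrative assumption of mine, not ATi's actual hardware design:

    /* Conceptual sketch of hierarchical-Z rejection, one idea behind
       HyperZ. NOT ATi's actual implementation -- tile size, names and
       layout are illustrative assumptions. The Z-buffer is split into
       8x8 tiles; each tile remembers the farthest (maximum) depth
       stored so far. An incoming block of fragments whose nearest
       depth is behind that value cannot pass the depth test anywhere
       in the tile, so all per-pixel Z-buffer reads (and their memory
       traffic) are skipped. */
    #include <stdbool.h>
    #include <stdint.h>

    #define TILE 8

    typedef struct {
        uint16_t max_z;  /* farthest depth currently in this 8x8 tile */
    } ZTile;

    /* Depth test is "less than": smaller z is closer to the viewer. */
    bool tile_can_reject(const ZTile *tile, uint16_t incoming_min_z)
    {
        /* If the closest incoming fragment is still behind the farthest
           stored depth, every fragment in the block fails the test. */
        return incoming_min_z >= tile->max_z;
    }

    /* incoming_min_z is a conservative bound supplied by triangle setup. */
    void rasterize_block(ZTile *tile, const uint16_t z[TILE][TILE],
                         uint16_t zbuf[TILE][TILE], uint16_t incoming_min_z)
    {
        if (tile_can_reject(tile, incoming_min_z))
            return;  /* whole block culled: no Z reads, no writes */

        uint16_t new_max = 0;
        for (int y = 0; y < TILE; y++)
            for (int x = 0; x < TILE; x++) {
                if (z[y][x] < zbuf[y][x])  /* per-pixel test only if needed */
                    zbuf[y][x] = z[y][x];  /* (color write omitted here)    */
                if (zbuf[y][x] > new_max)
                    new_max = zbuf[y][x];
            }
        tile->max_z = new_max;  /* keep the tile's summary up to date */
    }

Together with Z compression and fast Z clears, this kind of early rejection is how HyperZ cuts the Z-buffer traffic that eats most of a card's memory bandwidth at high resolutions in 32-bit color.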
I personally like the Radeon, which is mainly due to the elegant 'HyperZ' feature with the stupid name. ATi has shown that the memory bandwidth issue can be tackled in a different way than with pure brute force. It's like a light and fast sports car that gets much better fuel consumption thanks to smart technology.
Radeon is indeed up there with the best of the crop when it comes to 32-bit performance, and the chip comes with a wealth of new 3D features. Additionally, you get the best integrated video, DVD and HDTV solution that money can buy right now. What is NVIDIA supposed to say? That 16-bit color is more important than 32-bit color? I don't think so, since it was NVIDIA who told us how important 32-bit color is back in 1999, when 3dfx's Voodoo3 was unable to support it.
I am pretty sure that Radeon's performance will further improve once the drivers have matured a bit. I certainly look forward to the luxurious 'All-In-One' Radeon that's supposed to be released in early fall of this year. The SDR Radeon might shake up NVIDIA's GeForce2 MX sales as well, because it is meant to offer performance close to the DDR Radeon's for a rather low price.
As I already said, I like the Radeon. I like it because I prefer intelligent technology to brute force. That's why I also prefer a Porsche 911 Turbo to a Dodge Viper.
Please follow up by reading the article Update: ATi's Industry Shaking Radeon revisited.