PC Graphics Beyond XBOX - NVIDIA Introduces GeForce4

Conclusion GeForce4 Ti

Now let's get to the GeForce4 Ti line of cards. These cards represent the pinnacle of today's 3D technology. Two vertex shaders and an extremely efficient memory subsystem speak for themselves. Each card holds its own in the benchmarks, and each beats its GeForce3-branded predecessor. This is impressive indeed!

It doesn't really matter which GeForce4 Ti card you choose - you will always be a winner. If you can only spend $199, then take the GeForce4 Ti4200 and be merry, because it is still faster than the former performance leader, the GeForce3 Ti500. Speaking of which - I hope that no dealer still has those cards in stock. The GeForce3 Ti500 can now only sell at price points of $149-179, a steep drop from the $299 it commanded only yesterday. Who would pay that when a faster product costs $100 less?

The GeForce4 Ti4400 is a good product, too. It's well ahead of the GeForce3 Ti500 and its little brother, the GeForce4 Ti4200, and at $299 it's still acceptably priced. Only the GeForce4 Ti4600 is for people who don't care how much money they spend - those who simply want the best. For a hefty $399, they'll get the best 3D card that money can buy right now.

Some moaners are criticizing GeForce4 Ti for its lack of genuinely new technologies. I would like to remind those people of the performance jump we are seeing here. Isn't that what really counts? GeForce4 Ti has so much power that it can run with anti-aliasing enabled in virtually any game available today. Features alone don't win customers - they have to make sense, and the performance has to be right, too. GeForce4 Ti is a clear winner here.

What I'd like to rant about is the unwise decision to use the same name for both the new top-performing 3D chip and the technologically backward value product. The name 'GeForce4' lost its meaning on the very day of its introduction. On the one hand it stands for top-notch 3D power; on the other, it stands for old technology from the days before GeForce3.

Last but not least, I'd like to commend NVIDIA for the decision to equip all of its new cards with dual-display capability. Two displays instead of one can indeed make your work more efficient. We know, of course, that NVIDIA did not exactly invent this idea - the credit goes to Matrox. It's too bad that only the GeForce4 MX was equipped with an integrated video encoder. I can only hope that NVIDIA will finally manage to bring its video output quality on par with S3 and ATi. So far, NVIDIA has been down in the dumps as far as that feature is concerned.