THG Graphics Card Buyer's Guide

Image Quality

Image quality is a topic that would easily merit its own article, if not a book in its own right. What we mean here is the quality of the rendered 3D scene as it appears on the player's screen. This whole discussion was originally sparked by the tricks and tweaks that graphics card makers have begun building into their drivers. Their goal is to get the most performance out of their cards, and to this end, certain calculations are sometimes either skipped or simplified.

In principle, this is possible in many places without forcing the player to accept reduced image quality. Unfortunately, the chipmakers tend to do a bit too much tweaking, especially to win performance comparisons. The result is often visibly reduced image quality, noticeable at least to experienced users; casual gamers, on the other hand, may not notice anything at all. In our article (ATI's Optimized Texture Filtering Called Into Question) we took a look at a number of optimizations used by the graphics chip companies and explained how they work and what effect they have on image quality and 3D performance.
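
To make the idea concrete, here is a rough, hypothetical sketch of the kind of trilinear filtering shortcut discussed in that article. It is not any vendor's actual driver code; the sample_mip() stand-in, the blend window and the exact formula are assumptions chosen purely for illustration. Full trilinear filtering blends two adjacent mip levels across the whole transition, while the "optimized" variant only blends inside a narrow window and otherwise falls back to a single bilinear lookup, trading image quality for texture bandwidth.

```c
#include <math.h>
#include <stdio.h>

/* Stand-in for a real (bilinear) texture fetch from one mip level. */
static float sample_mip(int level)
{
    return 1.0f / (float)(1 << level);  /* made-up per-level value */
}

/* Full trilinear filtering: always blend the two adjacent mip levels. */
static float trilinear_full(float lod)
{
    int   base = (int)floorf(lod);
    float frac = lod - (float)base;
    return sample_mip(base) * (1.0f - frac) + sample_mip(base + 1) * frac;
}

/* "Optimized" filtering: blend only inside a narrow window around the
 * mip transition; everywhere else a single mip level is sampled. */
static float trilinear_optimized(float lod, float window)
{
    int   base = (int)floorf(lod);
    float frac = lod - (float)base;

    if (frac < 0.5f - window) return sample_mip(base);
    if (frac > 0.5f + window) return sample_mip(base + 1);

    float t = (frac - (0.5f - window)) / (2.0f * window);
    return sample_mip(base) * (1.0f - t) + sample_mip(base + 1) * t;
}

int main(void)
{
    for (float lod = 0.0f; lod < 1.0f; lod += 0.125f)
        printf("lod %.3f  full %.4f  optimized %.4f\n",
               lod, trilinear_full(lod), trilinear_optimized(lod, 0.15f));
    return 0;
}
```

The difference between the two columns is exactly the sort of thing that shows up as subtle banding at mip transitions on screen while boosting benchmark numbers.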

Here is an image quality comparison taken from the game FarCry using an older driver. In this driver, NVIDIA replaced some of the game's own shaders with highly optimized versions. The result is visibly reduced image quality on the one hand, but improved performance on the other.

Meanwhile, the chipmakers have learned that many users don't necessarily want such optimizations, especially if they are forced upon them. Anyone who pays $500 (or more) for a graphics card understandably expects the highest possible image quality. This is especially true considering that such optimizations aren't really essential anymore - the enthusiast cards are now more than fast enough to handle the highest quality settings. In response, NVIDIA and ATI now allow most of these optimizations to be switched off in their most recent drivers.

Another reason for reduced image quality can be the use of reduced floating-point precision in DirectX 9 games. A good example of this is FarCry. NVIDIA's GeForce FX cards render most of the game's shaders using only 16-bit precision, which leads to pronounced visual artifacts (see also: Shader Quality - FarCry). While NVIDIA has addressed these quality issues with newer drivers, frame rates have taken a nosedive as a result (FarCry - Very High). NVIDIA was only able to overcome this performance handicap in DirectX 9 games with the new GeForce 6xxx line.
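
To illustrate why reduced precision matters, here is a small, hedged example (not NVIDIA's actual shader code) that emulates FP16's roughly 10-bit mantissa in plain C. The to_fp16_precision() helper is an approximation invented for this sketch: it rounds the significand to half-precision width but ignores FP16's limited exponent range. Applied to a large texture coordinate, it shows how small per-pixel steps get swallowed by the coarser quantization, which can appear on screen as banding or shimmering.

```c
#include <math.h>
#include <stdio.h>

/* Approximate a value as FP16 would store it: keep ~11 significant
 * bits (1 implicit + 10 stored), ignore the reduced exponent range. */
static float to_fp16_precision(float x)
{
    if (x == 0.0f) return 0.0f;
    int e;
    float m = frexpf(x, &e);            /* x = m * 2^e, 0.5 <= |m| < 1 */
    m = roundf(m * 2048.0f) / 2048.0f;  /* quantize the significand    */
    return ldexpf(m, e);
}

int main(void)
{
    /* A texture coordinate around 512: at this magnitude FP16 can only
     * resolve steps of 0.5, so finer per-pixel increments collapse. */
    float coord = 512.0f;
    for (int i = 0; i < 8; i++) {
        float step = 0.05f * (float)i;
        float fp32 = coord + step;
        float fp16 = to_fp16_precision(coord + step);
        printf("step %.2f  fp32 %.4f  ~fp16 %.4f  error %+.4f\n",
               step, fp32, fp16, fp32 - fp16);
    }
    return 0;
}
```

The higher-precision path that avoids these artifacts is also what costs the GeForce FX its frame rate, as noted above.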

For some further reading about image quality, check out these articles:

Since the image quality produced by a card can change with literally every driver release, we recommend staying informed by reading our reviews of new card generations, in which we also regularly test image quality.