Graphics Beginners' Guide, Part 2: Graphics Technology

Anti-Aliasing

Aliasing is a term that describes the jagged or blocky patterns that appear when digital images are displayed. In graphics, it refers to the stair-step appearance of angled edges on the monitor. Anti-aliasing (abbreviated 'AA') is a graphics feature that reduces this effect. However, since anti-aliasing calculations use a fair amount of graphics processor power, enabling it will cause a significant drop in frame rates.
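
To make the idea concrete, here is a minimal sketch in Python of the supersampling approach to anti-aliasing. It is an illustration only, not how a GPU actually implements the feature: a shallow edge is 'rendered' once with a single sample per pixel (hard stair-steps) and once with 4x4 sub-pixel samples averaged together (softened edge).

```python
# Minimal supersampling sketch: average several sub-pixel samples per pixel
# to soften the stair-step edge that single-sample rendering produces.

def coverage(px, py, samples):
    """Fraction of samples x samples sub-pixel points below the edge y = x/2
    (an arbitrary shallow edge chosen purely for illustration)."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            x = px + (i + 0.5) / samples   # sub-pixel sample position
            y = py + (j + 0.5) / samples
            if y < 0.5 * x:
                hits += 1
    return hits / (samples * samples)

def render(width, height, samples):
    """Return rows of characters: darker characters mean more edge coverage."""
    shades = " .:-=+*#"
    rows = []
    for py in range(height):
        row = ""
        for px in range(width):
            c = coverage(px, py, samples)
            row += shades[min(int(c * len(shades)), len(shades) - 1)]
        rows.append(row)
    return rows

if __name__ == "__main__":
    print("1 sample per pixel (aliased):")
    print("\n".join(render(16, 8, 1)))
    print("\n4x4 sub-samples per pixel (anti-aliased):")
    print("\n".join(render(16, 8, 4)))
```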

Anti-aliasing relies heavily on graphics memory performance, so high-end graphics cards with fast memory can use anti-aliasing with less of a performance hit than low-end cards. Anti-aliasing can be enabled at different levels: 4x anti-aliasing produces higher-quality images than 2x anti-aliasing, but at a higher performance cost. The level describes how many samples are taken for each pixel, so 2x takes two samples per pixel and 4x takes four, which is why both the memory requirement and the performance cost rise with the level.
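
The memory pressure is easy to see with some back-of-the-envelope arithmetic. The sketch below assumes, purely for illustration, that 32-bit color and 32-bit depth/stencil are stored for every sample of every pixel in a multisampled framebuffer:

```python
# Rough illustration of why anti-aliasing is memory-hungry: the framebuffer
# grows with the number of samples stored per pixel.
# Assumed (not taken from any specific GPU): 4 bytes of color plus 4 bytes
# of depth/stencil per sample.

def aa_framebuffer_mib(width, height, samples, bytes_per_sample=8):
    """Approximate multisampled color + depth buffer size in MiB."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

if __name__ == "__main__":
    for samples in (1, 2, 4, 8):
        size = aa_framebuffer_mib(1920, 1080, samples)
        print(f"{samples}x AA at 1920x1080: ~{size:.0f} MiB of framebuffer")
```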

Texture Filtering

All 3D objects in a video game are textured, and as a texture is viewed at an increasingly oblique angle, it appears more and more blurry and distorted in the game. To combat this effect, texture filtering was introduced for graphics processors.
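
A rough, idealized way to see why this happens: as a surface tilts away from the viewer, a single screen pixel covers roughly 1/cos(angle) texels along the tilt direction, so one texture sample per pixel no longer represents everything the pixel covers. The numbers below come from that simplified model, not from a real renderer:

```python
# Simplified illustration of how the texel footprint of one screen pixel
# stretches as a surface tilts away from the viewer.

import math

for degrees in (0, 30, 60, 75, 85):
    stretch = 1.0 / math.cos(math.radians(degrees))
    print(f"viewing angle {degrees:2d} deg -> pixel covers ~{stretch:4.1f}x "
          "more texels along the tilt direction")
```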

The earliest texture filtering method, bilinear filtering, produced fairly obvious 'bands' where one mipmap level switched abruptly to the next. Trilinear filtering improved on the bilinear technique by blending between adjacent mipmap levels, hiding those transitions. Both of these filtering options carry essentially no performance cost on modern graphics cards.
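
For illustration, here is a minimal Python sketch of bilinear filtering, with trilinear filtering shown as a blend between bilinear samples from two mipmap levels. The tiny texture, the made-up mipmap, and the sample coordinates are all hypothetical; real GPUs do this in dedicated hardware:

```python
# Bilinear filtering: blend the four texels nearest a fractional coordinate.
# Trilinear filtering: blend bilinear samples from two adjacent mipmap levels.

def bilinear(texture, u, v):
    """Sample a grid of texel values at fractional coordinates (u, v)."""
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(texture[0]) - 1)
    y1 = min(y0 + 1, len(texture) - 1)
    fx, fy = u - x0, v - y0
    # Blend the four nearest texels horizontally, then vertically.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

def trilinear(mip0, mip1, u, v, blend):
    """Blend bilinear samples from a full-size and a half-size mipmap level."""
    return (bilinear(mip0, u, v) * (1 - blend)
            + bilinear(mip1, u / 2, v / 2) * blend)

if __name__ == "__main__":
    tex = [[0, 0, 255, 255],
           [0, 0, 255, 255],
           [255, 255, 0, 0],
           [255, 255, 0, 0]]
    mip = [[64, 191],
           [191, 64]]                     # hypothetical half-resolution mipmap
    print(bilinear(tex, 1.5, 0.5))        # sample midway across the edge
    print(trilinear(tex, mip, 1.5, 0.5, 0.5))
```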

The best filtering now available is anisotropic filtering (often abbreviated AF). Like anti-aliasing, anisotropic filtering can be enabled at different levels; 8x AF, for example, produces superior filtering quality compared to 4x AF. Also like anti-aliasing, anisotropic filtering demands processing power, and it affects performance more and more as the AF level is raised.
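
A simplified model of what anisotropic filtering adds: when a pixel's footprint on a tilted surface becomes elongated in texture space, several samples are taken along that long axis and averaged, rather than one sample at the footprint's center. The sketch below uses a made-up texture and a plain nearest-texel lookup in place of the bilinear fetches real hardware would use:

```python
# Simplified anisotropic filtering model: average several texture samples
# spread along the elongated footprint axis of a pixel on a tilted surface.

def sample(texture, u, v):
    """Nearest-texel lookup with clamping (stand-in for a bilinear fetch)."""
    x = min(max(int(round(u)), 0), len(texture[0]) - 1)
    y = min(max(int(round(v)), 0), len(texture) - 1)
    return texture[y][x]

def anisotropic(texture, u, v, du, dv, taps):
    """Average 'taps' samples spread along the footprint axis (du, dv)."""
    total = 0.0
    for i in range(taps):
        t = (i + 0.5) / taps - 0.5           # offsets from -0.5 to +0.5
        total += sample(texture, u + du * t, v + dv * t)
    return total / taps

if __name__ == "__main__":
    tex = [[0, 64, 128, 192, 255]] * 5       # illustrative gradient texture
    # 2x vs 8x AF over a footprint stretched 4 texels along u:
    print(anisotropic(tex, 2.0, 2.0, 4.0, 0.0, 2))
    print(anisotropic(tex, 2.0, 2.0, 4.0, 0.0, 8))
```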

High Definition Texture Sets

All 3D video games are developed with target specifications, and one of these is the amount of texture memory the game will require. All of the required textures must fit into graphics card memory during play; otherwise performance suffers heavily, because the overflow textures have to be stored in slower system RAM, or even on the hard disk. So if a game's developers target 128 MB of memory as their minimum requirement, the textures built to that target are called a 'texture set', and that set will never require more than 128 MB of graphics card memory at any given time.
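
The budgeting itself is simple arithmetic. The sketch below uses entirely hypothetical texture counts, resolutions, and a compressed 1-byte-per-texel format; the only point is that the whole set has to fit within the targeted amount of graphics memory:

```python
# Illustrative estimate of a texture set's memory footprint: texels times
# bytes per texel, plus roughly one third extra for the mipmap chain.
# All counts, sizes, and formats here are hypothetical.

def texture_mib(width, height, bytes_per_texel, mipmaps=True):
    size = width * height * bytes_per_texel
    if mipmaps:
        size = size * 4 / 3              # a full mipmap chain adds roughly 33%
    return size / (1024 ** 2)

if __name__ == "__main__":
    budget_mib = 128
    total = (250 * texture_mib(512, 512, 1)      # compressed world textures
             + 20 * texture_mib(1024, 1024, 1))  # compressed detail textures
    print(f"Texture set: ~{total:.0f} MiB against a {budget_mib} MiB budget")
    print("fits" if total <= budget_mib else "does not fit")
```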

Newer games often ship with multiple texture sets, so that the game supports older graphics cards with less texture memory as well as the newest cards with the most onboard memory. For example, a game might include three texture sets: one each for 128 MB, 256 MB, and 512 MB of graphics memory. Games that take advantage of 512 MB of graphics memory are few and far between, but they are the most compelling reason to buy a graphics card with that much memory. Extra memory generally has a relatively minor effect on raw performance, but it can increase visual quality considerably if the game supports it.
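
In practice a game can simply pick the richest set that fits the installed memory. The set names and thresholds below are assumptions for illustration, not taken from any real engine:

```python
# Hypothetical sketch of choosing the largest texture set that fits the
# graphics card's onboard memory.

TEXTURE_SETS = {128: "low", 256: "medium", 512: "high"}

def pick_texture_set(vram_mib):
    """Return the richest texture set whose requirement fits in vram_mib."""
    fitting = [req for req in TEXTURE_SETS if req <= vram_mib]
    if not fitting:
        return None                      # below minimum requirements
    return TEXTURE_SETS[max(fitting)]

if __name__ == "__main__":
    for vram in (64, 128, 256, 320, 512):
        print(f"{vram} MiB card -> texture set: {pick_texture_set(vram)}")
```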

Follow up by reading Graphics Beginners' Guide, Part 1: Graphics Cards

Follow up by reading Graphics Beginners' Guide, Part 3: Graphics Performance
