How Much RAM Does Your Graphics Card Really Need?

Introduction

Advertisers love numbers because they are a simple, straightforward way to convey the idea of improvement. For example, version 2.0 is always better than version 1.0, a clock speed of three gigahertz simply must be faster than two gigahertz, and four gigabytes of RAM are better than three gigabytes. Rarely will somebody challenge the universally accepted truth that more is better.

Unfortunately, the real world is a lot more complex than the simple numbers suggest. Sometimes version 2.0 loses the elegant interface that made version 1.0 so compelling. Sometimes 3 GHz clock speeds are slower than 2 GHz if they are based on an inferior architecture. And sometimes, more RAM doesn't make a difference.

Graphics card manufacturers have been exploiting the amount of RAM as a marketing tool since the very beginning. Back in the day, you needed a certain amount of RAM on the graphics card simply to drive a resolution like 1024x768. As time went on and 3D accelerators emerged, RAM on the graphics card was employed to store textures and enable features like anti-aliasing (AA), post-processing, and normal mapping.
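To put that old requirement in perspective, here's a rough back-of-the-envelope sketch (our own illustration, assuming 32-bit color and double buffering, which early cards didn't always use) of how much memory a bare framebuffer consumes at that resolution:

# Illustrative estimate only: bytes needed to hold the display framebuffer.
# Assumptions: 32-bit color (4 bytes per pixel) and double buffering;
# real cards also reserve memory for a Z-buffer, textures, and so on.
def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers

size = framebuffer_bytes(1024, 768)
print(f"1024x768, 32-bit, double-buffered: {size / (1024 ** 2):.1f} MB")
# Prints 6.0 MB, before any textures or geometry are stored at all.

The exact numbers depend on color depth and buffering scheme, but the point stands: raising the resolution or enabling extra buffers eats into a fixed pool of on-card memory, which is why the amount of RAM became such an easy specification to advertise.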

The focus of this article isn't to dig into the minutiae of where your graphics card RAM is being used. Instead, we're more interested in looking at the tangible impact that different amounts of graphics card RAM will have on your gaming experience. Our goal is to let you know exactly what advantage, if any, you can expect from a graphics card that has more RAM on board.

Having said that, there are a few important concepts we'll need to cover before any of this will make sense, so let's get started.