RAM Guide

JEDEC SDRAM

Any DRAM with a synchronous interface is known generically as SDRAM. This includes CDRAM (Cache DRAM), RDRAM (Rambus DRAM), ESDRAM (Enhanced SDRAM) and others; however, the type most often called SDRAM is the JEDEC-standard synchronous DRAM.

JEDEC SDRAM not only has a synchronous interface controlled by the system clock, it also includes a dual-bank architecture and a burst mode (bursts of 1, 2, 4 or 8 words, or a full page). A 'mode register' that can be set at power-on and changed during operation controls the burst length, burst type (sequential or interleave) and CAS latency (1, 2 or 3).
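
To make the mode register concrete, here is a minimal sketch of how a memory controller's firmware might pack those settings into the value written with a LOAD MODE REGISTER command. The bit layout assumed here (A0-A2 burst length, A3 burst type, A4-A6 CAS latency) follows common SDR SDRAM datasheets; that layout, the constant tables and the function name are illustrative assumptions, not something specified above.

    # Hypothetical sketch: packing SDR SDRAM mode-register fields.
    # The bit layout is an assumption taken from typical datasheets;
    # confirm it against the datasheet for the actual part.
    BURST_LENGTH_CODES = {1: 0b000, 2: 0b001, 4: 0b010, 8: 0b011, "full": 0b111}
    BURST_TYPE_CODES = {"sequential": 0, "interleave": 1}

    def encode_mode_register(burst_length, burst_type, cas_latency):
        """Return the value driven onto the address pins for LOAD MODE REGISTER."""
        if cas_latency not in (1, 2, 3):
            raise ValueError("JEDEC SDRAM supports a CAS latency of 1, 2 or 3")
        value = BURST_LENGTH_CODES[burst_length]        # A0-A2: burst length
        value |= BURST_TYPE_CODES[burst_type] << 3      # A3:    burst type
        value |= cas_latency << 4                       # A4-A6: CAS latency
        return value                                    # A7-A9 left at 0 (standard operation)

    # Burst length 4, sequential bursts, CAS latency 2:
    print(bin(encode_mode_register(4, "sequential", 2)))  # prints 0b100010 (decimal 34)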

CAS latency is one of several performance-related timings for SDRAM. It is the number of clock cycles between the column address being strobed in (the read command) and the first piece of data appearing on the outputs. When a burst read cycle is initiated, the addresses are set up and RAS\ and CS\ (chip select) are held low on the next clock cycle (rising edge of CLK), thereby activating the bank and its sense amplifiers. A period of time equal to tRCD (RAS\ to CAS\ delay) must pass, after which CAS\ and CS\ are held low (again, at the next clock cycle). After the time period for tCAC (column access time) has passed, the first bit of data is on the output line and can be retrieved (at the next clock cycle). The basic rule is that the CAS latency times the clock period (tCLK) must be equal to or greater than tCAC (or CL x tCLK >= tCAC). This means that the column access time is the limiting factor for CAS latency.
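
As a quick check of the CL x tCLK >= tCAC rule, the sketch below computes the smallest CAS latency usable at a given clock speed. The function name and the 25 ns tCAC figure are illustrative assumptions, not values taken from the text.

    import math

    def min_cas_latency(clock_mhz, tcac_ns):
        """Smallest CL satisfying CL x tCLK >= tCAC, with tCLK = 1000 / clock_mhz in ns."""
        tclk_ns = 1000.0 / clock_mhz
        return math.ceil(tcac_ns / tclk_ns)

    # Assuming a hypothetical tCAC of 25 ns:
    print(min_cas_latency(66, 25))   # 2 -> CL2 suffices at 66 MHz (2 x 15.2 ns >= 25 ns)
    print(min_cas_latency(100, 25))  # 3 -> CL3 needed at 100 MHz (3 x 10 ns >= 25 ns)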

SDRAM was initially introduced as the answer to all performance problems; however, it quickly became apparent that there was little performance benefit and a lot of compatibility problems. The first SDRAM modules contained only two clock lines, but it was soon determined that this was insufficient. This created two different module designs (2-clock and 4-clock), and you needed to know which one your motherboard required. Though the timings were theoretically supposed to be 5-1-1-1 @ 66 MHz, many of the original SDRAM modules would only run at 6-2-2-2 when used in pairs, mostly because the chipsets (i430VX, SiS5571) had trouble with the speed and with coordinating accesses between modules. The i430TX chipset and later non-Intel chipsets improved upon this, and the SPD (serial presence detect) chip was added to the standard so chipsets could read the timings from the module. Unfortunately, for quite some time the SPD EEPROM was either not included on many modules or not read by the motherboards.

SDRAM chips are officially rated in MHz, rather than nanoseconds (ns), so that there is a common denominator between the bus speed and the chip speed. The equivalent nanosecond rating is determined by dividing 1 second (1 billion ns) by the clock speed of the chip: for example, a 67 MHz SDRAM chip is rated as 15 ns. Note that this nanosecond rating does not measure the same timing as an asynchronous DRAM chip's rating. Remember, internally all DRAM operates in a very similar manner, and most performance gains are achieved by 'hiding' the internal operations in various ways.
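
The conversion itself is simple arithmetic; the small sketch below just restates the rule in code. The function name is an illustrative assumption, and the 83 MHz and 100 MHz figures come from the module speeds discussed below.

    def mhz_to_ns(clock_mhz):
        """Cycle time in nanoseconds: 1 second (1,000,000,000 ns) divided by the clock rate."""
        return 1_000_000_000 / (clock_mhz * 1_000_000)

    print(round(mhz_to_ns(67)))   # ~15 ns, the example above
    print(round(mhz_to_ns(83)))   # ~12 ns
    print(round(mhz_to_ns(100)))  # 10 ns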

The original SDRAM modules used either 83 MHz chips (12 ns) or 100 MHz chips (10 ns); however, these were only rated for 66 MHz bus operation. Because of the delays introduced in synchronizing the various signals, the 100 MHz chips in many cases produce a module that operates reliably at only about 83 MHz. These SDRAM modules are now called PC66, to differentiate them from those conforming to Intel's PC100 specification.