Dot Pitch Mythology
When evaluating a monitor's quality, most people talk about dot pitch. In general, the lower the dot pitch (measured in millimeters), the better the monitor. The problem is that dot pitch can be measured in several different ways, so the number alone doesn't necessarily mean much. Traditionally, a shadow mask CRT's dot pitch is the distance between two same-colored phosphor dots, measured diagonally from one scan line to the next. An aperture grille CRT, however, has no dots (only stripes), so its dot pitch (more accurately, stripe pitch) is measured horizontally, between two same-colored stripes. For marketing purposes, shadow mask manufacturers started quoting horizontal dot pitch, too. A few companies publish their mask pitch instead; but since the mask sits about 1/2" behind the phosphor surface of the screen, a .21mm mask pitch might actually translate into a .22mm phosphor dot pitch by the time the beam strikes the screen. Finally, because CRT tubes are nearly (but not completely) flat, the electron beam spreads into an oval shape as it approaches the edges of the tube, so some manufacturers widen the dot pitch toward the edges. Some manufacturers publish two dot pitch measurements: one for the center of the screen and one for the outermost edges.
While the number and tightness of triads (or stripes) determines the optimum resolution, a monitor's ability to aim each electron beam precisely at its intended phosphor (called "convergence") is even more important in creating a sharp image. If the convergence isn't quite right, an electron beam aimed at a blue phosphor (or stripe) might strike part of the red or green phosphor next to it. Horizontal or vertical convergence errors (measured in tenths of millimeters) can cause color ghosting, making text harder to read and fine detail appear fuzzy. Unfortunately, only a few monitor manufacturers publish their convergence specifications.
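The link between pitch and optimum resolution can be sketched with a back-of-the-envelope calculation: a monitor can't usefully display more pixels across than it has same-colored triads across. The figures below (a 19" CRT with roughly 365 mm of viewable width and a 0.25 mm horizontal pitch) are example numbers, not from the text:

```python
def max_horizontal_pixels(viewable_width_mm, horizontal_pitch_mm):
    """Rough upper bound on useful horizontal resolution:
    one pixel per same-colored triad (or stripe) across the screen."""
    return int(viewable_width_mm / horizontal_pitch_mm)

# Hypothetical 19" CRT: ~365 mm viewable width, 0.25 mm horizontal pitch
print(max_horizontal_pixels(365, 0.25))  # -> 1460
```

By this estimate, such a tube could resolve about 1460 distinct columns, so 1280 x 1024 would be comfortable while 1600 x 1200 would push past the phosphor structure; remember, though, that poor convergence can blur the image well before this limit is reached.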
Matching Monitors To Graphics Cards
Monitors (and TV sets) are essentially analog devices. No matter how much you try to manipulate the tube components, they are still analog. However, computers operate in a digital world, so at some point (usually within the graphics board) the digital images created in the computer must be converted to analog signals in order for the monitor to display anything.
The key component in a graphics card that determines how well (or whether) it can drive a monitor is the speed of its RAMDAC (Random Access Memory Digital-to-Analog Converter), measured in megahertz. The RAMDAC is usually a separate component on the graphics board, though sometimes it is incorporated into the graphics chip, and sometimes (as with DVI, the Digital Visual Interface) it sits in the monitor.
The following table shows the minimum RAMDAC speed (in MHz) required for different resolutions running at different refresh rates.
| Resolution  | 60 Hz | 75 Hz | 85 Hz | 95 Hz | 100 Hz |
|-------------|-------|-------|-------|-------|--------|
| 2048 x 1536 | 270   | 338   | 380   |       |        |
| 1856 x 1392 | 226   | 282   | 320   | 357   |        |
| 1600 x 1200 | 162   | 203   | 230   | 250   | 280    |
| 1280 x 1024 | 108   | 135   | 158   | 176   | 186    |
| 1024 x 768  | 65    | 79    | 95    | 106   | 112    |
| 800 x 600   | 40    | 50    | 56    | 63    | 67     |
| 640 x 480   | 25    | 32    | 36    | 40    | 42     |