LCD Or Plasma - What's Your Pleasure? Understanding Modern Flat-Panel TV Technologies
Plasma Technology
A Chaotic Start
Contrary to what most people think, plasma is not really a recent technology, even though it wasn't industrialized on a large scale until the early 1990s. Research on plasma displays began in the United States over four decades ago, in 1960. The technology was developed by four researchers: Bitzer, Slottow, Willson, and Arora. The first prototype followed quickly, in 1964: a matrix of 4 x 4 pixels, revolutionary for its time, emitting monochrome blue light. Then, in 1967, plasma matrices grew to 16 x 16 pixels, this time emitting a pale red light, still monochrome, produced by neon.
The technology naturally attracted manufacturers, and companies like IBM, NEC, Fujitsu, and Matsushita jumped on the bandwagon starting in 1970. Unfortunately, in the absence of commercial applications for the technology, development in the USA had all but ceased by 1987, when IBM, the last company still involved, finally threw in the towel. A handful of researchers stuck with the technology in the US, but from then on research continued mainly on the other side of the Pacific, in Japan. The first commercial models reached the market in the early 1990s, with Fujitsu among the first to break the 21" barrier.
Today, most major consumer electronics manufacturers offer plasma panels, including LG, Pioneer, Philips, and Hitachi.