A Chaotic Start
Contrary to popular belief, plasma is not a recent technology, even though its industrialization didn't take off until the early 1990s. Research on plasma displays began in the United States over four decades ago, in 1960. The technology was developed by four researchers: Bitzer, Slottow, Willson, and Arora. The first prototype arrived quickly, in 1964: a matrix that was revolutionary for its time, consisting of 4 x 4 pixels emitting monochrome blue light. Then, in 1967, plasma matrices grew to 16 x 16 pixels, this time emitting a pale red monochrome light produced by neon gas.
Naturally, the technology interested manufacturers, and companies such as IBM, NEC, Fujitsu, and Matsushita jumped on the bandwagon starting in 1970. Unfortunately, in the absence of industrial outlets for the technology, development in the USA came to an almost complete halt in 1987; the last company to throw in the towel was the giant IBM. A handful of American researchers stuck with the technology, but research continued mainly on the other side of the Pacific, in Japan. The first commercial models reached the market in the early 1990s, with Fujitsu among the first to break the 21" barrier.
Today, most major consumer electronics manufacturers offer plasma panels, including LG, Pioneer, Philips, and Hitachi.