Boosting the performance of computer graphics has always been something of an art, and there are various ways to do it. In recent years we have seen larger graphics processors hosting more pixel pipelines, as well as fragmented pipeline designs such as ATI's Radeon X1000 series; there have also been attempts to bolster 3D performance by deploying graphics cards in dual and quad configurations. The classic way, however, is through innovation in design and the introduction of new approaches and components, and ATI wants to prove once more that this approach still works. Welcome the Radeon X1950XTX.
Looking back, we can trace the beginning of computer graphics as we know it today. About ten years ago, 3D graphics started to gain traction: a drop in the price of EDO (Extended Data Out) memory chips made it affordable for add-in graphics card makers to bring their products to the mainstream.
From there, the next logical step comes from new applications that can exploit the upgraded hardware. Many application programming interfaces (APIs) have been developed, and they continue to mature as developers and engineers hatch new ideas to make everything quicker, better, and more efficient. From these growing demands for processing power, new innovations emerge.
In 1998 we saw several developments. The first was SLI (Scan Line Interleave), which let two 3Dfx Voodoo² 3D accelerator cards be hooked together and render a scene in tandem. The following year, the way scenes were rendered evolved through object manipulation via hardware transform and lighting engines, which gave birth to cards such as the GeForce 256.
Soon after that, graphics cards got another boost from the memory manufacturers, as memory-intensive graphics cards could tap into the power of DDR. Double Data Rate memory took some time to catch on, but DDR2 and the graphics-specific GDDR3 have now been with us for a while. This brings us to today: ATI unleashes its latest graphics champion, the Radeon X1950XTX, the first card to use GDDR4 memory.