Intel’s re-entry into the market has generated a lot of excitement and new expectations. Intel is one of the few companies with both the financial wherewithal to get into the game and the requisite technology war chest to develop cutting-edge discrete GPUs. Intel is thus the only competitor that can realistically shake up the established pecking order in this market. Nonetheless, we should avoid any tendency to get carried away.
A decade ago, many people were thinking along the same lines about Intel’s arrival on the scene, and hopes were high. At that time, the market was divided into two categories: start-ups that offered first-rate 3D performance (3Dfx, Nvidia, and PowerVR) and heavy hitters who seemed to think that 3D acceleration was just a gadget (Matrox, S3, and ATI before AMD purchased it).
A lot of people hoped that Intel’s arrival would improve the health of the market by providing the assurance of a well-known name and unheard-of performance. Admittedly, Intel had a trump card up its sleeve: it had just bought out Real3D, famous for its work on the Sega Model 2 and Model 3 arcade boards and at the top of the heap at that time. Yet, despite its qualities, the i740 didn’t really live up to all the expectations that were riding on it, and cards like the Voodoo² and TNT quickly eclipsed its performance. Rather than continue to fight it out on this market, Intel decided to use the technology in its chipsets to offer an integrated platform--a strategy that turned out to work well.
Things might have stayed the way they were if certain researchers, given the trend toward increasing the programmability of GPUs, hadn’t had the slightly offbeat idea of using the GPU as a massively parallel general-purpose calculator.
While no one took the idea very seriously initially, it quickly garnered a lot of support--to the point where Nvidia and AMD, smelling an opportunity to conquer a new market, began promoting it heavily when launching their new GPUs. That was more than Intel could take. It wasn’t about to let GPUs start horning in on its bread and butter: the GPU, by taking over the most compute-intensive tasks, threatened to make Intel’s high-end processors superfluous. The arrival of the GPU had already diminished the importance of high-end CPUs in the gaming world, and it was essential for Intel to avoid having the same thing happen this time (and with broader consequences).
So it decided to react and offer an alternative--and as is often the case with Intel, that alternative was bound to be based on the old tried-and-true x86 architecture.