Last year was rich in new developments in the world of 3D graphics. There was the first real update of the Nvidia architecture, the GT200. We also witnessed AMD’s return to the big leagues with the RV770 and its radically different approach, as the firm bet everything on its small, efficient chip rather than a larger, more complex design.
On the application programming interface (API) side, the first details of DirectX 11 were revealed and OpenGL 3 was made available. But even with all of that going on, the event that really marked the year was Intel’s official introduction of Larrabee at SIGGRAPH.
What’s all the excitement about? Simply: the fact, as we’ll see in this article, that Larrabee is radically different from any GPU currently available, and as such, it’s intriguing. Enthusiasts are wondering if this is going to be the design that changes the perception of how well Intel and graphics go together.
It also marks Intel’s return to the high-end GPU market after it killed off the i740 some 10 years ago. That retreat coincided with the end of the golden age in PC graphics, when numerous companies were struggling to reach the top of the GPU market before disappearing one after the other, or else refocusing on a less competitive sector. Today, the high-end GPU market boils down to AMD and Nvidia, and the few attempts to change that state of affairs have met with failure. Matrox, with its Parhelia almost seven years ago, XGI with its Volari, and 3DLabs with its Realizm all threw in the towel, and with good reason--modern GPUs are extremely complex and, consequently, require considerable investment and skills that only a few companies can afford.


But as I said before, if the drivers aren't good, even the best hardware design is for nothing. I hope Intel invests more in the software side of things; it would be nice to have a third player.
Does the dual-core NV330 have the same sort of ring architecture?
Just thought I'd point out that with the current AMD vs. Intel fight, if Intel takes away the x86 license, AMD will take its multithreading and HT tech back, leaving Intel without a CPU and with a useless GPU.
I wonder just how compatible Larrabee is going to be with older games?
That would be FANTASTIC! Maybe the same people who make the Omega drivers could make alternate Larrabee drivers? We all know Intel sucks balls at drivers.
Yeah, but that 50% includes all the integrated cards that most consumers don't even realize they're buying; it doesn't apply to discrete cards. I'd like to see a bit more competition on the discrete side.
Umm, what makes you think that AMD pioneered multithreading? And Intel doesn't use HyperTransport, so they can't take it away.
I really don't see the first gen being successful; it's not like AMD and Nvidia are goofing around waiting for Intel to join up and show them a real GPU. Although there are no numbers on this that I've seen, I'm thinking Larry's going to have a pretty big die size to fit all those mini-cores, so it had better perform, because it will cost a decent sum.