Larrabee: Intel's New GPU

Introduction

Last year was rich in new developments in the world of 3D graphics. There was the first real update of the Nvidia architecture, the GT200. We also witnessed AMD’s return to the big leagues with the RV770 and its radically different approach, as the firm bet everything on its small, efficient chip rather than a larger, more complex design.

On the application programming interface (API) side, the first details of DirectX 11 were revealed and OpenGL 3 was made available. But even with all of that going on, the event that really marked the year was Intel’s official introduction of Larrabee at SIGGRAPH.

What’s all the excitement about? Simply that, as we’ll see in this article, Larrabee is radically different from any GPU currently available, and that makes it intriguing. Enthusiasts are wondering whether this is the design that will change the perception of how well Intel and graphics go together.

It also marks Intel’s return to the high-end GPU market after it killed off the i740 some 10 years ago. That retreat coincided with the end of the golden age of PC graphics, when numerous companies were struggling to reach the top of the GPU market before disappearing one after the other, or else refocusing on less competitive sectors. Today, the high-end GPU market boils down to AMD and Nvidia, and the few attempts to change that state of affairs have met with failure. Matrox with its Parhelia almost seven years ago, XGI with its Volari, and 3DLabs with its Realizm all threw in the towel, and with good reason: modern GPUs are extremely complex and, consequently, require investment and skills that only a few companies can afford.

  • thepinkpanther
    Very interesting. I know Nvidia can't settle for being second best. As always, it's good for the consumer.
  • IzzyCraft
    Yes, interesting, but Intel already makes something like 50% of all GPUs. I'd rather not see them take even more market share and push Nvidia and AMD out, though I doubt that will happen unless they can make a real performer. On paper I have no doubt they can, but with drivers and the rest, I doubt it.
  • I wonder if their aim is to appeal to the gamer market and compete on high-end games?
  • Alien_959
    Very interesting; finally some more information about Intel's upcoming "GPU".
    But as I said before, if the drivers aren't good, even the best hardware design is for nothing. I hope Intel invests more on the software side of things; it will be nice to have a third player.
  • crisisavatar
    Cool. I'll wait for Windows 7 for my next build and hope to see some DirectX 11 and OpenGL 3 support by then.
  • Stardude82
    Maybe there is more than a little commonality with the Atom CPUs: in-order execution, hyper-threading, low power/small footprint.

    Does the dual-core NV330 have the same sort of ring architecture?
  • "Simultaneous Multithreading (SMT). This technology has just made a comeback in Intel architectures with the Core i7, and is built into the Larrabee processors."

    Just thought I'd point out that, with the current AMD vs. Intel fight, if Intel takes away the x86 license, AMD will take its multithreading and HT tech back, leaving Intel without a CPU and with a useless GPU.
  • liemfukliang
    Drivers. If Intel makes drivers as bad as the Intel Extreme ones, then even if Intel can make a faster and cheaper GPU, it will be useless.
  • IzzyCraft
    Hoping for an Omega Drivers equivalent, lol?
  • phantom93
    Damn, I hoped there would be some pictures :(. Looks interesting; I didn't read the full article, but I hope it's cheaper so some of my friends with regular desktops can join in some original hardcore PC gaming. XD