Tim Sweeney Laments Intel Larrabee Demise

Intel's decision to shelve its graphics chip development project, codenamed Larrabee, caught some off guard. While the project was repeatedly delayed and didn't appear to compete with the best that AMD and Nvidia had to offer, it showed flashes of brilliance that could have changed the way many thought about GPGPUs.

Epic Games' Tim Sweeney, best known for his work on the widely used Unreal Engine, told Bright Side of News why he was excited about Larrabee.

"I see the instruction set and mixedscalar/vector programming model of Larrabee as the ultimate computing model, delivering GPU-class numeric computing performance and CPU-class programmability with an easy-to-use programming model that will ultimately crush fixed-function graphics pipelines," Sweeney said, adding that Intel's technology would be revolutionary whether sold as an add-in card, an integrated graphics solution, or part of the CPU die.

GPU makers today may boast about how many teraflops their chips can pull off, but Sweeney says that focusing on pure performance "misses a larger point about programmability."

"Today's GPU programming models are too limited to support large-scale software, such as a complete physics engine, or a next-generation graphics pipeline implemented in software," Sweeney told bsn. "No quantity of Teraflops can compensate for a lack of support for dynamic dispatch, a full C++programming model, a coherent memory space, etc."

Stay tuned for more reaction.


Marcus Yam
Marcus Yam served as Tom's Hardware News Director from 2008 to 2014. He entered tech media in the late 90s and fondly remembers the days when an overclocked Celeron 300A and Voodoo2 SLI comprised a gaming rig with the ultimate street cred.
  • omnimodis78
    It was a joke from day one! Intel graphics...it even sounds comical.
  • DjEaZy
    ... just look on the graphics card hierarchy chart...
    http://www.tomshardware.com/reviews/geforce-310-5970,2491-7.html
  • LORD_ORION
    I don't understand why everyone thinks this is shelved. They aren't selling it retail. They are selling (giving?) generation 1 as a stream processor add-on (like CUDA) to Intel partners so they can learn it. They will then try to get gen 2 up to spec and sell it at retail.
  • Honis
    "No quantity of Teraflops can compensate for a lack of support for dynamic dispatch, a full C++programming model, a coherent memory space, etc."
    Isn't the whole point of DirectX and OpenGL to make graphics cards and all their teraflops easier to access and utilize by making standard programming APIs?
  • scryer_360
    I see what Tim is saying, but don't ATi and nVidia both have GPGPUs in the pipeline?
  • omnimodis78
    LORD_ORION: "I don't understand why everyone thinks this is shelved. They aren't selling it retail. They are selling (giving?) generation 1 as a stream processor add-on (like CUDA) to Intel partners so they can learn it. They will then try to get gen 2 up to spec and sell it at retail."
    Well, Larrabee in all its glory has been canned by the board of directors, and having a dedicated piece of hardware replaced by a software development platform brings a certain finality to this fiasco. Not saying that the specs didn't look good on paper, but if we only praised blueprints then nVidia should take home the crown for the best current graphics card... Sorry to say, but that's not how things work.
  • tommysch
    Just make my 48 HTed core chip and leave the GPU market to nVidia/ATI. I want it ASAP!
  • dark_lord69
    omnimodis78: "It was a joke from day one! Intel graphics... it even sounds comical."
    I think an on-die GPU working as one of the CPU cores is a genius idea and would allow amazing graphical capabilities.
  • zerapio
    omnimodis78: "It was a joke from day one! Intel graphics... it even sounds comical."
    I really don't get why people expect a $7 piece of silicon to perform the same as a ~$15 or even a $100 piece from ATI & NVIDIA. Grab the list from DjEaZy and divide the performance of each card by the silicon size, cost or power requirements and you'll see why they have about 90% of the laptop market. It's not good enough to play games or do anything intensive (by a LONG shot) but I think it does a lot with very little.
  • Miharu
    It's just logic. Larrabee has instruction-level support for mixed scalar/vector work. Object-oriented C++ always costs more by its nature (everything is an object), so before the CPU/GPGPU can apply scalar/vector operations to an object it first has to understand that object.
    So your CPU/GPGPU does 2-3 times more work (for a simple object; a complex one could need 10 times more) running that kind of code.

    It's an interesting feature for a small revolution (at low CPU cost), but PC games won't see the difference... you use ATi or Nvidia.
    So this just opens the door for ATi and Nvidia to make scalar/vector functions at low cost.