Tim Sweeney Laments Intel Larrabee Demise

Intel's decision to shelve its graphics chip development project, codenamed Larrabee, caught some off guard. While the project was repeatedly delayed and never appeared able to compete with the best that AMD and Nvidia had to offer, it did have flashes of brilliance that could have changed the way many thought about GPGPUs.

Epic Games' Tim Sweeney, best known for his work on the widely used Unreal Engine, told Bright Side of News why he was excited about Larrabee.

"I see the instruction set and mixedscalar/vector programming model of Larrabee as the ultimate computing model, delivering GPU-class numeric computing performance and CPU-class programmability with an easy-to-use programming model that will ultimately crush fixed-function graphics pipelines," Sweeney said, adding that Intel's technology would be revolutionary whether sold as an add-in card, an integrated graphics solution, or part of the CPU die.

GPU makers today may boast about how many Teraflops their chips can pull off, but Sweeney says that focusing on pure performance "misses a larger point about programmability."

"Today's GPU programming models are too limited to support large-scale software, such as a complete physics engine, or a next-generation graphics pipeline implemented in software," Sweeney told bsn. "No quantity of Teraflops can compensate for a lack of support for dynamic dispatch, a full C++programming model, a coherent memory space, etc."

Stay tuned for more reaction.


    Top Comments
  • Honis
    Quote:
    "No quantity of Teraflops can compensate for a lack of support for dynamic dispatch, a full C++programming model, a coherent memory space, etc."
    Isn't the whole point of DirectX and OpenGL to make graphics cards and all their teraflops easier to access and utilize by making standard programming APIs?
    14
  • scryer_360
    I see what Tim is saying, but don't ATi and nVidia both have GPGPUs in the pipeline?
    10
  • Other Comments
  • omnimodis78
    It was a joke from day one! Intel graphics...it even sounds comical.
    8
  • DjEaZy
    ... just look on the graphics card hierarchy chart...
    http://www.tomshardware.com/reviews/geforce-310-5970,2491-7.html
    5
  • LORD_ORION
    I don't understand why everyone thinks this is shelved? They aren't selling it retail. They are selling (giving?) generation 1 as a stream processor add-on (like CUDA) to intel partners so they can learn it. They will then try and get gen 2 up to spec and sell it at retail.
    3