
Tim Sweeney Laments Intel Larrabee Demise

Source: Tom's Hardware US | 30 comments

Unreal Engine programmer thinks Larrabee could have been something cool.

Intel's decision to shelve its graphics chip development project codenamed Larrabee caught some off guard. While the project was often delayed and didn't appear to compete on the same level as the best that AMD and Nvidia have to offer, it did have flashes of brilliance that could have changed the way many thought of GPGPUs.

Epic Games' Tim Sweeney, best known for his work on the widely used Unreal Engine, told Bright Side of News why he was excited about Larrabee.

"I see the instruction set and mixedscalar/vector programming model of Larrabee as the ultimate computing model, delivering GPU-class numeric computing performance and CPU-class programmability with an easy-to-use programming model that will ultimately crush fixed-function graphics pipelines," Sweeney said, adding that Intel's technology would be revolutionary whether sold as an add-in card, an integrated graphics solution, or part of the CPU die.

GPU makers today may boast about how many teraflops their chips can pull off, but Sweeney says that focusing on pure performance "misses a larger point about programmability."

"Today's GPU programming models are too limited to support large-scale software, such as a complete physics engine, or a next-generation graphics pipeline implemented in software," Sweeney told bsn. "No quantity of Teraflops can compensate for a lack of support for dynamic dispatch, a full C++programming model, a coherent memory space, etc."

Stay tuned for more reaction.


Comments
  • 8 Hide
    omnimodis78 , December 7, 2009 6:11 PM
    It was a joke from day one! Intel graphics... it even sounds comical.
  • 5 Hide
    DjEaZy , December 7, 2009 6:19 PM
    ... just look on the graphics card hierarchy chart...
    http://www.tomshardware.com/reviews/geforce-310-5970,2491-7.html
  • 3 Hide
    LORD_ORION , December 7, 2009 6:21 PM
    I don't understand why everyone thinks this is shelved. They aren't selling it retail. They are selling (giving?) generation 1 as a stream processor add-on (like CUDA) to Intel partners so they can learn it. They will then try to get gen 2 up to spec and sell it at retail.
  • 14 Hide
    Honis , December 7, 2009 6:23 PM
    Quote:
    "No quantity of Teraflops can compensate for a lack of support for dynamic dispatch, a full C++programming model, a coherent memory space, etc."
    Isn't the whole point of DirectX and OpenGL to make graphics cards and all their teraflops easier to access and utilize by making standard programming APIs?
  • 10 Hide
    scryer_360 , December 7, 2009 6:26 PM
    I see what Tim is saying, but don't ATi and nVidia both have GPGPU's in the pipeline?
  • 5 Hide
    omnimodis78 , December 7, 2009 6:27 PM
    LORD_ORION: I don't understand why everyone thinks this is shelved. They aren't selling it retail. They are selling (giving?) generation 1 as a stream processor add-on (like CUDA) to Intel partners so they can learn it. They will then try to get gen 2 up to spec and sell it at retail.

    Well, Larrabee in all its glory has been canned by the board of directors, and having a dedicated piece of hardware replaced by a software development platform brings a certain finality to this fiasco. Not saying the specs didn't look good on paper, but if we only praised blueprints then nVidia would take home the crown for the best current graphics card... Sorry to say, but that's not how things work.
  • -4 Hide
    tommysch , December 7, 2009 6:49 PM
    Just make my 48 HTed core chip and leave the GPU market to nVidia/ATI. I want it ASAP!
  • 3 Hide
    dark_lord69 , December 7, 2009 7:21 PM
    omnimodis78: It was a joke from day one! Intel graphics... it even sounds comical.

    I think an on-die GPU working as one of the CPU cores is a genius idea and would allow amazing graphical capabilities.
  • 0 Hide
    zerapio , December 7, 2009 7:21 PM
    omnimodis78: It was a joke from day one! Intel graphics... it even sounds comical.

    I really don't get why people expect a $7 piece of silicon to perform the same as a ~$15 or even a $100 piece from ATI & NVIDIA. Grab the list from DjEaZy and divide the performance of each card by the silicon size, cost or power requirements and you'll see why they have about 90% of the laptop market. It's not good enough to play games or do anything intensive (by a LONG shot) but I think it does a lot with very little.
  • 0 Hide
    Miharu , December 7, 2009 7:26 PM
    It's just logic. Larrabee has instructions to work in mixed scalar/vector. Object-oriented C++ always costs more because of this specification (it's an object). So it's normal: the CPU needs to understand the object to work with the GPGPU, and the GPGPU must understand it before it can use scalar/vector expressions on the current object.
    So your CPU/GPGPU works 2-3 times harder (for a simple object; a complex one could need 10 times more) running that kind of expression.

    It's an interesting feature for a little revolution (at low CPU cost), but PC games won't see the difference... you use ATi or Nvidia.
    So this just opens the door for ATi and Nvidia to make scalar/vector functions at low cost.
  • 2 Hide
    gmcboot , December 7, 2009 8:28 PM
    Honis: Isn't the whole point of DirectX and OpenGL to make graphics cards and all their teraflops easier to access and utilize by making standard programming APIs?


    Not when it is possible to eliminate the middlemen, DirectX and OpenGL, and write directly to the chip. Right now, developers have their game engine, then DirectX or OpenGL, then it gets to the chip. If they were able to code to something like the x86 standard, trust me, your graphics processor would be doing a lot more with less coding. I never thought this thing would kill Nvidia or ATI, but I was hoping it would see the light of day.
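For context on the comment above, a minimal, hypothetical C++ sketch of what "writing directly to the chip" could look like: a pipeline stage written as ordinary code over a framebuffer in plain memory instead of a shader submitted through DirectX or OpenGL. The names, pixel format, and resolution are illustrative assumptions, not anything Intel or Epic actually shipped.

```cpp
// Hypothetical sketch: a "shader" as a plain C++ function over memory,
// with no graphics API between the engine and the processor.
#include <cstdint>
#include <vector>

struct Framebuffer {
    int width, height;
    std::vector<uint32_t> pixels;          // one coherent memory space
    Framebuffer(int w, int h) : width(w), height(h), pixels(w * h) {}
};

// An ordinary loop the compiler is free to auto-vectorize for a
// wide-vector x86 core; no driver or API calls involved.
void clear_gradient(Framebuffer& fb) {
    for (int y = 0; y < fb.height; ++y)
        for (int x = 0; x < fb.width; ++x) {
            uint8_t r = static_cast<uint8_t>(255 * x / fb.width);
            uint8_t g = static_cast<uint8_t>(255 * y / fb.height);
            fb.pixels[y * fb.width + x] = (uint32_t(r) << 16) | (uint32_t(g) << 8);
        }
}

int main() {
    Framebuffer fb(640, 480);
    clear_gradient(fb);
    return 0;
}
```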
  • 9 Hide
    reichscythe , December 7, 2009 9:15 PM
    Really?? Get the hell outta here! Like Tim Sweeney, Mr. "We're designing the next Unreal Engine specifically to console standards... PC considerations to come later... in fact, we don't have a single DX11 game in the pipeline since we're still concentrating on our precious DX9 console releases," really gives a flying flip about innovative trends in PC graphics technology...
  • -5 Hide
    Anonymous , December 7, 2009 10:05 PM
    Sorry it's not coming to market, or sorry that it was a bad idea and the performance sucks? Because if it was as great as he makes it out to be, then it wouldn't have been cancelled...

    He sounds like one of the people defending Global Warming theory after Climategate happened. The truth doesn't need lies to support it, and if an Atom CPU does 3 gigaflops, then 30 of them on one die doesn't do 2000 gigaflops...
  • 2 Hide
    tacoslave , December 7, 2009 11:28 PM
    reichscythe: Really?? Get the hell outta here! Like Tim Sweeney, Mr. "We're designing the next Unreal Engine specifically to console standards... PC considerations to come later... in fact, we don't have a single DX11 game in the pipeline since we're still concentrating on our precious DX9 console releases," really gives a flying flip about innovative trends in PC graphics technology...


    If Valve said it I would care...
  • -2 Hide
    belardo , December 8, 2009 12:24 AM
    Tim Sweeney... why should YOU or WE care what you think?

    Your company (Epic)/Unreal is going to consoles. With NO (or few) games left for PC, what does it matter? DX11 or DX12 will be meaningless.

    AMD and Nvidia need to see the writing on the wall... because the GPU card market could simply end in a few years. NO games = NO NEED for a GAMING video card! The onboard graphics on AMD/ATI chipsets already do very well.

    Also, Tim Sweeney... you totally screwed up the latest UT3 game. Bad menu GUI, horrible maps with so few good choices that most people got BORED with the game before the fans could learn to make good maps - and because the maps are so huge, few servers actually host them. So UT3 servers are EMPTY of humans. Controls for vehicles are worse - for some stupid reason, the UT2004 version was too good?

    Tim Sweeney's opinion isn't worth that much in my book.

    But I'm just a gamer. So I don't matter.

  • 0 Hide
    Dario_D , December 8, 2009 2:43 AM
    Awwwww, I hope this doesn't hold things back. The ability to program your own pipelines was going to be the next gaming revolution.
  • -2 Hide
    matt87_50 , December 8, 2009 4:14 AM

    DjEaZy: ... just look on the graphics card hierarchy chart... http://www.tomshardware.com/review [...] 491-7.html


    omnimodis78: It was a joke from day one! Intel graphics... it even sounds comical.



    You are all missing the point! It's meant to be a competitor to GPGPUs, NOT GPUs!! GPGPU is about doing anything and everything OTHER than graphics with the graphics card! So Larrabee's graphics performance is irrelevant!!

    Honestly, they should never have tried to take on GPUs at their OWN game, trying to be a graphics card!! It should have been marketed AND released as an ultra-threaded co-processor, which is what it really is! Like the SPUs in the PS3! They should have taken that as a warning! Sony was originally going to try using two Cell processors, one for graphics, and not have a GPU, but even they knew better! A CPU will never be as good as a GPU at what a GPU is SPECIFICALLY meant to do!!

    Their strategy should have been accentuating its EASE OF USE, which is what has all us game devs excited. Being x86, I could have imagined using all my existing C++ knowledge and using it as a tool to really see proper multi-threaded programming take off. The trick with this would have been releasing it ASAP, while GPGPU was still in its infancy. As GPGPU has matured, with DX11 and such, it may be too late to play the 'EASE OF USE' card. Personally, I'd still buy one.
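As a rough, hypothetical illustration of the "existing C++ knowledge" argument in the comment above, the sketch below splits a reduction across hardware threads using nothing but standard C++ and no vendor compute API; the workload and all names are invented for the example.

```cpp
// Hypothetical sketch: data-parallel work written with standard C++ threads
// alone, reusing ordinary C++ skills rather than a GPU-specific language.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<float> data(1 << 20, 1.0f);
    const unsigned n_threads =
        std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(n_threads, 0.0);
    std::vector<std::thread> pool;

    const std::size_t chunk = data.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        std::size_t lo = t * chunk;
        std::size_t hi = (t + 1 == n_threads) ? data.size() : lo + chunk;
        pool.emplace_back([&, t, lo, hi] {
            // Each thread reduces its own slice into its own slot.
            partial[t] = std::accumulate(data.begin() + lo,
                                         data.begin() + hi, 0.0);
        });
    }
    for (auto& th : pool) th.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("sum = %f\n", total);
    return 0;
}
```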
  • 3 Hide
    ravewulf , December 8, 2009 4:19 AM
    google_climategate: He sounds like one of the people defending Global Warming theory after Climategate happened. The truth doesn't need lies to support it


    All this "Climategate" crap is absolute nonsense. Some hacker found a couple emails out of thousands over 13 years that when certain phrases are taken out of context "help" the politicians who say global warming is a lie (while getting payed to say that by those in industry who would have to submit to climate change laws were they to pass).

    The emails, when read in their entirety, have absolutely nothing that contradicts the MOUNTAINS of evidence that global warming is influenced by humans and is harmful. This is the warmest decade in recorded history! Ocean levels are rising, land is disappearing because of the rising oceans, icecaps are melting, record high global temps, etc.

    200 of the top scientists with mountains of undeniable evidence vs. industry-paid politicians with a few phrases taken out of context from a few emails. Who is more likely to be telling the truth?

    Take a look at this: http://www.youtube.com/watch?v=YctV731kS8I
  • -5 Hide
    FoShizzleDizzle , December 8, 2009 8:00 AM
    ravewulf: All this "Climategate" crap is absolute nonsense. Some hacker found a couple of emails out of thousands over 13 years that, when certain phrases are taken out of context, "help" the politicians who say global warming is a lie (while getting paid to say that by those in industry who would have to submit to climate change laws were they to pass). The emails, when read in their entirety, have absolutely nothing that contradicts the MOUNTAINS of evidence that global warming is influenced by humans and is harmful. This is the warmest decade in recorded history! Ocean levels are rising, land is disappearing because of the rising oceans, icecaps are melting, record high global temps, etc. 200 of the top scientists with mountains of undeniable evidence vs. industry-paid politicians with a few phrases taken out of context from a few emails. Who is more likely to be telling the truth? Take a look at this: http://www.youtube.com/watch?v=YctV731kS8I

    You sound awfully sure of yourself, and dangerously so. There is insurmountable evidence that the planet is actually getting colder, yet you still sound quite assured that it is getting warmer. George Orwell anyone?
  • 3 Hide
    ravewulf , December 8, 2009 8:57 AM
    FoShizzleDizzle: You sound awfully sure of yourself, and dangerously so. There is insurmountable evidence that the planet is actually getting colder, yet you still sound quite assured that it is getting warmer. George Orwell anyone?


    Do you see the icecaps and glaciers growing and extending over more of the planet like when we get an ice age, or are they shrinking?

    It doesn't take a genius to figure it out, but apparently people like to ignore the scientists who are experts in this area.