
Watch Intel's First Demo of Larrabee GPGPU

Source: Tom's Hardware US

Larrabee ray tracing Enemy Territory: Quake Wars!

While Intel makes many chips and technologies for all sorts of computing, it's best known for its CPUs. Now Intel is ready to take on a whole new challenge in graphics with its upcoming Larrabee GPU.

Larrabee's raison d'être is to give Intel something to push back with against AMD and Nvidia. It won't be a direct competitor to Radeons and GeForces, however, as Larrabee is fundamentally different from the GPUs currently on the market.

Notably, Larrabee's architecture is based on the Pentium P54C design and uses the x86 instruction set. That design makes Larrabee better described as a GPGPU. Larrabee is expected to be a modest rasterizer, but it could have the edge in the computationally heavy method of ray tracing.
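Why ray tracing is so computationally heavy is easy to see in miniature: every pixel of every frame needs at least one ray-scene intersection test. The sketch below is an illustration only, not Intel's code — it shoots one ray per pixel at a single sphere through a pinhole camera:

```python
import math

def hit_sphere(center, radius, origin, direction):
    """Return distance along the ray to the nearest sphere hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

def render(width, height):
    """Shoot one ray per pixel at one sphere; return a 2D hit mask."""
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a view direction through a pinhole camera.
            u = 2 * (x + 0.5) / width - 1
            v = 1 - 2 * (y + 0.5) / height
            direction = (u, v, -1.0)
            t = hit_sphere(sphere_center, sphere_radius,
                           (0.0, 0.0, 0.0), direction)
            row.append(1 if t is not None else 0)
        image.append(row)
    return image
```

Even this toy does width × height intersection tests per frame; a real renderer adds shadow, reflection and refraction rays at every hit, which is where the cost balloons.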

At IDF 2009, Intel made its first public demonstration of Larrabee – running on a Gulftown system, no less. Check it out in the video embedded below:

IDF 2009: Larrabee Demo

Top Comments
  • 22 Hide
    valcron , September 24, 2009 6:36 PM
    Ok so that video showed me absolutely nothing. Or did I miss something?
  • 20 Hide
    crom , September 24, 2009 6:43 PM
    Is it me, or does that video look like it's at quite a low frame rate?
  • 11 Hide
    tipoo , September 24, 2009 10:15 PM
    It's doing way more than 50 xeons were a few years back, and you people aren't impressed? That's probably because you do not understand what you are seeing and what it entails for the future.

    Rasterisation is great, but for effects like shadows and mirrors it is a real mess. Processing requirements also scale linearly, hence these ridiculous graphics cards. Rasterisation just lacks the realism of ray tracing. Developers have to do so much work with rasterisation to get all those nice effects, and they can't do every surface.

    Ray tracing engines will change all this and the developer will simply define transparency, reflectivity etc. rather than having to create them by hand.

    Essentially, ray tracing is a more physics-like approach, treating the simulation as if it were the real world.

    Besides, Intel made no attempt to promote this as a high-end/enthusiast product, so why the straight presumption that it will be? It won't. They are aiming for the mainstream market, NOT the top end. (Don't be surprised when ATI and Nvidia own Intel on the performance side in 2010... In fact, I know they will.)

    Don't get me wrong, AMD's 5870 demo with Crysis on Eyefinity was much more impressive to me. That, and the fact that AMD's card hits 2.72 teraflops on a single chip, while Larrabee is targeting a measly 1 teraflop.
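tipoo's point that with ray tracing "the developer will simply define transparency, reflectivity etc." can be sketched in a few lines of Python. This is a hypothetical illustration, not Larrabee code; the `shade` helper and `trace_reflection` callback are made up for this example:

```python
def shade(material, base_color, trace_reflection, depth=0, max_depth=3):
    """Blend a surface's own color with whatever its reflection ray sees.

    `material["reflectivity"]` is the only knob the artist sets; the
    renderer does the rest by recursively tracing the mirrored ray.
    """
    r = material.get("reflectivity", 0.0)
    if r == 0.0 or depth >= max_depth:
        return base_color
    # Color seen along the mirror direction, supplied by the tracer.
    reflected = trace_reflection(depth + 1)
    return tuple((1 - r) * b + r * m for b, m in zip(base_color, reflected))
```

For example, a half-mirror red surface reflecting a blue background comes out purple: `shade({"reflectivity": 0.5}, (1.0, 0.0, 0.0), lambda depth: (0.0, 0.0, 1.0))` returns `(0.5, 0.0, 0.5)`. In a rasterizer, that same mirror effect would need a hand-built environment map or render-to-texture pass.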
Other Comments
  • 3 Hide
    charlesxuma , September 24, 2009 6:45 PM
    "keep simple things simple" .......... i'd like to hear more about that...also...was expecting more to it then a real time ray-tracing demo, at least an fps counter on the screen.... i guess thats part of the keep simple things simple campaign.
  • 2 Hide
    nforce4max , September 24, 2009 6:47 PM
    Hmmm, interesting proof of concept, but I would have liked to see applications most people use: games such as Crysis, BioShock or Stalker, or programs such as 3ds Max, Softimage or ArcMap.
  • 0 Hide
    tektek , September 24, 2009 7:16 PM
    So it can't play Crysis? ...OK, joking aside... this demo is not a good seller on the possibilities this could bring, but so far I think WoW players will love finding cheaper laptops with no integrated video cards that can play with more detail. Heavy gamers... not the time, not the place... YET!
  • 0 Hide
    tntom , September 24, 2009 7:21 PM
    What kind of power consumption are we looking at? I am completely sure this will never compete with high end GPUs performance wise but I would like to see a performance per watt comparison though.
  • 0 Hide
    charlesxuma , September 24, 2009 7:23 PM
    tektek: "So it can't play Crysis? ...OK, joking aside... this demo is not a good seller on the possibilities this could bring, but so far I think WoW players will love finding cheaper laptops with no integrated video cards that can play with more detail. Heavy gamers... not the time, not the place... YET!"

    Who said anything about cheaper? To me this looks like it's gonna be more expensive; plus I think they'd probably sell most, if not all, builds with discrete graphics only. However, this does depend on how Larrabee actually benchmarks. "Give time, time."
  • -1 Hide
    Anonymous , September 24, 2009 7:24 PM
    @tntom: could be wrong, but I believe prior reports indicated significantly higher TDP than similar performing GPUs (which is to say, the top GPUs from a couple generations ago)
  • 3 Hide
    WheelsOfConfusion , September 24, 2009 7:29 PM
    The demo was for real-time ray tracing, not standard rasterization. RTRT is a pretty intensive task, that's why most people choose the raster route.
    Of course, Larrabee will have to do rasterizing too, regardless of whether or not RTRT makes any headway.
  • 1 Hide
    Yuka , September 24, 2009 7:34 PM
    So... Larafail ain't so fail after all. That's good to know.

    If I heard correctly in the video, they re-rendered a scene (map?) using ray tracing alone for lights and reflections, adding another process for it; that's pretty impressive... Too bad the FPS is so low that no *gamer* will care. It has some impressive capabilities for rendering, though; hope Intel puts in more juice for gamers to care.
  • -4 Hide
    caskachan , September 24, 2009 7:48 PM
    Why a static scenario? Move around inside! D:
  • 1 Hide
    aneasytarget , September 24, 2009 8:40 PM
    I think he looked very lifelike for a ray-traced rendering.
  • -1 Hide
    gaevs , September 24, 2009 8:43 PM
    Actually, that's not bad for rendering animation in real time. I'm thinking of movies and shorts: render times are huge even with a lot of networked computers, so if you could use one or two of those, it would shorten render times to days instead of months.
  • -1 Hide
    gaevs , September 24, 2009 8:45 PM
    And since it uses x86 instructions, network renderers could recognize it as just another multicore CPU, with little coding.
  • -1 Hide
    XD_dued , September 24, 2009 8:48 PM
    My guess is that since it's not specialized for graphics only, the price/performance ratio will be poor compared to graphics cards.
  • 4 Hide
    cryogenic , September 24, 2009 8:55 PM
    Somehow, after all the Larrabee buzz, this has not impressed me.
  • -1 Hide
    warezme , September 24, 2009 9:03 PM
    Hmmm, why are all the huge corporate spokespersons required to have a Euro-trash accent?
  • 1 Hide
    eklipz330 , September 24, 2009 9:47 PM
    You know what, despite Intel having the majority share of the market, they still seem to be putting a whole lot into R&D. Even if they do mess up, they've been leading for quite a while, and unlike Nvidia, they haven't been slacking... I really wish them the best.

    I mean, they already announced 22nm for 2011; that is really impressive.
  • 4 Hide
    Ehsan w , September 24, 2009 9:58 PM
    The water actually looks kinda lame :/
    Didn't they say they would totally own ATI/Nvidia?
    Or was that with something else?