
AMD Responds to Intel's Larrabee Delay

Source: Tom's Hardware US

AMD says of Larrabee: "GPUs are hard to design and you can’t design one with a CPU-centric approach that utilizes existing x86 cores."

Over the weekend we learned that Intel has pulled the reins back on Larrabee, the chip giant's supposed answer to the GPGPU question.

Although we already knew that Larrabee wasn't going to be a world beater in 3D gaming, it proposed new thinking that excited game developers and engine programmers such as Tim Sweeney, the mastermind behind the Unreal Engine.

Though Intel designs some of the world's best CPUs, its attempt to build a new GPGPU was marred by delays that would have made the product uncompetitive.

We decided to ask AMD for its take on the Larrabee situation, as it's a company that also has to juggle both CPU and GPU development. Of course the story for AMD is different due to the acquisition of graphics specialist ATI.

"From the outset, we have seen Larrabee as further validation of the importance of visual computing. We continue to assert that GPU technology is essential to the computing experience, today and tomorrow," Dave Erskine, Graphics Public Relations of AMD, told Tom's Hardware. "AMD is the technology leader in GPU technology for 3D graphics, video and GPU Compute."

With both a CPU business and a graphics business already established, AMD is in a unique position to pursue its integration strategy.

"With only CPU, or GPU, a company is limited in its ability to respond to the needs of the industry," Erskine added. "AMD is the only company in command of both GPU and CPU IP portfolios, and in response to the clear direction of the computer industry we’re bringing CPU and GPU together in Fusion."

Larrabee's architecture was different from today's GPUs because it was based on the Pentium P54C design and used the x86 instruction set. That design made Larrabee better suited to the term GPGPU – but it's an approach AMD doesn't see as the right one to take.
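To make that design philosophy concrete, here is a minimal, purely illustrative C sketch (not Intel's actual Larrabee programming model; the kernel, worker, and framebuffer names are invented for this example) of the CPU-centric idea: rather than fixed-function shader hardware, many ordinary x86 threads each run the same kernel function over a slice of the data, the way a GPU runs a shader over every pixel.

/*
 * Illustrative sketch only -- not Intel's actual Larrabee API.
 * Many simple x86 threads stand in for Larrabee's x86 cores, each
 * applying the same "shader" kernel to its own slice of pixels.
 */
#include <pthread.h>
#include <stdio.h>

#define WIDTH   256
#define HEIGHT  256
#define THREADS 4          /* stand-in for Larrabee's many x86 cores */

static float framebuffer[WIDTH * HEIGHT];

struct slice { int start, end; };

/* The "shader": one function applied uniformly to every pixel index. */
static void kernel(int i)
{
    int x = i % WIDTH, y = i / WIDTH;
    framebuffer[i] = (float)(x * y) / (WIDTH * HEIGHT);  /* toy gradient */
}

static void *worker(void *arg)
{
    struct slice *s = arg;
    for (int i = s->start; i < s->end; i++)
        kernel(i);
    return NULL;
}

int main(void)
{
    pthread_t tid[THREADS];
    struct slice slices[THREADS];
    int per = (WIDTH * HEIGHT) / THREADS;

    /* Launch the same kernel on every "core", each owning a data slice. */
    for (int t = 0; t < THREADS; t++) {
        slices[t].start = t * per;
        slices[t].end   = (t == THREADS - 1) ? WIDTH * HEIGHT : (t + 1) * per;
        pthread_create(&tid[t], NULL, worker, &slices[t]);
    }
    for (int t = 0; t < THREADS; t++)
        pthread_join(tid[t], NULL);

    printf("center pixel = %f\n", framebuffer[(HEIGHT / 2) * WIDTH + WIDTH / 2]);
    return 0;
}

The real chip paired this model with wide vector units and far more cores than the four threads simulated here, but the programming idea is the same: general-purpose x86 code playing the role of a GPU pipeline.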

"It really comes down to design philosophy," said Erskine. "GPUs are hard to design and you can’t design one with a CPU-centric approach that utilizes existing x86 cores."

What does AMD propose instead? Erskine explains, "We’re entering a new era in PC computing and it requires that visual computing technologies drive the pace of innovation. We call this Velocity. AMD Velocity builds on our already established GPU design cycle to achieve a faster pace of innovation than AMD previously achieved with a CPU-only development focus. AMD Velocity is designed to deliver performance breakthroughs via teraFLOPS-class GPU compute power in tandem with performance and low-power x86 core options. We expect this will result in a clear, compelling platform differentiation for AMD, and the delivery of the best APU on the market every year."

Comments
  • 0
    imrul, December 7, 2009 11:11 PM
    interesting
  • 19
    climber, December 7, 2009 11:12 PM
    Imagine GPU-accelerated applications for laptops when they're plugged into the wall, and CPU-only when on battery power, with minimal video acceleration and low-power-state GPU functionality. Sort of like a math coprocessor on steroids.
  • 24
    tacoslave, December 7, 2009 11:16 PM
    Amd pwns
  • 13
    lumpy, December 7, 2009 11:37 PM
    i like separate cpu/gpu, it's too damn expensive as is for high end stuff.
    Put it all on one chip and well...$$$$
    I suppose someday even RAM and SSD could all be on one chip. I wonder.
  • 18
    Shadow703793, December 7, 2009 11:51 PM
    lumpy: "I suppose someday even RAM and SSD could all be on one chip. I wonder."

    That would be a Bad Thing, as we won't be able to upgrade individual parts without replacing the entire box.
  • 7
    festerovic, December 8, 2009 12:01 AM
    Shadow703793: "That would be a Bad Thing, as we won't be able to upgrade individual parts without replacing the entire box."

    Just thinking the same thing...
  • 1
    Honis, December 8, 2009 12:01 AM
    lumpy: "I suppose someday even RAM and SSD could all be on one chip. I wonder."
    The problem with this is we are stuck on an archaic architecture (x86). Even the latest 64-bit chips are x86-64. The architecture requires the use of a north bridge to access RAM and a south bridge, which accesses the hard drive (through the north bridge). To fit all of this onto a single chip would be a headache in production, since the die size would be enormous (leading to greater losses in production).

    System-on-a-chip processors greatly reduce the bridge hardware required by the processor, but they are highly specialized for the system they are implementing.

    More on SoC:
    http://en.wikipedia.org/wiki/System-on-a-chip
  • 4
    cliffro, December 8, 2009 12:02 AM
    Shh!!! don't give them any ideas.....
  • 1
    belardo, December 8, 2009 12:05 AM
    With Intel owning the CPU market, it's mostly good that this is another business area they are not taking over.

    Of course, Intel is doing very well with their SSDs, which, because they are very good, are at the top of everyone's list.
  • 21
    tonewheelmonster, December 8, 2009 12:07 AM
    AMD for me
  • 11
    ravewulf, December 8, 2009 12:09 AM
    Interesting points, but it would be nice if the marketing talk was stripped out.
  • 1
    JamesSneed, December 8, 2009 12:16 AM
    Actually it may be easier to upgrade, especially if pricing is competitive. Just pop out the old APU and replace it with a new APU and you have an all-in-one upgrade. As mentioned, it does remind one of the math coprocessor or the front-side bus being assimilated into the CPU. For now I assume Fusion will not focus on gamers, but five years from now we all may be running an APU or two in our rigs. Fusion could be a huge hit in laptops and HTPCs.
  • 4
    XD_dued, December 8, 2009 12:41 AM
    Honis: "The problem with this is we are stuck on an archaic architecture (x86)... To fit all of this onto a single chip would be a headache in production since the die size would be enormous."

    um... x86 is for the processor only. How about P55 without a north bridge? Or how about Phenom with HyperTransport?
  • 0
    rambo117, December 8, 2009 12:43 AM
    I was pretty bummed out when I found out Larrabee wasn't happening, but there are still very many things to come in the next few years. Fusion is a fascinating concept; excited to see how it performs.
  • 0
    buwish, December 8, 2009 12:44 AM
    I concur. The big market for the APUs will surely be in the laptop market; possibly the HTPC market, depending on what they can handle, i.e. HD content in a decent manner.
  • 18
    ik242, December 8, 2009 12:50 AM
    i don't see it that way - in fact i dare to call the "keep them separate" claims silly.

    integration is what has brought low prices and high availability to any product (and especially electronics).

    memory and the memory controller integrated in the cpu don't cost much, and since they're part of the cpu, they get replaced together with the cpu.

    just because there is some cache on the cpu, or some flash memory on a new digital camera (just to make a point), it does not mean that you cannot add more ram (on a computer) or larger storage (an SD card, for example, in the case of a camera).

    for those who don't remember, there was a time when cache was not integrated in the cpu. it was damn expensive and often cost more than the cpu.

    there was a time when a CD drive needed a dedicated controller (before they could attach to IDE, for example) and it would occupy a mobo slot. needless to say it was clutter, with slow performance and high cost.

    there was also a time when a chipset was just that -> a collection of a few dozen chips (a set) performing only a few very basic functions (it didn't include a modem, serial or parallel port, network card, sound card, hdd or fdd controller etc. - think about what comes in today's mobos or the north and south bridge).

    my first network card, sound card, modem etc. each cost about the same as the CPU of the day. nowadays those things are part of the chipset/motherboard, just like video output, which may not be faster than a discrete card but is good enough for 95% of applications and - it's "free". and just because there is onboard video, nobody says that you can't add another graphics card (or two, or three...).

    another thing is that with integration, many things can be resolved more efficiently, including size, power consumption, footprint, bandwidth etc.

    so AMD and Intel, please make my next pc small - the size of a dime sounds about right, as i would like to carry it around without straining my arm. heck, integrate it into glasses that can double as a high definition monitor.
  • 0
    matt_b, December 8, 2009 1:30 AM
    If my money was on one company to successfully pull this off, it would be on AMD. They are the only company to house both sides of the court and they already have the know-how and technology from both sectors to do it. The interesting part will be to see how they manage to marry the two together into one product.
  • 0
    mman74, December 8, 2009 2:14 AM
    If this were a low-cost, low-power chip, I could totally see it being put into numerous devices. Standardization of the CPU/GPU platform, with a bare minimum of full HD (as with the ION chipset), less board space, and less power. There could be no limits to what they put this chip into - microwaves, fridges, etc.
  • 1
    biofrog, December 8, 2009 2:14 AM
    Then again, with Intel displaying their 48-core processor recently, perhaps they realised processor development was a lot further along than expected, making Larrabee somewhat superseded already.
  • 0
    elel, December 8, 2009 2:22 AM
    ik242: "so AMD and Intel, please make my next pc small... heck, integrate it into glasses that can double as a high definition monitor."

    lol, nice point. But if you are afraid of cell phones, do you have any idea how much electrical noise this would make? Right next to your eyes? With a high-frequency clock? But I do like the idea of integrating more stuff on one chip, if it saves me money.