
Next-gen AMD Fusion CPU + GPU Coming in 2015

Source: Tom's Hardware US | 70 comments

AMD planning big things for its Fusion of the future.

AMD's initial product based on its Fusion initiative, codenamed Llano, is now sampling with vendors. This first iteration will combine the GPU and CPU on the same die, which will drive down power requirements and costs.

While Llano will be based on Phenom II technology and paired with an ATI Radeon GPU design, AMD says that its next-generation Fusion will blur the line between CPU and GPU. That next big change is planned for 2015, Leslie Sobon, vice president of marketing at AMD, told IDG.

The next Fusion will pursue full integration of GPU and CPU, aiming to take full advantage of new standards like OpenCL.
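
OpenCL matters here because it expresses work as kernels that the runtime compiles for whichever device is available, CPU or GPU, so the same code can land on either side of the blurring line. As a minimal illustrative sketch (our own example in OpenCL C, not AMD code), such a device-agnostic kernel looks like this:

    // OpenCL C kernel: computes out[i] = a * x[i] + y[i].
    // The same source can be built for a CPU device or a GPU device;
    // the host program simply chooses which device runs it.
    __kernel void saxpy(const float a,
                        __global const float *x,
                        __global const float *y,
                        __global float *out)
    {
        size_t i = get_global_id(0);   // one work-item per element
        out[i] = a * x[i] + y[i];
    }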

"The second iteration [in] 2015 ... you're not going to be able to tell the difference. It's all going away," Sobon said.

While some computations will still run best on a traditional CPU architecture, software that runs more efficiently on a GPU-like design is where Fusion should thrive.

"The GPU is perfect for antivirus. It's a perfect parallel-processed application. In the Fusion-based time frame that's where it needs to go," Sobon said.

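Antivirus scanning parallelizes naturally because each byte offset of a buffer can be checked independently. A hedged sketch of what such a data-parallel scan might look like in OpenCL C (the kernel name and naive matching loop are our own illustration, not how any real antivirus engine works):

    // Naive parallel signature scan: each work-item tests one offset.
    __kernel void scan_signature(__global const uchar *data, // buffer bytes
                                 const uint data_len,
                                 __global const uchar *sig,  // pattern bytes
                                 const uint sig_len,
                                 __global int *hit)          // set to 1 on match
    {
        uint i = (uint)get_global_id(0);
        if (i + sig_len > data_len)
            return;                      // pattern would run past the buffer
        for (uint j = 0; j < sig_len; ++j)
            if (data[i + j] != sig[j])
                return;                  // mismatch at this offset
        *hit = 1;                        // any matching work-item flags a hit
    }
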
Still, that's five years away. Right now Fusion is sampling to potential customers, and Dell and Apple are rumored to be among them.

In related news, Digitimes cites sources saying that AMD will be enlisting the help of Taiwan Semiconductor Manufacturing Company (TSMC) to fabricate its Fusion chips on the 40nm process.

Comments
  • 22
    N.Broekhuijsen , May 18, 2010 1:05 PM
    Beautiful stuff, but the true system builder (most of us on Tom's) will want a GPU separate from the CPU, just so we can choose exactly what we want.

    I do see this becoming ideal in netbooks, small desktops, office computers, HTPCs, etc.

    Love to see computer evolution!
  • 21
    joytech22 , May 18, 2010 1:09 PM
    This should definitely shake up the market a bit, but by that time the other major CPU manufacturers will have already done this. Hopefully AMD isn't going to make us wait five whole years (as stated) just to put a GPU onto an already outdated Phenom II CPU architecture.
  • 0
    joytech22 , May 18, 2010 1:14 PM
    joytech22: This should definitely shake up the market a bit, but by that time the other major CPU manufacturers will have already done this. Hopefully AMD isn't going to make us wait five whole years (as stated) just to put a GPU onto an already outdated Phenom II CPU architecture.


    (Forgot how to edit my own post)
    What I mean by "outdated" is that Phenom II is nothing revolutionary performance-wise on a clock-for-clock basis. AMD needs to implement its GPU core into a newer architecture with better performance per clock; this would ensure nobody has to sacrifice CPU performance for an IGP (or whatever you wish to call a GPU on a CPU).
  • 18
    worl , May 18, 2010 1:15 PM
    Please AMD, don't make this another Larrabee. This is a great chance to pull ahead of Intel; don't mess up.

    Can't wait to see the results of the 2nd gen.
  • 18
    burnley14 , May 18, 2010 1:21 PM
    This seems like too little WAY too late. 5 years? Really?
  • -5
    hundredislandsboy , May 18, 2010 1:32 PM
    Intel who? Go AMD!! Along with the GPU, if they can throw in the audio, LAN, and a TB SSD in the CPU die, then I'll be impressed because my desktop won't be the noisy tower it is now.

    If you had a dual socket system mobo and threw in two of these, is that considered SLI?
  • 2
    digiex , May 18, 2010 1:33 PM
    AMD should also convince software developers to provide lots of support; without software that runs on it, it will end up like Itanium.
  • 2
    apache_lives , May 18, 2010 1:42 PM
    Nvidia won't like this one bit...
  • -2
    gekko668 , May 18, 2010 1:44 PM
    It would be nice if AMD gave the user an option to disable the integrated GPU.
  • 17
    HVDynamo , May 18, 2010 2:01 PM
    gekko668: It would be nice if AMD gave the user an option to disable the integrated GPU.


    I think that rather than disable it, they should implement the same tech that's hitting notebooks, where the hardware can switch between the integrated and dedicated GPUs depending on the workload, or even allow both to operate at the same time, giving the system more computational power depending on the workload and how tasks can be divided up.
  • -2
    Pei-chen , May 18, 2010 2:07 PM
    Sounds a lot like what Nvidia has been saying about combining GPU and general computing into the same hardware.
  • 9
    rooseveltdon , May 18, 2010 2:19 PM
    techguy911: By that time bio chips will be out, making this tech obsolete. http://www.physorg.com/news192801007.html

    Lol, no offense, but DNA-powered computers are at least ten years away. The ethical and moral implications of such a topic alone would cause tons of debates in Congress, plus half the nerds here (me included) would fear the potential rise of a Skynet-like computer that would want to replace us with machines. I will stick with silicon and metal, thank you, lol.
  • 8
    antisyzygy , May 18, 2010 2:19 PM
    joytech22: This should definitely shake up the market a bit, but by that time the other major CPU manufacturers will have already done this. Hopefully AMD isn't going to make us wait five whole years (as stated) just to put a GPU onto an already outdated Phenom II CPU architecture.


    They are sampling a Fusion chip with vendors now, meaning a version of the Fusion chip will probably be released by next year. This article says that a fully integrated Fusion chip will be out in 2015. Whereas the Fusion chip coming out now is a slapped-together Phenom II and a GPU, the next-gen Fusion chips will probably be designed from the ground up as one cohesive unit.
  • 2
    mikeangs2004 , May 18, 2010 2:22 PM
    Quote:
    but the true system builder (most of us on Tom's) will want a GPU separate from the CPU, just so we can choose exactly what we want.

    I do see this becoming ideal in netbooks, small desktops, office computers, HTPCs, etc.


    You should read more about Fusion before you make any more of these comments. Fusion does not mean the external GPU sub-system option will not be available; if more graphics acceleration is needed, then certainly one can add a discrete video card alongside the APU.
  • 4
    deweycd , May 18, 2010 2:28 PM
    One thing about integrating the GPU with the CPU while also having a dedicated graphics card is that the integrated GPU can assist the CPU with calculations like antivirus, physics, and other GPU-assisted workloads. This leaves the dedicated GPU to do its own work without having to take on these extra calculations. It may also be easier to code for a CPU with an integrated GPU than for a CPU and a separate GPU.
  • 3
    antisyzygy , May 18, 2010 2:29 PM
    mikeangs2004: You should read more about Fusion before you make any more of these comments. Fusion does not mean the external GPU sub-system option will not be available; if more graphics acceleration is needed, then certainly one can add a discrete video card alongside the APU.


    Not to mention the point is that combining a GPU and CPU adds computational options beyond just graphics. You can add a graphics card and still take advantage of the GPU on a Fusion processor in other ways.
  • 16
    Teen Geek , May 18, 2010 2:43 PM
    hundredislandsboy: Intel who? Go AMD!! Along with the GPU, if they can throw in the audio, LAN, and a TB SSD in the CPU die, then I'll be impressed because my desktop won't be the noisy tower it is now. If you had a dual socket system mobo and threw in two of these, is that considered SLI?

    No, it's considered Crossfire
  • 0
    tolham , May 18, 2010 3:08 PM
    Now just put that chip on a PCIe card with 4GB of RAM and we can reinvent the desktop computer.
  • 3
    figgus , May 18, 2010 3:26 PM
    xbeater: Beautiful stuff, but the true system builder (most of us on Tom's) will want a GPU separate from the CPU, just so we can choose exactly what we want. I do see this becoming ideal in netbooks, small desktops, office computers, HTPCs, etc. Love to see computer evolution!


    Actually, I predict that in the future you will have generic sockets on your motherboard into which you can slot a Fusion chip, a dedicated GPU, or a dedicated CPU, or any combination thereof. Talk about the ultimate in customization for power users!!!