
28nm Trinity Successor Rumored To Debut in Q2 2013

Source: Fudzilla | 39 comments

Rumors about the successor of Trinity are beginning to surface on the Internet.

The new APU could be released to manufacturers in sample quantities sometime in Q4 of this year and make its way to consumers by late Q2 2013.

Fudzilla reports that the FM2-socket-based Richland will be released in dual- and quad-core versions with a Radeon 8000-series DirectX 11 GPU core, featuring up to 384 processing cores in the flagship A10 APU and 128 cores in dual-core mainstream versions.

It is unclear whether AMD will be able to finish the design of its Steamroller CPU core in time for the Richland APU. While AMD is seeing quite a bit of pressure from Intel to compete with the company's 22 nm upgrade "Haswell", there appears to be internal pressure as well. Rumor has it that AMD is thinking about building improved accelerators for web technologies such as WebCL directly into its hardware.

Steamroller could be the first APU to support this strategy.

Other Comments
  • -3
    goodguy713 , July 9, 2012 4:07 PM
    the release cycles are already too long ..
  • 23
    back_by_demand , July 9, 2012 4:14 PM
    The low end versions of these would be low enough TDP to use for nettops, but powerful enough for relatively powerful gaming. Obviously not very high end games but totally ready for an all-in-one HTPC.
  • 16
    The_Trutherizer , July 9, 2012 4:15 PM
    goodguy713: the release cycles are already too long ..


    Wut? Feels like just yesterday the first APU came out. What's the hurry these days? Nobody can afford to buy every shiny new toy anyway.
  • 21
    The Greater Good , July 9, 2012 4:25 PM
    back_by_demand: The low end versions of these would be low enough TDP to use for nettops, but powerful enough for relatively powerful gaming. Obviously not very high end games but totally ready for an all-in-one HTPC.


    A lot of us tech guys (and girls) forget that not everyone needs the power that we do. Heck, sometimes WE don't even need it. This APU would meet the needs of most computer users and coupled with an SSD, would be great.
  • 6
    Maher90 , July 9, 2012 4:59 PM
    Trinity and "Richland" are the most recommended processors (APUs, I know) for anyone on a limited budget. In fact, I watched a video of the upcoming A10 on YouTube and it seems to handle BF3 and other things well on low and some medium settings :D (although I don't know why they added DX11 if it's not playable at all). I really like those APUs, but I guess Intel will succeed at everything; even if they made mini CPUs, as I like to call them, they would win with those too.
  • 0
    A Bad Day , July 9, 2012 5:19 PM
    Quote:
    FM2-socket based Richland will be released in dual- and quad-core versions with a Radeon 8000-series


    I thought APUs typically use the previous generation GPU architecture?
  • 3
    dudewitbow , July 9, 2012 5:23 PM
    A Bad Day: I thought APUs typically use the previous generation GPU architecture?

    The IGP in trinity uses 7xxx, it would make sense that the next gen chip will use the 8xxx. This also brings a high probability that the next radeon generation should be released before then as well.
  • 4
    boiler1990 , July 9, 2012 5:47 PM
    Looking forward to these. Would like to assemble a storage server/HTPC in the near future and might hold out for these if they come out before the Ivy Bridge i3s.
  • 11
    supall , July 9, 2012 6:07 PM
    goodguy713: the release cycles are already too long ..


    I fail to see how "a year" for a new Trinity APU is "too long".
  • 1
    werfu , July 9, 2012 6:18 PM
    A Bad Day: I thought APUs typically use the previous generation GPU architecture?


    It makes no sense to use a previous-generation design in an APU, as the GPU takes up part of the overall thermal envelope. You want to squeeze out the best performance per watt to leave as much thermal headroom as possible for the CPU, where it is badly needed.
  • 0
    shloader , July 9, 2012 6:18 PM
    The_Trutherizer: Wut? Feels like just yesterday the first APU came out. What's the hurry these days? Nobody can afford to buy every shiny new toy anyway.


    More like yesteryear. Even then it was packing Phenom cores while Piledriver came out. Granted, that was the best choice at the time, but APUs should be keeping up with the times from now on. Having an A8-3850, I see no compelling reason to scrap the motherboard to upgrade the CPU, but I'm happy to see progress in this area all the same. When APUs meet DDR4, I think I'll step up.

    Speaking of 'keeping up', the FX line needs to get the revised (fixed?) design, too. AMD doesn't yet offer a compelling reason to upgrade from the 1090T.
  • 0
    A Bad Day , July 9, 2012 6:31 PM
    dudewitbow: The IGP in trinity uses 7xxx, it would make sense that the next gen chip will use the 8xxx. This also brings a high probability that the next radeon generation should be released before then as well.

    werfu: It makes no sense to use previous generation design in an APU, as the GPU part of it will take part of the global envelope. You want to squeeze out the best performance per Watt, to leave the most thermal capacity to the CPU part, where it is badly needed.


    http://www.tomshardware.com/reviews/a10-5800k-a8-5600k-a6-5400k,3224.html

    Quote:
    Moreover, Trinity employs a newer graphics architecture than Llano. Instead of the VLIW5 arrangement, which also sat at the heart of Radeon HD 6800 and older GPUs, it utilizes the VLIW4 design that went into AMD’s Radeon HD 6900-series cards. Everything after the 6900s swapped over to Graphics Core Next, so VLIW4 isn’t a very prolific implementation. But it’s supposed to be more efficient. Naturally, then, we all want to see how Trinity’s on-die GPU compares to what came before.


    I don't know why AMD would use an older architecture. Maybe the APU and GPU development schedules aren't synced, and/or the APU team has little time left by the time the new GPUs arrive.
  • 1
    A Bad Day , July 9, 2012 6:36 PM
    Also:

    http://www.anandtech.com/Show/Index/4455?cPage=5&all=False&sort=0&page=7&slug=amds-graphics-core-next-preview-amd-architects-for-compute

    Quote:
    What we know for a fact is that Trinity – the 2012 Bulldozer APU – will not use GCN, it will be based on Cayman’s VLIW4 architecture.
  • -6
    sonofliberty08 , July 9, 2012 6:50 PM
    Hope that Richland will beat Llano on every benchmark; the current Trinity still can't beat Llano on some benchmarks.
    If not, I still think a die shrink and improvement of the Stars core would have been better than the new Bulldozer architecture.
  • 7
    jryan388 , July 9, 2012 7:07 PM
    As much as I like to bash bulldozer/piledriver, I think it's probably fine for most people... and with a great igp, it's better than intel...
  • -4
    verbalizer , July 9, 2012 7:12 PM
    jryan388: As much as I like to bash bulldozer/piledriver, I think it's probably fine for most people... and with a great igp, it's better than intel...

    :/  - you must have lost your mind..
  • 4
    eddieroolz , July 9, 2012 7:46 PM
    Essentially, this more or less confirms that we'll see desktop HD8000 series between now and Q1 2013.
  • 2
    blazorthon , July 9, 2012 8:04 PM
    dudewitbow: The IGP in trinity uses 7xxx, it would make sense that the next gen chip will use the 8xxx. This also brings a high probability that the next radeon generation should be released before then as well.


    The Trinity IGP uses die-shrunk Radeon 6900 series VLIW4 cores. The Llano IGP used Radeon 5000 VLIW5 cores, not even Radeon 6000 VLIW5 cores. They might be called Radeon 7000 and 6000 IGPs, but that's just because of their release times and some of their feature sets. For example, Trinity is supposed to have the Radeon 7000 VCE feature. However, it still is a 32nm die shrink of VLIW4, not a GCN implementation.
  • 4
    blazorthon , July 9, 2012 8:10 PM
    Ninjawithagun: Too little, too late - AMD is done. I foresee AMD giving up on the CPU market and exclusively developing graphics cards only by end of 2013. AMD had a chance to keep up with Intel starting back in the mid-2000s. But, unfortunately thanks to extremely sloppy CEO management, they are no longer competitive within the CPU market. Intel is literally outclassing and outperforming AMD CPUs in every range of the CPU families. How sad it was to see AMD release its brand new Bulldozer CPU family, only to see it outperformed by Intel's 1st generation Sandybridge CPU family! Seeing a quad-core CPU with hyperthreading beat the pulp out of a true octa-core CPU is sad indeed.


    AMD failed to compete with Intel in the older days not because of its products: superior AMD CPUs were outsold by slower, more expensive Intel CPUs because of Intel's illegal and monopolistic practices, which Intel is still being fined for to this day. AMD later developed sloppy management problems and still has them, but back then that was not the issue. Furthermore, Intel is not winning at everything. At any given price point, AMD easily wins in highly threaded performance, and at the very low end Intel has nothing but dual-core CPUs that lack even Hyper-Threading Technology, so they come nowhere near AMD's highly threaded performance, or even its quad-threaded performance.

    Also, taking an FX-6100 or FX-8120 and disabling one core per module (or prioritizing one core per module over using both, except for highly threaded workloads) gives them a significant boost in per-core performance while decreasing power consumption even more. A roughly $170 FX-8120 that can compete with the non-K i5s in gaming, or an FX-6100 in the same situation at a lower price point with up to triple-threaded performance, can be very competitive today, although Haswell would almost definitely outclass them both substantially.
  • 2
    blazorthon , July 9, 2012 8:18 PM
    A Bad Day: http://www.tomshardware.com/review [...] ,3224.html — I don't know why AMD would use an older architecture. Maybe the APU and GPU development schedules aren't synced, and/or the APU team has little time left by the time the new GPUs arrive.


    An APU needs to be built off of GPU and CPU cores that already work. They will be slightly modified so that both parts work together, but they will be mostly the same as preexisting implementations. You can't include GPU/CPU cores that aren't built, or nearly built, when the APU design work starts, because the designers can't account for parts that don't exist yet. It would be like trying to use a Core 2 CPU in a P4 motherboard before Core 2 had even taped out.

    So, AMD uses the best that they can for the time. When Trinity was being designed, GCN was not finished yet, but Cayman's VLIW4 was finished quite a while before Trinity started being designed and was the best that AMD had at the time. One die shrink later, it's probably about as energy efficient as a 28nm GCN GPU of similar performance would have been anyway, so the only major loss would probably be in compute performance.