Just after the turn of the century, AMD took a big gamble on K8—better known as Athlon 64—and gave up the pursuit of clock speed in favor of executing more instructions per clock, in addition to introducing native 64-bit extensions. Meanwhile, Intel leveraged its manufacturing superiority to push the NetBurst architecture as fast as possible. In fact, it expected to see the Pentium 4 hit 10 GHz.
Of course, the Pentium 4’s high clocks quickly ran into the immovable walls of physics and power usage, and the realistic limit turned out to be closer to 4 GHz. If you wanted the highest-performing CPU money could buy at that time, you probably bought an Athlon 64; back then, Pentium 4 processors cost more and achieved less. It took a while for the market to accept it, but AMD’s David was beating Intel’s Goliath.
But Goliath didn’t give up; rather, it woke up. Intel moved on from the Pentium 4’s doomed NetBurst design and started over with the Core architecture—though it wasn’t really starting over at all. The tenets of Core were born from earlier efforts in the mobile space. Naturally, the result was better, faster, and less power-hungry. Fast-forwarding a bit, the Nehalem-based Core i7 came next, followed most recently by the Sandy Bridge-based 32 nm desktop Core i3/i5/i7 CPUs.
Somewhere along the line, AMD allowed its unexpected advantage in computing to shrink and then disappear. Now, to be brutally honest, AMD’s fastest Phenom II processors fare better against Intel’s older Core 2 Quads than against modern Core i7s. In fact, the $125 dual-core Core i3-2100, manufactured at 32 nm, stands right up to AMD’s $150 quad-core Phenom II X4 955 (a 45 nm part) in many benchmarks. AMD is more than a generation behind when it comes to desktop CPU performance, and continues to leverage the same Stars architecture it first introduced more than two years ago. Squeezing out an extra hundred MHz every couple of months kept the company’s momentum moving forward. But when your main competitor is launching new architectures, it’s almost impossible to compete through incremental speed-ups. Frankly, it’s hard to recommend the AM3 platform for a new build today.
Perhaps realizing that it didn’t have the R&D resources of its primary competitor, AMD made another gamble back in 2006: it acquired ATI, the graphics card company responsible for the Radeon products many of you know and love. Shortly after the merger, AMD’s Fusion initiative was announced. The plan was to combine central processing and graphics processing resources on the same die. It took five years, but the first commercial Fusion processors were released earlier this year on the Brazos platform, and the E- and C-series APUs have already proven very viable in the notebook and netbook space. AMD even claims that it sold out of these APUs in Q1 2011. From the graphics angle, no Intel Atom-based platform can compete; Brazos outclasses Atom even when the latter is complemented by Nvidia’s Ion 2 platform.
While low-power netbooks are an ideal market for Fusion, the laptop and desktop segments are far more competitive. All Sandy Bridge-based Core i3/i5/i7 processors are equipped with Intel HD Graphics, which is fairly capable when it comes to basic productivity tasks in Windows, video playback, and even light gaming. If Fusion is to prove itself on its own terms, it has to deliver something special: true discrete-class graphics performance, along with competitive CPU performance.
Today we get our first taste of the Llano APU, which is intended to address mobile and desktop customers. Here’s where we see if the gamble pays off. And it needs to. The current Phenom II and Athlon II have little to offer beyond the $100 price point compared to the competition. Sure, a case can be made for the $160-and-over Phenom II X6 processors if you’re into heavily-threaded applications. But in general, Sandy Bridge-based chips are kicking butt in comparisons based on performance, power, and value.
AMD needs a way to differentiate itself from Intel in order to woo customers. The Fusion initiative might be the key to that goal in the notebook space. After all, the company claims Llano offers better battery life and graphics performance compared to a similarly-priced Sandy Bridge-based platform, with the added promise of OpenCL compute potential from the Radeon core’s shaders. AMD is serious about Fusion's future; over half of its notebook processors are APUs right now, and it anticipates that these will represent more than 90% within a year (Ed.: This isn’t surprising, of course, given AMD’s lack of a commanding presence in the notebook space up until now).
We expect the Fusion initiative to struggle for a foothold a little more in the desktop space, where it’s relatively easy to add discrete graphics. But AMD has a benefit to offer here, too: Llano’s graphics engine can work in conjunction with an add-in card in Dual Graphics mode. In layman’s terms, Dual Graphics is a flexible asymmetrical version of CrossFire that allows the APU’s resources to render in cooperation with a Radeon HD 5000- or 6000-series board for a frame rate boost.
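AMD hasn’t publicly detailed how Dual Graphics divides work internally, but the asymmetric idea is easy to illustrate. Here’s a rough conceptual sketch (in Python; all names and frame rates are ours, purely hypothetical, and not AMD’s actual implementation) of how an alternate-frame-rendering scheme might split frames between two GPUs of unequal speed: each new frame goes to whichever GPU will be free soonest, so the faster card naturally takes a larger share while the APU’s graphics engine still contributes.

```python
# Conceptual sketch only: apportioning frames between two GPUs of
# different speeds, AFR-style. Rates and names are illustrative,
# not AMD's actual Dual Graphics implementation.

def assign_frames(num_frames, gpu_rates):
    """Assign each frame index to the GPU that becomes free soonest,
    assuming a per-frame render time of 1 / rate (rate in fps)."""
    free_at = [0.0] * len(gpu_rates)  # time at which each GPU is next idle
    schedule = []
    for _ in range(num_frames):
        gpu = min(range(len(gpu_rates)), key=lambda g: free_at[g])
        free_at[gpu] += 1.0 / gpu_rates[gpu]
        schedule.append(gpu)
    return schedule

# Hypothetical throughputs: APU graphics ~30 fps, discrete card ~45 fps
schedule = assign_frames(10, [30.0, 45.0])
# The faster discrete GPU ends up rendering more of the frames, but the
# APU's shaders still shoulder a meaningful share of the load.
```

The takeaway is that the combined frame rate approaches the sum of the two GPUs’ individual rates, which is why pairing the APU with a modest Radeon HD 5000/6000-series card can be worthwhile instead of leaving the integrated engine idle.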
Of course we’d be remiss to ignore AMD’s next-generation micro-architecture, code-named Bulldozer. The Stars replacement should arrive in the third quarter of this year—within the next three months, basically. This represents the first fundamental retooling of AMD’s CPU design since the Athlon 64. So, right out of the gate, Llano’s days are numbered, and its replacement (code-named Trinity) is already slated to swap the CPU block with Bulldozer-derived silicon.
But let’s not get ahead of ourselves. It’ll be 2012 before we see Trinity, and that’s if it’s on time. Let’s just focus on the here and now.
What are Llano’s sexiest attributes? Roughly half of its die is a Phenom II X4 CPU stripped of the 6 MB L3 cache, but with L2 cache doubled to 4 MB. The other half is composed of something very similar to a Radeon HD 5570, with up to 400 Radeon cores (what AMD used to call Stream cores; apparently that name went out of vogue already) and an updated UVD3 video block. All of this is plumbed together on a single 32 nm chip.
That’s the short explanation. Of course, there’s a lot more going on here, and we’re about to dig into the details. Having said that, if you know what a Phenom II X4 and a Radeon HD 5570 can do together, then you already have a pretty good idea of where we’ll end up in this piece.