The Story Of Fusion Begins
“Nothing is more difficult than the art of maneuver. What is difficult about maneuver is to make the devious route the most direct and to turn misfortune to advantage.”
—Sun Tzu, The Art of War
When I interviewed Dave Orton, then the president of ATI Technologies, in 2002, one of the first things he told me was, “It’s always what’s possible in the business that keeps people going.” No words could have been more prophetic about the coming merger between his company and CPU manufacturer AMD. The big question, of course, isn’t what is possible or even whether the possible will become reality. The real question is whether the possible will become reality soon enough.
Orton spent most of the '90s with Silicon Graphics and, in 1999, when almost anything in technology seemed possible, he left SGI to join a little core logic startup called ArtX. The little company won the development contract for Nintendo’s GameCube, which went on to sell a few units (somewhere north of 20 million). That fall, ArtX showed its first integrated chipset at Comdex, and immediately the company flashed on the industry’s radar as a prime acquisition target.
Ultimately, ATI was able to put ArtX in its pocket and made Orton its president and COO. Then the tech bubble burst, driver problems abounded, schedules slipped, and, for a while, it seemed that ATI could do nothing right.
Part of the road back to glory hinged on Orton figuring out how to finish meshing the two companies’ development teams. He was the one who figured out how to get ATI on a 12-month cycle for new architectures and six- to nine-month cycles for iterative design revisions. Product teams were given more control and responsibility. And slowly, over perhaps 18 months, with Nvidia kicking it in the ribs at every turn, ATI managed to get back on its feet. The company rediscovered how to execute.
"Just step back and understand your roots," said Orton. "Constantly build. You can never be satisfied with where you are. You’ve got to be satisfied with where you can be and then drive to that."
With ATI back on top of its game, Orton knew it was time to keep driving—but to where? I detected no glimmer of the future in our 2002 discussion. ATI continued to excel at integrating graphics into northbridge chips, and Intel, which still viewed integrated graphics as only needing to be good enough for business apps, was still more of a partner than a competitor.
However, in a keenly prescient moment, Orton told me, "I guess if I could change one thing about computing, I’d like it to be more open to create a broader range of innovation. I recognize the advantages of standards. Standards provide opportunity."
At two different points in our conversation, Orton lamented his daily Silicon Valley commute, even saying that if he could invent anything, no matter how fantastic, it would be a Star Trek-esque transporter. So perhaps we can take him at his word when, in 2007, he left his post as executive vice president of AMD in order to spend more time with his family. But this is jumping ahead. First, Orton’s drive from Toronto was about to take a hard southern turn, straight down to Texas.
With Haswell coming next year, Intel might just beat AMD at HSA. They need to deliver a competitive product.
I think you were being overly kind about the current CEO's ability to guide the company forward.
Dirk Meyer's vision is what he is currently leveraging anyway.
A company like that needs executive leadership from someone with engineering vision ... not a beancounter from retail sales of grey boxes.
History will agree with me in the end ... life in the fast lane on the cutting edge isn't the place for accountants and generic managers to lead ... it's for a special breed of engineers.
They don't have the efficiency of Ivy Bridge, or Medfield, they don't have the power of Ivy Bridge, and they're missing out on this round of the discrete graphics battle (they were ahead by so far, but Nvidia seems to have pulled an ace out of their butt with the 600 series). So what exactly IS AMD doing well? HTPC CPUs? Come on! The adoption rate for the system they're proposing with HSA is between five and 10 years off... and because they moved too early, and won't be able to compete until then, they have to give the technology away for free to attract developers.
Financially, this is a company's (and a CEO's) worst nightmare... they're too far ahead of their time, and the hardware just isn't there yet.
This will end up being just like the tablet in the late '90s and early '00s. It won't catch on for another decade, and another company will come along and take advantage of the transition properly, much to AMD's chagrin.
I'm not sure if it was the acquisition of ATI that made AMD feel like it was forced to do this so early, but they aren't going to force the market to do anything. This work should have been done in parallel while making leaps and bounds within the framework of the current model.
You can't lead from behind.
I've always been a fan of AMD. They've brought me some of the nicest machines I've ever owned... the ones that had me, and still have me, most excited. But I have always bought, and always will buy, what's fastest or best at the job I need the rig for. And right now, and for the foreseeable future, AMD can't compete on any platform, on any field, anywhere, at any time.
AMD just bet its entire company, the future of ATI (or what was the lovely discrete line at AMD), the future of their x86 platform, and their manufacturing business all on something that it wasn't sure it would even be around to see. They bet the farm on a dream.
Nonetheless, I disagree that you were being overly kind about the CEO's ability to lead the company. I think you're being overly kind for thinking this company has a viable business model at all. They'll essentially have to become a KIRF of a company (sell products that are essentially a piece o' crud, dirt cheap) to stay alive.
This is mostly me raging at the fail. The writer of this article deserves whatever you journalists have for your own version of a Nobel.
This was a seriously thorough analysis, and by far the best tech piece I've seen all year. We need more long-form journalism in the world, for I hear way too many people shouting one-line blurbs with zero understanding of the big picture. But I have to say that, while this article is 98% complete, you missed speaking about the fact that this company is a company... an enterprise that survives only with revenue.
Now, does anyone want to play Crysis in software rendering with max eyecandy?
It's not over until the fat lady sings. As I read your post, I felt that you were missing a (or the) big point of the APU and this article.
It's about how software is developed nowadays and how there is such a huge reserve of potential performance waiting to be tapped into. I could imagine that if future software bites into this "evolution" toward more GPGPU programming, then I would expect a huge jump in performance even on the current, or shall I say currently being phased out, Llano APUs.
Yes, current discrete GPU systems would improve significantly in performance as well, I would think, but to the same degree that APUs would improve, especially with the new technologies to be implemented, like unified memory spaces? I don't think so.
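To make that concrete, here is a rough sketch of my own (not from the article): it uses Nvidia's CUDA managed memory only as a stand-in for the kind of unified CPU/GPU address space HSA is chasing, and the kernel, buffer size, and names are invented purely for illustration. The point is simply that on a discrete card the data has to be staged across PCIe in both directions, while with a shared address space the CPU and GPU work on the same allocation and the copies disappear.

    // Hypothetical example: discrete-GPU staging copies vs. a shared allocation.
    // (CUDA's cudaMallocManaged stands in here for HSA-style unified memory.)
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void scale(float *data, int n, float factor) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;
    }

    int main() {
        const int n = 1 << 20;

        // Discrete-GPU style: the kernel launch is bracketed by explicit
        // staging copies across the PCIe bus.
        float *host = (float *)malloc(n * sizeof(float));
        for (int i = 0; i < n; ++i) host[i] = 1.0f;
        float *dev;
        cudaMalloc(&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);  // copy in
        scale<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);
        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);  // copy out
        cudaFree(dev);

        // Unified-memory style: one allocation is visible to both CPU and GPU,
        // so the kernel works on the data in place and the copies go away.
        float *shared;
        cudaMallocManaged(&shared, n * sizeof(float));
        for (int i = 0; i < n; ++i) shared[i] = 1.0f;   // CPU writes directly
        scale<<<(n + 255) / 256, 256>>>(shared, n, 2.0f);
        cudaDeviceSynchronize();                        // wait before the CPU reads
        printf("host[0] = %.1f, shared[0] = %.1f\n", host[0], shared[0]);

        cudaFree(shared);
        free(host);
        return 0;
    }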
I'm not saying that you're totally wrong. AMD might end up croaking, but we can't say for certain 'til it happens. Don't you agree? :-) (I'm not picking any fights BTW. Just sharing my thoughts.)
Funny, but last I checked, AMD's Radeon 7970 GHz Edition is the fastest single-GPU graphics card for gaming right now, not the GTX 680 anymore. Furthermore, AMD can compete in many markets in both GPU and CPU performance and price. AMD's FX series has great highly threaded integer performance for its price (much more than Intel), and the high-end models can have one core per module disabled to make them very competitive with the i5s and i7s in gaming performance. Going into the low end, the FX-4100 and Llano/Trinity are excellent competitors for Intel. Some of AMD's APUs can be much faster in both CPU and GPU performance than some similarly priced Intel computers, especially in ultrabooks and notebooks where Intel uses mere dual-core CPUs that either lack Hyper-Threading or have such a low frequency that Hyper-Threading isn't nearly enough to catch AMD's APUs. Is this always the case? No, not at all. However, you ignore this when it happens (which isn't rare) and you ignore many other achievements of AMD.
As of right now, there is no retail Nvidia card that has better performance for the money (at least where overclocking is concerned) than some comparably performing AMD cards anymore. The GTX 670 can't beat the Radeon 7950 in overclocking performance, and it can't beat the 7950 in price either. The GTX 680 fares no better against the Radeon 7970 and 7970 GHz Edition. I'm not saying that these cards don't compete well or that they don't have great performance for the money (that would be lying), but they don't win outside of power consumption, which, although important, isn't a significant enough advantage when the numbers are this close.
Whether or not AMD will fail as a company remains to be seen. Maybe they will, maybe they won't. However, if you want to say that they will, then the supporting info that you give should be more accurate.