Merger and Mayhem
In the summer of 2006, most people didn’t grasp the strategy behind what would eventually be called Fusion, the melding of CPU and GPU on a common die. Like most observers, Ars Technica at the time viewed the merger as a way for AMD to bolster its portfolio breadth, freeing it from reliance on third parties for chipsets and expanding the company into areas such as ultramobile graphics and digital TV.
For their part, AMD and ATI remained thoroughly mute. While some of the silence may have been mandated, owing to the lengthy legal process of merging two large companies, a more practical reason might have been the safeguarding of existing sales.
"At any point in time, you’re married to a lot of partners," says AMD’s Macri. "I use the word married because they’re very deep relationships, both in business and, at the end of the day, a personal level. Business is about personal relationships. We make commitments to each other. We might embody them in contracts, but part of that is we’re making a big personal commitment. And you are only as good as your word. If you throw the grand vision out there without the time that it takes to move all your partners to the vision, you lose all your partners. AMD at the time had Nvidia as a very strong chipset and graphics partner. You can’t flip those relationships on and off like a switch. So the guys were somewhat limited in their ability to explain to the world this grand vision and how it would all play out."
In early 2006, AMD’s stock price hovered just above $40 per share. One year later, at a time when the market was nearing its pre-recession peak, AMD had tumbled to under $15. Two years later, it was bouncing on a $2 floor. A five-year comparison between AMD and Intel tells the story from pre- to post-recession: while Intel looks relatively flat, the rise and fall of AMD is as exhilarating as it is heartbreaking.
Economic downturn aside, what happened? Heading into late 2006, AMD posted the first of what would become seven consecutive quarterly losses preceding Hector Ruiz’s resignation. Intel’s Core architecture was out and ramping. Nvidia’s GeForce 7 series, launched in June 2005 to considerable fanfare, gave way to the even better 8 series in November 2006. Meanwhile, the delay-plagued ATI Radeon X1000 series arrived in 2005, with no major update in 2006. The follow-up Radeon HD 2000 series didn’t launch until April 2007 (in Tunisia, of all places), and even though AMD/ATI’s performance was starting to edge back up, its momentum in the market had slipped significantly.
And those were just the visible problems. Behind the scenes, in the back rooms where the two companies were trying to figure out how to coexist and blend, matters were even more muddled.