Today is AMD’s 2010 Financial Analyst Day, and we have some more details on its Fusion APUs. Llano is still a few months away. In the meantime, we have a preview of the Brazos platform, which will aggressively target the mobile space below the $500 price point.
According to recent Q3 market share numbers, the perpetual back and forth between AMD and Intel is once again in a state of flux. The larger company is taking substantial chunks of share away from AMD in the high-end server market, thanks mostly to the Xeon family's transition to Nehalem-based designs in the 1P, 2P, and MP segments. And we do mean substantial, as AMD’s overall server market share fell from ~10% to 6.5% (source: IDC). The Opteron 6000-series is holding its own, but it isn’t putting up a good enough fight to retain its foothold.
This reinforces our belief that the Magny-Cours-based server processors only serve as a temporary placeholder. That product line is meant to hold Intel back until Bulldozer becomes available. This isn’t necessarily a bad strategy, as Q3 was the fourth consecutive quarter in which the average price paid per CPU rose (Oct. 27th report from Mercury Research), which only strengthens the case for AMD’s lower price points.


AMD is doing better in the desktop world, which makes up the largest portion of its CPU portfolio. Comparatively, Intel has a much beefier portion of the notebook market. However, all three market segments have seen some of the slowest per-quarter growth in years (1.9% for Q3), less than a third of historical rates. Server sales still outrank desktop, while mobile CPU numbers fall to third place. This disproportionate growth is why AMD saw some mild gains in Q3, as the company's larger desktop CPU foothold helped shore up revenue lost to a slowdown in the mobile market.

Source: IDC, Jon Peddie, Mercury, Intel
Why does all of this matter? We’ve covered much of this information in our manager surveys, but Intel and AMD are both about to throw monkey wrenches into the graphics battle, too. The two companies are on the cusp of unveiling new processor platforms that feature integrated graphics technology. In September, we saw Sandy Bridge at IDF, which will put CPU and graphics processing on a single piece of silicon etched at 32 nm. It is definitely exciting stuff. But Intel's solution is, ironically, facing the unlikely underdog position, as very few industry veterans have much faith in the company's ability to deliver even mediocre graphics capabilities. Moreover, Intel is keeping the architecture's fixed-function media encoder close to its chest. Very few folks have seen it in action.
Meanwhile on October 19th, AMD showed off its upcoming Llano APU (Accelerated Processing Unit--the company’s acronym for a CPU/GPU hybrid).

In general, the real divider for the graphics industry isn't integrated versus discrete. What separates the proverbial men from the boys is performance. It’s the reason every gamer worth his salt cringes when you stick him with an IGP-based platform. Performance is the very reason people complain about high bit rate HD video playback on low-end systems like netbooks.
Historically, IGPs have never approached the low-end discrete space. There's just too much of a power/heat difference between something you stick under a passive heatsink the size of a postage stamp soldered onto a motherboard and the real estate available on even a single-slot add-in card. As a result, those two markets are as divided as oil and water. Intel’s Sandy Bridge architecture and AMD’s Fusion initiative are about bringing in the tide that'll blur this line in the sand.

Remember, Intel has a bigger portion of the graphics pie, thanks to its northbridge-based integrated graphics solutions and more recent HD Graphics engine, built onto the Clarkdale and Arrandale CPUs.
That leaves AMD and Nvidia to duke it out in the discrete graphics space, while Intel watches from its cushy vantage point, not really needing a competitive offering. Even when we were dealing with front-side bus-hobbled CPUs, Intel could always outsell Nvidia, AMD, SiS, and VIA chipsets thanks to price and compatibility. System vendors could always trust an Intel CPU paired to an Intel chipset. This isn’t to say that third-party chipsets didn’t work. However, they often required extra effort on the part of the ODM or OEM. As they say, when you have a problem, it's always better to have one throat to choke.
As we start working with more proprietary interconnects like DMI and UMI, Intel and AMD can both deny Nvidia the ability to sell its own compatible chipsets. Particularly now that we no longer need a separate northbridge, the integrated graphics fight is going to be purely AMD versus Intel--that is, until the Delaware courts tell Nvidia otherwise or VIA achieves more than a 1% market share.
As far as graphics performance goes, it's fair to say that Intel has a lot more to prove with Sandy Bridge than AMD does with the upcoming designs in its Fusion program, if only because of the expertise introduced by ATI. Of course, we'll spend more time with Sandy Bridge soon, but today is AMD’s 2010 Financial Analyst Day, and so we can finally spill a few beans on what Fusion will mean in the months and years to come.
Cheers,
Andrew Ku
TomsHardware.com