A History Of Mobile GPUs
In 2011, the dominant players in mobile graphics were PowerVR and Qualcomm’s Adreno. Both trace their roots back to failed x86 gaming products.
PowerVR is the most prominent contender, as Intel, Apple, and even TI use its technology. Back in the 90s, Imagination Technologies (then known as VideoLogic) started working on an infinite-planes, deferred-rendering graphics chip with the assistance of NEC. The PowerVR architecture was the first consumer deferred renderer, in which only visible pixels were drawn; occluded, or covered, pixels were thrown away. At that time, other graphics chips were drawing everything, even if the person on the other side of the screen would never see the rendered output. Thus, PowerVR made much better use of available memory bandwidth, and its effective fill rate was higher, too.
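A minimal sketch of why that matters, in hypothetical Python (a toy model, not PowerVR's actual hardware pipeline): an immediate-mode renderer spends shading and texturing work on every fragment it receives, including ones that a later surface will cover, while a deferred renderer resolves visibility first and shades only the front-most fragment of each pixel.

```python
# Toy model: each pixel receives a list of fragment depths, one per
# surface that covers it. Shading work is counted per fragment shaded.

def immediate_mode(fragments_per_pixel):
    """Shade every fragment as it arrives; nearest wins via depth test."""
    shaded = 0
    for frags in fragments_per_pixel:
        nearest = float("inf")
        for depth in frags:
            # Classic chips textured and shaded the fragment, then
            # depth-tested it, so work is spent even on hidden surfaces.
            shaded += 1
            nearest = min(nearest, depth)
    return shaded

def deferred_mode(fragments_per_pixel):
    """Resolve visibility first; shade only the surviving fragment."""
    shaded = 0
    for frags in fragments_per_pixel:
        if frags:
            min(frags)   # visibility pass: cheap depth comparisons only
            shaded += 1  # exactly one fragment per pixel gets shaded
    return shaded

# A scene with depth complexity 4: four overlapping surfaces per pixel.
scene = [[3.0, 1.0, 2.0, 4.0]] * 1000
print(immediate_mode(scene))  # 4000 fragments shaded
print(deferred_mode(scene))   # 1000 fragments shaded
```

With four layers of overdraw, the deferred approach touches texture memory a quarter as often, which is where the bandwidth and fill-rate advantage comes from.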
The problem with PowerVR was that its team was built by visionary mathematicians and engineers without experience in chip-building and gaming. The original PowerVR PCX chip lacked bilinear filtering, which meant that buyers of the $300 graphics card saw the same pixelated textures as anyone gaming on the original PlayStation, rather than the smooth, bilinear-filtered images associated with the Nintendo 64 and 3dfx graphics cards of the same period. It wasn't that bilinear filtering was a particularly challenging concept; it just wasn't something the engineers thought to include when they developed the chip. VideoLogic quickly came back with the PowerVR PCX2, which had a higher clock rate and bilinear filtering. Unfortunately, the PowerVR team still wasn't run by people with game development expertise, and as a result it did not anticipate the need for src*dst texture blending. That mode was required for colored lighting: the effect behind awesome explosions, laser beams, and alien-looking hallways. Again, this wasn't any sort of technical challenge, but rather a matter of simply not thinking about the need for this texture blending mode.
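The blending mode itself is trivial arithmetic, which is what made its omission so frustrating. A hedged sketch (hypothetical Python with my own helper name, assuming 8-bit color channels): src*dst blending just multiplies the incoming light color against the texel already in the framebuffer, channel by channel.

```python
def blend_src_dst(src, dst):
    """Multiplicative (src*dst) blend over 8-bit RGB channels.

    src: incoming light polygon color; dst: framebuffer texel.
    White light (255) leaves the texture unchanged; black removes it.
    """
    return tuple((s * d) // 255 for s, d in zip(src, dst))

wall_texel = (200, 180, 160)   # textured wall already in the framebuffer
red_light  = (255, 64, 64)     # colored light polygon drawn over it
print(blend_src_dst(red_light, wall_texel))  # reddish-lit wall: (200, 45, 40)
```

Drawing the light geometry with this mode tints whatever is underneath, which is exactly the cheap colored-lighting trick games of that era relied on.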
Everything was supposed to change with PowerVR Series 2, the platform used for Sega's Dreamcast. A PC equivalent could have been the most popular graphics chip in the industry. Unfortunately, VideoLogic ran into problem after problem with its chip design. It would tape out, get prototypes back, and then discover a fatal glitch somewhere. One of the last problems had to do with the Windows hardware mouse cursor. Again, that wasn't an engineering challenge, but it was a mistake nonetheless. The failure of PowerVR Series 2 in the PC world was ultimately what caused the company to exit the high-performance market and focus on low-power designs. It did have a short run of Series 3 chips, which lacked a hardware transform-and-lighting engine, and it saw some financial success powering digital poker machines in casinos, where the bandwidth savings of deferred rendering paid off immediately.
PowerVR then switched from being a true graphics chip manufacturer to being a chip designer, licensing its work the same way ARM does. This was the smartest move it could have made, because it meant concentrating on core competencies in math and block design rather than making sure the logic was laid out ideally for a physical product. Since the company's original architecture from the late 90s was already engineered for multi-chip designs, it has been easy for PowerVR to continue to grow and evolve into the superb platform it offers today.
Putting this sentence aside, it's an interesting article.
Finally, I would say I did not like the sweeping claims that Intel has never failed in the fab or has always delivered great platforms: they have been delayed a bit on their latest process, and I would not consider the original Atoms great for running Windows. I like Intel and own their stock, so I hope they do well, but I think they face more of an uphill battle than you see. It's not that people expected them to enter this market in an uncompetitive position; I just feel they are a disconnected fit for it (and this could just be me). I have read money-market people say that Intel will have a harder time entering the smartphone market with ARM's market share expanding greatly in the next three years. I like the idea of the pairing with Motorola for their chips, because I think that will a) tie them to Android (as I think MeeGo is dead) and b) may let them offer a solution akin to what the Atrix ideal could have been. Overall, an interesting article about future challenges with fab and design.
You look at just Intel and Qualcomm, ignoring players that are more than capable of competing.
You also assume that performance is the most important aspect, when in the end the reality is that CPUs are getting cheaper, a lot cheaper, and those cheap chips will keep gaining market share while Intel can't match those prices without getting crippled. Servers and a growing market will help Intel for a while, but at some point the funds available for R&D and fabs will start to shrink. (BTW, my post, unlike this article, is not sponsored by anyone.)
Also (and more importantly), will software help Intel the same way it did during the Wintel dominance? Microsoft itself has planned Windows 8 to have lower resource requirements than Windows 7 has now. Will there be any need for "above the ARM level" performance in the coming years?
Also (and even more importantly), how will Intel cope with the mounting pressure on its chip prices? If Intel is not able to hold those prices high enough, it could quickly lose the revenue it is getting now.
In other words: during those three years Intel's wares may become a commodity, where only price or price/performance counts. Even now, as noted in today's news by Digitimes:
"TSMC seeing 3G chip orders boom, sources say
Qualcomm, MediaTek and Broadcom have all introduced their more integrated single-chip solutions targeted at the market for low-priced 3G smartphones in China. Each of the new chips - manufactured using 40nm and below node technologies - accounts for less than US$10 of total component cost a model would carry, the sources pointed out."
How will Intel compete with that, not in three years, but in 2012? Then in 2013? And finally in 2014?
So, given all of the above, I couldn't subscribe to your prophecy at all!
Though at the end of the article, Christian Bale didn't have a twin.
The most important piece of the jigsaw is missing: power consumption. But you would expect that from somebody fixated on performance. Intel will struggle to make x86 work in anything other than tablets and high-end handsets; it will have a tiny niche in three years, if it is lucky. And with MS opening up Windows, they will lose share in thin clients and laptops.