An Introduction To The Major ARM-Based SoCs

Apple's A-Series SoCs

Over the past few years, Apple has demonstrated an increasing desire to create its own custom SoC hardware: tuning the architecture, differentiating functionality, and optimizing for cost.

Apple A-Series

Prior to the iPad and iPhone 4, Apple used only off-the-shelf SoCs from other companies. With the A4 in those devices, however, it began adding more customization. The CPU and GPU were still reference designs (an 800 MHz Cortex-A8 and a PowerVR SGX535), but Apple implemented technology around those complexes to improve battery life.

In 2011, the following year, Apple shipped the A5 SoC in its iPad and iPhone. Also designed by Apple (naturally), the A5 employed a dual-core 1 GHz Cortex-A9 CPU and a PowerVR SGX543MP2 graphics engine, both much faster than their predecessors and far better suited to higher resolutions.

The third-gen iPad emerged in 2012 with the Apple A5X. It was almost identical to the A5 but had twice as many GPU cores, making it up to two times as fast in graphics-bound titles. Even so, Apple's newer GPU couldn't properly drive the iPad's Retina screen: although it was twice as powerful, the display had four times as many pixels, so the new iPad needed even more graphics performance just to match its predecessor. Later that same year, Apple introduced the iPad 4 with the A6X, an SoC that not only quadrupled the A5's graphics potential, but also included a proprietary dual-core "Swift" host processor running at 1.3 GHz. In a great many cases, it was faster than the competition. That was a first for Apple; until then, the company had tended toward conservative host processors, favoring long battery life, which invariably led to weaker benchmark results.

Apple A7

The A7 caught everyone off-guard, especially competing SoC makers. No one was expecting a CPU based on the ARMv8 instruction set to arrive for at least another year. But Apple released one in late 2013.

Even now, Qualcomm, the mobile chip leader, doesn't have its own ARMv8-based architecture. Its only imminent releases are Snapdragon-branded parts built on off-the-shelf Cortex-A53 cores, plus the Snapdragon 808 and 810, which will use off-the-shelf Cortex-A57 cores. Only Nvidia is expected to ship its Denver CPU this year, and Samsung may or may not release an Exynos chip based on Cortex-A57.

Nobody really knows how Apple managed to design and ship an ARMv8 chip so quickly. But it did, and it is milking the marketing for all it's worth, claiming to offer the first desktop-class 64-bit chip for phones and tablets. Competing processor companies were forced to scramble and promise 64-bit devices of their own as soon as possible.

Unlike Swift, which is similar to Qualcomm's Krait, Apple's Cyclone architecture is much wider, able to handle up to six instructions at once (compared to three for Krait and Swift). In the real world, or at least in benchmarks, that seems to translate to roughly 50% higher performance versus Swift at the same 1.3 GHz frequency.

With the 64-bit Cyclone, Apple showed all chip makers, including Intel, that it's serious about making powerful processors.

  • de5_Roy
    tegra and zune!?! rofl!
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    wonder why mediatek, allwinner left out of the "big players" while nvidia in nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable:
    Reply
  • therogerwilco
    Yay! ARM chips!
    Half the cost and half the performance!
    Reply
  • pierrerock
    Power efficient does not mean performance wise ...
    Reply
  • InvalidError
    14007665 said:
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    The Starks have been dropping like flies. Maybe Nvidia got worried HBO would finish killing them off in the fifth season.
    Reply
  • adamovera
    de5_Roy said:
    tegra and zune!?! rofl!
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    wonder why mediatek, allwinner left out of the "big players" while nvidia in nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable:
    Haven't heard peep about Stark for a very long time, but the followup article, scheduled for next week, focuses on lesser-known Chinese ARM-based SoCs ;)
    Reply
  • urbanman2004
    Tegra gives ARM a run for its money
    Reply
  • Jak Atackka
    I'm interested to see how well the Tegra K1 performs in market. It would be great if it was successful, because that will push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.

    As ARM chips become more powerful and x86 chips become more power efficient, it won't be long until the two of them meet. I'm curious to see which format will win that war. One thing's for sure, the next decade will be a very exciting time for mobile computing.
    Reply
  • InvalidError
    14018982 said:
    I'm interested to see how well the Tegra K1 performs in market. It would be great if it was successful, because that will push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.
    The Shield Tablet murders its battery in just over two hours when its IGP gets pushed to its limits so I doubt the K1 will be particularly popular for products where small size and long battery life are priorities. If it does manage to succeed, it will be in larger devices that can accommodate larger batteries like Chromebooks and mobile devices specifically designed for mobile gamers.
    Reply
  • palladin9479
    Tegra 4 was actually pretty powerful graphics-wise. The problem is that it wasn't power efficient and thus got throttled when used in a smartphone. The Shield, on the other hand, actually lets it go full out; it's even got a small heatsink and air vents, which do get hot after you've been using it for a while. The K1 is similar: it provides great visuals and is very powerful, but sucks power and generates heat doing so.
    Reply
  • Bulat Ziganshin
    everyone reports that 5433 will be 64-bit: http://www.droid-life.com/2014/08/20/galaxy-note-4-powered-by-64-bit-exynos-5433-benchmarked-only-beat-by-one-other-chipset/
    Reply