An Introduction To The Major ARM-Based SoCs

When it comes to mobility, ARM-based SoCs dominate. Join us as we take a look at where the flagship processors from Qualcomm, Apple, Nvidia, and Samsung land in the competitive landscape.

History

ARM is a multinational fabless semiconductor and software design company based in Cambridge, England. It was founded in 1990 as a joint venture between Acorn Computers, Apple, and VLSI Technology. The name originally stood for “Acorn RISC Machine”; at incorporation in 1990 the company became Advanced RISC Machines Ltd., and when it went public in 1998 the name changed again to ARM Holdings.

Throughout its history, ARM has acquired multiple companies. One of the most notable was 3D graphics firm Falanx, whose technology paved the way for Mali and turned ARM from mainly a CPU IP designer into a provider of graphics processing as well. Mali GPUs are perhaps best known for their role in Samsung's Exynos chips, but they also appear in many SoCs from China. Imagination remains the leader in mobile graphics, but thanks to those Chinese chip companies and the popularity of Samsung's devices, Mali's market share is on a steady rise.

ARM's current lineup is built mainly on the ARMv7 instruction set, and the company is deprecating the ARMv6 ISA (even at the lowest end of the market), with ARM11 CPUs based on ARMv6 being replaced by the ARMv7-based Cortex-A5 and -A7. ARM is also about to transition to the brand-new 64-bit ISA, ARMv8, which underpins the upcoming Cortex-A53 and Cortex-A57, along with the already-shipping Apple A7 and Nvidia's upcoming Denver CPU core.
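
To make that 32-/64-bit split a little more concrete, here is a minimal C sketch (a generic illustration, not tied to any particular SoC) showing how a program can tell at compile time which ARM ISA it was built for; it relies on the __arm__ and __aarch64__ macros that GCC and Clang define for their ARM targets.

    #include <stdio.h>

    int main(void)
    {
        /* GCC and Clang define __aarch64__ when targeting 64-bit ARMv8 (AArch64)
           and __arm__ when targeting 32-bit ARM, such as ARMv7. */
    #if defined(__aarch64__)
        printf("Built for 64-bit ARMv8 (AArch64)\n");
    #elif defined(__arm__)
        printf("Built for 32-bit ARM (e.g. ARMv7)\n");
    #else
        printf("Not built for an ARM target\n");
    #endif
        return 0;
    }

Built with an AArch64 toolchain (aarch64-linux-gnu-gcc, for example), the first branch is compiled in; the same source built with an ARMv7 toolchain takes the second.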

Business Model

Unlike Intel, ARM doesn't build its own chips; it only designs them. Companies with access to manufacturing then license the intellectual property. ARM's approach allowed its technology to become ubiquitous in the mobile market, and ARM-based processors are now in billions of devices.

ARM currently sells a variety of CPU and GPU IP, from ultra-efficient and low-end CPU cores like Cortex-A5 and -A7, to higher-performance ones like Cortex-A15, as well as the new ARMv8-based Cortex-A53 and -A57. At the same time, it has granted architecture licenses to companies like Qualcomm, Apple, and, more recently, Nvidia, which all have proprietary SoCs based on the ARMv7 or ARMv8 ISA. This model allows the flexibility needed for each contender in the space to differentiate, giving us the diversity in platforms we enjoy today.

15 comments
  • de5_Roy
    tegra and zune!?! rofl!
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    wonder why mediatek and allwinner were left out of the "big players" while nvidia is nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable:
  • therogerwilco
    Yay! ARM chips!
    Half the cost and half the performance!
  • pierrerock
    Power efficient does not mean high performance ...
  • InvalidError
    585683 said:
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....

    The Starks have been dropping like flies. Maybe Nvidia got worried HBO would finish killing them off in the fifth season.
  • adamovera
    Quote:
    tegra and zune!?! rofl! the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it.... wonder why mediatek and allwinner were left out of the "big players" while nvidia is nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable:

    Haven't heard a peep about Stark for a very long time, but the follow-up article, scheduled for next week, focuses on lesser-known Chinese ARM-based SoCs ;)
  • urbanman2004
    Tegra gives ARM a run for its money
  • Jak Atackka
    I'm interested to see how well the Tegra K1 performs in market. It would be great if it was successful, because that will push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.

    As ARM chips become more powerful and x86 chips become more power efficient, it won't be long until the two of them meet. I'm curious to see which format will win that war. One thing's for sure, the next decade will be a very exciting time for mobile computing.
  • InvalidError
    749851 said:
    I'm interested to see how well the Tegra K1 performs in market. It would be great if it was successful, because that will push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.

    The Shield Tablet murders its battery in just over two hours when its IGP gets pushed to its limits, so I doubt the K1 will be particularly popular for products where small size and long battery life are priorities. If it does manage to succeed, it will be in larger devices that can accommodate larger batteries, such as Chromebooks and mobile devices designed specifically for mobile gamers.
  • palladin9479
    Tegra 4 was actually pretty powerful graphics-wise. The problem is that it wasn't power efficient and thus got throttled when used in a smartphone. The Shield, on the other hand, actually lets it go full out; it's even got a small heatsink and air vents, which do get hot after you've been using it for a while. The K1 is similar: it provides great visuals and is very powerful, but sucks power and generates heat doing so.
  • slapshot136
    Is 702p a typo, or is Apple going to break another standard?
  • somebodyspecial
    125865 said:
    585683 said:
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    The Starks have been dropping like flies. Maybe Nvidia got worried HBO would finish killing them off in the fifth season.


    It's not related to the HBO show. Rather, it's related to IRON MAN. As in, Tony Stark. His movies are still doing very well (a billion well, that is, and so is Downey's salary at about $80mil+ for Iron Man 3...LOL). I think it's just been delayed until after the Erista/Parker chips. These are superheroes, not HBO characters.

    I believe most of the moves are due to NV (and others) not being able to count on fabs to get what they wanted in their chips (based on previous history), so we have a few stopgap chips now as CYA stuff I guess. A few years ago they probably started wondering, will they get to 20nm ok or not, will they get to 16/14 or not, will finfet be in or out and at what node, will we be able to do 3d stacked ram etc etc. Tons of questions so they put more chips on the roadmap just in case. Wise IMHO, based on fab track records, even if they do seem to be getting their crap together finally for these last few big moves at TSMC/Samsung/GF. TSMC seems to be on schedule and GF/Samsung have swappable process etc now since working together with IBM.

    Maybe you guys should get on the right roadmap instead of the wrong show ;)
  • somebodyspecial
    179891 said:
    Tegra 4 was actually pretty powerful graphics-wise. The problem is that it wasn't power efficient and thus got throttled when used in a smartphone. The Shield, on the other hand, actually lets it go full out; it's even got a small heatsink and air vents, which do get hot after you've been using it for a while. The K1 is similar: it provides great visuals and is very powerful, but sucks power and generates heat doing so.


    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.
    http://anandtech.com/show/8354/tegra-k1-lands-in-acers-newest-chromebook
    Very battery efficient at 11.5-13hrs in a chromebook.

    http://anandtech.com/show/8329/revisiting-shield-tablet-gaming-ux-and-battery-life
    "In addition to the low GPU clocks, we see that the skin temperatures never exceed 34C, which is completely acceptable."
    So once you drop the perf some instead of running the chip in a way NO GAME will run it (maxed permanently for a test), the temps drop and so does the battery drain. Games don't do what their benchmark does, as they clearly showed in the "revisiting shield" article. Which comically anandtech makes not so easy to find...ROFL. The K1 tag won't get it, you have to hit Joshua's articles. AMD's checks are still coming I guess...ROFL. ***cough, AMD PORTAL, cough ***
  • InvalidError
    925801 said:
    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.

    If you cannot use the chip for more than 50% of what it is worth without murdering the battery, might as well make the chip 50% weaker in the first place and not have to bother with throttling to make battery life reasonable; it would still beat everything else currently on the market without having to bother with artificially capping its performance. The chip would be a buck or two cheaper to manufacture and yields would likely be better on top of that.
  • somebodyspecial
    125865 said:
    925801 said:
    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.
    If you cannot use the chip for more than 50% of what it is worth without murdering the battery, might as well make the chip 50% weaker in the first place and not have to bother with throttling to make battery life reasonable; it would still beat everything else currently on the market without having to bother with artificially capping its performance. The chip would be a buck or two cheaper to manufacture and yields would likely be better on top of that.


    Why remove the OPTION to run full out if desired? That is ridiculous. I can plug in and use FULL power all day, which is how most would use it if hooked to a TV as noted in reviews (for gaming with a gamepad).

    You're arguing to limit choice of something they are giving you for free...LOL. Ok. Whatever. Manufacturers can put the chip at whatever they want, as Acer did running it at 2.1ghz instead of 2.3ghz, thus giving it massive battery life for their chromebook. They can also govern the clocks of the gpu any way they want (sure I can override whatever they do in most cases, but they can set it at whatever for sale). Should AMD sell all their gpus with less power because they use more watts than NV cards? That's dumb and they wouldn't be competitive then.

    The point of having the power in there is that you can use it WHEN DESIRED (like next year or the year after, when games using this kind of power actually land). Your way would require a NEW device at that point, because they chose to artificially limit the SoC, forcing a new purchase. Your argument is ridiculous, and even at 750MHz you get better battery life than at the 852MHz it's clocked at; there's no need to drop it to half.

    The user has no idea anything is happening anyway. Bother? Bother who? Logic in the device does all this for you, just like a desktop drops speed when not used etc. No difference here. The ONLY correct move is giving me full power that won't damage the device (if you get to damage levels THEN and only THEN is it giving me too much). I'd rather have a super-powered device I could plug in and use for an extra year or two in really intense gaming, than be forced to buy a new product because they limited me for ABSOLUTELY no reason. I want the fastest clocks my gpu in my pc can run at, unless it damages the unit. ALWAYS. I'll gladly turn it down if I don't want as much heat in my room etc (or too noisy), rather than NOT have the ability to use the free power.

    By your logic they should just start shipping all current laptops at half speed (heck, ship everything at half) for great battery...LOL. What? Whatever. I'll plug in when needed and run intense games IN the house, where I have a power outlet.