
Upcoming 64-bit SoCs

An Introduction To The Major ARM-Based SoCs
By Dorian Black

ARMv8 is the instruction set that adds 64-bit processing, hardware acceleration for AES encryption and SHA-2 hashing, 128-bit NEON registers, and full backwards compatibility with apps currently built for the ARMv7 ISA.
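
The crypto features are optional in the ARMv8 spec, so software is expected to check for them at runtime before taking an accelerated path. Here is a minimal sketch of that check in C, assuming an AArch64 Linux or Android target, where the kernel reports optional CPU features through getauxval():

    #include <stdio.h>
    #include <sys/auxv.h>   /* getauxval() */
    #include <asm/hwcap.h>  /* HWCAP_* bits on AArch64 Linux */

    int main(void)
    {
        /* The kernel exposes optional ARMv8 features as HWCAP bits. */
        unsigned long caps = getauxval(AT_HWCAP);

        printf("AES instructions:   %s\n", (caps & HWCAP_AES)   ? "yes" : "no");
        printf("SHA-2 instructions: %s\n", (caps & HWCAP_SHA2)  ? "yes" : "no");
        printf("NEON (AdvSIMD):     %s\n", (caps & HWCAP_ASIMD) ? "yes" : "no");
        return 0;
    }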

Apple A8

Since Apple is usually quite secretive, we don't know much about its next-gen SoC. There have been rumors, though, that it'll operate at up to twice the clock rate (2.6 GHz), which would mean either that Apple found a way to make Cyclone more efficient through 20 nm manufacturing or that it's bringing another architectural revision better able to scale frequency.

The new A8 should also come with PowerVR graphics delivering twice as much performance, in line with what Apple has historically done when upgrading the GPU. The new graphics engine will prove useful, since the upcoming iPhone is rumored to employ a much higher resolution (somewhere between 702p and 1080p) for its 4.7” screen.

Nvidia Tegra K1

Tegra K1 is Nvidia's current highest-end mobile processor, armed with either four Cortex-A15 cores or two of the company's ARMv8-based Denver cores, along with a GPU built on the Kepler architecture made popular by any number of GeForce cards.

The Denver/Kepler combination of Tegra K1 is particularly promising. On one hand, you have a 64-bit ARMv8-based CPU that should be significantly faster than the revised Cortex-A15s in the 32-bit Tegra K1. On the other, there's Nvidia's PC graphics architecture, with a handful of modifications to make it more mobile-friendly. Even so, it supports APIs that other SoCs can't touch: beyond OpenGL ES 3.1, you get OpenGL 4.4, DirectX 11.1 and CUDA.
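
CUDA support in particular means the K1 speaks the same runtime API as desktop GeForce cards. As a minimal sketch in C, assuming the CUDA toolkit for Tegra is installed, this queries the GPU the way any desktop CUDA program would (the K1's Kepler GPU reports compute capability 3.2):

    #include <stdio.h>
    #include <cuda_runtime.h>  /* CUDA runtime API; C-callable */

    int main(void)
    {
        int count = 0;
        struct cudaDeviceProp prop;

        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            puts("No CUDA-capable GPU found.");
            return 1;
        }

        /* Device 0 on a Tegra K1 is the integrated Kepler GPU. */
        cudaGetDeviceProperties(&prop, 0);
        printf("%s: compute capability %d.%d, %d multiprocessor(s)\n",
               prop.name, prop.major, prop.minor, prop.multiProcessorCount);
        return 0;
    }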

The mobile Kepler-based GPU is also specified for performance that should go unmatched by anything except maybe Imagination's upcoming mobile GPUs. Nvidia uses the Xbox 360 and PS3 as its comparison points in marketing material, which should help garner attention for set-top boxes running Android. At least in theory, PC and console games are easier to port over thanks to the comprehensive API support. Whether or not developers jump on that remains to be seen.

While the upcoming Snapdragon and Exynos chips appear to be slight upgrades compared to current products, the most interesting chips coming out in the mobile market this year should be Nvidia’s first ever proprietary CPU design, Denver, and Apple’s A8, both based on the ARMv8 instruction set. Both advanced architectures, combined with the higher clock rates expected from them, should facilitate the "desktop-class" performance we've been told to expect.

Qualcomm, Samsung, Nvidia, and Apple are all important players in the chip market now, and they've individually come a long way. But this is only the beginning for them. It will be interesting to observe how competitive all four vendors are once they have processors based on ARM's 64-bit instruction set, allowing them to compete on a more equal footing.

Comments (15)
  • de5_Roy, August 21, 2014 9:30 AM
    tegra and zune!?! rofl!
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    wonder why mediatek, allwinner left out of the "big players" while nvidia is nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable:
  • therogerwilco, August 21, 2014 10:09 AM
    Yay! ARM chips!
    Half the cost and half the performance!
  • pierrerock, August 21, 2014 10:59 AM
    Power efficient does not mean performance wise ...
  • InvalidError, August 21, 2014 4:15 PM
    Quote:
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....

    The Starks have been dropping like flies. Maybe Nvidia got worried HBO would finish killing them off in the fifth season.
  • adamovera, August 21, 2014 9:19 PM
    Quote:
    tegra and zune!?! rofl!
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    wonder why mediatek, allwinner left out of the "big players" while nvidia is nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable:

    Haven't heard a peep about Stark for a very long time, but the followup article, scheduled for next week, focuses on lesser-known Chinese ARM-based SoCs ;)
  • urbanman2004, August 22, 2014 4:27 PM
    Tegra gives ARM a run for its money
  • Jak Atackka, August 22, 2014 9:35 PM
    I'm interested to see how well the Tegra K1 performs in the market. It would be great if it were successful, because that would push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.

    As ARM chips become more powerful and x86 chips become more power efficient, it won't be long until the two of them meet. I'm curious to see which format will win that war. One thing's for sure, the next decade will be a very exciting time for mobile computing.
  • InvalidError, August 23, 2014 4:14 AM
    Quote:
    I'm interested to see how well the Tegra K1 performs in the market. It would be great if it were successful, because that would push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.

    The Shield Tablet murders its battery in just over two hours when its IGP gets pushed to its limits so I doubt the K1 will be particularly popular for products where small size and long battery life are priorities. If it does manage to succeed, it will be in larger devices that can accommodate larger batteries like Chromebooks and mobile devices specifically designed for mobile gamers.
  • palladin9479, August 23, 2014 10:49 AM
    Tegra 4 was actually pretty powerful graphics-wise. The problem is that it wasn't power efficient and thus got throttled when used in a smartphone. The Shield, on the other hand, actually lets it go full out; it's even got a small heatsink and air vents, which do get hot after you've been using it for a while. The K1 is similar: it provides great visuals and is very powerful, but sucks power and generates heat doing so.
  • Bulat Ziganshin, August 23, 2014 11:51 AM
    everyone reports that the Exynos 5433 will be 64-bit: http://www.droid-life.com/2014/08/20/galaxy-note-4-powered-by-64-bit-exynos-5433-benchmarked-only-beat-by-one-other-chipset/
  • slapshot136, August 25, 2014 12:30 PM
    Is 702p a typo, or is Apple going to break another standard?
  • somebodyspecial, August 25, 2014 2:06 PM
    Quote:
    Quote:
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....

    The Starks have been dropping like flies. Maybe Nvidia got worried HBO would finish killing them off in the fifth season.


    It's not related to the HBO show. Rather, it's related to IRON MAN. As in, Tony Stark. His movies are still doing very well (a billion well, that is, and so is Downey's salary at about $80mil+ for Iron Man 3...LOL). I think it's just delayed until after the Erista/Parker chips. These are superheroes, not HBO characters.

    I believe most of the moves are due to NV (and others) not being able to count on fabs to get what they wanted in their chips (based on previous history), so we have a few stopgap chips now as CYA stuff I guess. A few years ago they probably started wondering, will they get to 20nm ok or not, will they get to 16/14 or not, will finfet be in or out and at what node, will we be able to do 3d stacked ram etc etc. Tons of questions so they put more chips on the roadmap just in case. Wise IMHO, based on fab track records, even if they do seem to be getting their crap together finally for these last few big moves at TSMC/Samsung/GF. TSMC seems to be on schedule and GF/Samsung have swappable process etc now since working together with IBM.

    Maybe you guys should get on the right roadmap instead of the wrong show ;) 
  • somebodyspecial, August 25, 2014 2:17 PM
    Quote:
    Tegra 4 was actually pretty powerful graphics-wise. The problem is that it wasn't power efficient and thus got throttled when used in a smartphone. The Shield, on the other hand, actually lets it go full out; it's even got a small heatsink and air vents, which do get hot after you've been using it for a while. The K1 is similar: it provides great visuals and is very powerful, but sucks power and generates heat doing so.


    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.
    http://anandtech.com/show/8354/tegra-k1-lands-in-acers-newest-chromebook
    Very battery efficient at 11.5-13hrs in a chromebook.

    http://anandtech.com/show/8329/revisiting-shield-tablet-gaming-ux-and-battery-life
    "In addition to the low GPU clocks, we see that the skin temperatures never exceed 34C, which is completely acceptable."
    So once you drop the perf some, instead of running the chip in a way NO GAME will run it (maxed permanently for a test), the temps drop and so does the battery drain. Games don't do what their benchmark does, as they clearly showed in the "revisiting shield" article. Which comically anandtech makes not so easy to find...ROFL. The K1 tag won't get it; you have to hit joshua's articles. AMD's checks are still coming I guess...ROFL. ***cough, AMD PORTAL, cough ***
  • InvalidError, August 25, 2014 7:53 PM
    Quote:
    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.

    If you cannot use the chip for more than 50% of what it is worth without murdering the battery, might as well make the chip 50% weaker in the first place and not have to bother with throttling to make battery life reasonable; it would still beat everything else currently on the market without having to bother with artificially capping its performance. The chip would be a buck or two cheaper to manufacture and yields would likely be better on top of that.
  • somebodyspecial, August 26, 2014 3:56 PM
    Quote:
    Quote:
    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.

    If you cannot use the chip for more than 50% of what it is worth without murdering the battery, might as well make the chip 50% weaker in the first place and not have to bother with throttling to make battery life reasonable; it would still beat everything else currently on the market without having to bother with artificially capping its performance. The chip would be a buck or two cheaper to manufacture and yields would likely be better on top of that.


    Why remove the OPTION to run full out if desired? That is ridiculous. I can plug in and use FULL power all day, which is how most would use it if hooked to a TV as noted in reviews (for gaming with a gamepad).

    You're arguing to limit choice of something they are giving you for free...LOL. Ok. Whatever. Manufacturers can run the chip at whatever they want, as Acer did running it at 2.1 GHz instead of 2.3 GHz, thus giving its Chromebook massive battery life. They can also govern the clocks of the GPU any way they want (sure, I can override whatever they do in most cases, but they can set it at whatever for sale). Should AMD sell all their GPUs with less power because they use more watts than NV cards? That's dumb, and they wouldn't be competitive then.

    The point of having the power in there is you can use it WHEN DESIRED (like next year or the year after, when games using this kind of power actually land). Your way would require a NEW device at that point, because they chose to artificially limit the SoC, forcing a new purchase. Your argument is ridiculous, and even at 750 MHz you get better battery life than at the 852 MHz it's clocked at; no need to drop it to half.

    The user has no idea anything is happening anyway. Bother? Bother who? Logic in the device does all this for you, just like a desktop drops speed when not used. No difference here. The ONLY correct move is giving me full power that won't damage the device (if you get to damage levels, THEN and only THEN is it giving me too much). I'd rather have a super-powered device I could plug in and use for an extra year or two of really intense gaming than be forced to buy a new product because they limited me for ABSOLUTELY no reason. I want the fastest clocks the GPU in my PC can run at, unless it damages the unit. ALWAYS. I'll gladly turn it down if I don't want as much heat in my room (or too much noise), rather than NOT have the ability to use the free power.

    By your logic they should just start shipping all current laptops at half speed (heck, ship everything at half) for great battery...LOL. What? Whatever. I'll plug in when needed and run intense games IN the house, where I have a power outlet.