
Samsung's Exynos Line of SoCs

An Introduction To The Major ARM-Based SoCs
By Dorian Black

Unlike Qualcomm, Apple, and now Nvidia, Samsung doesn’t design its own CPUs. Instead, it licenses both the CPU and GPU logic from ARM. However, one advantage Samsung has is that it manufactures the chips itself, and doesn’t have to pay other foundries to do it.

Exynos Series

Samsung shipped its first Exynos SoC, the Exynos 3, in 2010 alongside its highly successful Galaxy S smartphone. The Exynos 3 came with a single-core 1 GHz Cortex-A8 CPU, which, at the time, was the only popular mobile processor in an Android device that wasn't a Snapdragon. Apple launched the iPhone 4 around the same time; it also featured a Cortex-A8, though one clocked at 800 MHz.

A year later came the Exynos 4210, Samsung’s first dual-core chip, which appeared in the company's Galaxy S2. It included Mali-400 graphics, a part that went on to become quite popular in other devices.

Not long after, Samsung introduced the Exynos 4212 in its Galaxy Note, with a CPU complex clocked 200 MHz higher and a GPU purportedly 50% faster. The Note's 1280x800 display packed far more pixels than the Galaxy S2's 800x480 screen, which necessitated the extra graphics processing power.
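
To put that in perspective, here is a quick back-of-the-envelope calculation (ours, not from the original spec sheets) showing that the Note pushes roughly 2.7 times as many pixels per frame:

    # Pixels per frame on each display
    note_pixels = 1280 * 800   # Galaxy Note: 1,024,000 pixels
    s2_pixels = 800 * 480      # Galaxy S2:     384,000 pixels

    ratio = note_pixels / s2_pixels
    print(f"The Note pushes {ratio:.2f}x the pixels of the S2 per frame")  # ~2.67x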

In 2012, Samsung shipped its first quad-core chip, Exynos 4 Quad, inside the Galaxy S3 and Galaxy Note 2 smartphones. It also used 20% less power than the previous-generation SoC, while retaining a quad-core Mali-400 GPU.

Towards the end of the year, Samsung began using the Cortex-A15-based Exynos 5 Dual in its Nexus 10 tablet. The processor was significantly more powerful than even the Exynos 4 Quad, and it had Mali-T604 graphics built on ARM's new unified-shader Midgard architecture. The chip was also used in the most popular Chromebook of the time, helping that machine hit a low $250 price point while offering respectable performance.

Exynos 5 Octa

In 2013, Samsung shipped its first eight-core chip for the Galaxy S4 and Galaxy Note 3. It was named Exynos 5 Octa, or Exynos 5410.

Exynos 5 Octa was the first SoC based on ARM's big.LITTLE configuration, pairing a quad-core Cortex-A7 cluster with a quad-core Cortex-A15 cluster; the chip switched between them, running on the A7s to save battery life or on the A15s for maximum performance.
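
As a rough illustration of how that cluster switching works, the sketch below models the decision as a simple load-threshold policy with hysteresis. The threshold values and names are hypothetical and are not taken from Samsung's firmware:

    # Hypothetical sketch of big.LITTLE cluster migration:
    # only one cluster is active at a time, chosen by recent CPU load.
    A7_CLUSTER = "Cortex-A7"    # low power, lower peak performance
    A15_CLUSTER = "Cortex-A15"  # high performance, higher power draw

    UP_THRESHOLD = 0.80    # assumed: move to the A15s when load climbs past this
    DOWN_THRESHOLD = 0.30  # assumed: fall back to the A7s when load drops below this

    def pick_cluster(current_cluster: str, load: float) -> str:
        """Return the cluster that should run the workload next."""
        if current_cluster == A7_CLUSTER and load > UP_THRESHOLD:
            return A15_CLUSTER
        if current_cluster == A15_CLUSTER and load < DOWN_THRESHOLD:
            return A7_CLUSTER
        return current_cluster  # hysteresis: stay put between the thresholds

    # Example: a burst of load migrates work to the A15s, idling drops it back.
    cluster = A7_CLUSTER
    for load in (0.10, 0.95, 0.90, 0.20):
        cluster = pick_cluster(cluster, load)
        print(f"load={load:.2f} -> running on {cluster}")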

The part ended up with several issues, many related to firmware. That wasn't surprising given the technology's newness: Samsung's engineers were still learning how to make the Cortex-A7s and -A15s work well together in the most energy-efficient way. Because of those problems, and because Qualcomm offered better LTE modem integration in its SoCs, Samsung continues to use Snapdragon processors in most of its markets.

To make matters even more complicated, Samsung also switched back to PowerVR graphics for this SoC instead of the Mali architecture, presumably because ARM's Mali-T628 wasn't ready and the company had no other choice.

This year, Samsung changed direction again and went back to Mali graphics, this time a Mali-T628 MP6, and it appears to have fixed the Exynos 5 Octa's issues in a new Exynos 5422 revision. That revision also sports a 300 MHz boost to its Cortex-A15 cores and an extra 100 MHz for the Cortex-A7 cores. Moreover, Samsung added a mode that lets all eight cores operate at the same time when needed, rather than keeping the two clusters completely separate.
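
That new mode is easier to picture as a task-placement policy in which both clusters are available at once. The sketch below is a hypothetical mock-up (made-up task names and load cutoff), not Samsung's actual scheduler, but it shows the basic idea of heavy work landing on the Cortex-A15s while light work stays on the Cortex-A7s:

    # Hypothetical sketch of the "all eight cores at once" mode:
    # heavy tasks go to Cortex-A15 cores, light tasks to Cortex-A7 cores,
    # and both clusters can be busy simultaneously.
    BIG_CORES = [f"A15-{i}" for i in range(4)]
    LITTLE_CORES = [f"A7-{i}" for i in range(4)]

    HEAVY_LOAD = 0.60  # assumed cutoff between "heavy" and "light" tasks

    def place_tasks(tasks: dict[str, float]) -> dict[str, str]:
        """Assign each task to a core, round-robin within the chosen cluster."""
        placement = {}
        big_i = little_i = 0
        for name, load in tasks.items():
            if load >= HEAVY_LOAD:
                placement[name] = BIG_CORES[big_i % len(BIG_CORES)]
                big_i += 1
            else:
                placement[name] = LITTLE_CORES[little_i % len(LITTLE_CORES)]
                little_i += 1
        return placement

    # Example: a game's render thread runs on an A15 while background sync
    # stays on an A7 -- both clusters active at the same time.
    print(place_tasks({"render": 0.9, "audio": 0.2, "sync": 0.1, "physics": 0.7}))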

Comments

  • de5_Roy, August 21, 2014 9:30 AM
    tegra and zune!?! rofl!
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    wonder why mediatek, allwinner left out of the "big players" while nvidia in nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable: 
  • therogerwilco, August 21, 2014 10:09 AM
    Yay! ARM chips!
    Half the cost and half the performance!
  • pierrerock, August 21, 2014 10:59 AM
    Power efficient does not mean performance wise ...
  • InvalidError, August 21, 2014 4:15 PM
    Quote:
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....

    The Starks have been dropping like flies. Maybe Nvidia got worried HBO would finish killing them off in the fifth season.
  • adamovera, August 21, 2014 9:19 PM
    Quote:
    tegra and zune!?! rofl!
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....
    wonder why mediatek, allwinner left out of the "big players" while nvidia in nowhere on the "competitive landscape" or anything that qualifies as such. :pt1cable: 

    Haven't heard peep about Stark for a very long time, but the followup article, scheduled for next week, focuses on lesser-known Chinese ARM-based SoCs ;) 
  • urbanman2004, August 22, 2014 4:27 PM
    Tegra gives ARM a run for its money
  • Jak Atackka, August 22, 2014 9:35 PM
    I'm interested to see how well the Tegra K1 performs in market. It would be great if it was successful, because that will push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.

    As ARM chips become more powerful and x86 chips become more power efficient, it won't be long until the two of them meet. I'm curious to see which format will win that war. One thing's for sure, the next decade will be a very exciting time for mobile computing.
  • InvalidError, August 23, 2014 4:14 AM
    Quote:
    I'm interested to see how well the Tegra K1 performs in market. It would be great if it was successful, because that will push Qualcomm and other manufacturers to develop more powerful chips as well. Competition benefits us consumers, and technology as a whole.

    The Shield Tablet murders its battery in just over two hours when its IGP gets pushed to its limits so I doubt the K1 will be particularly popular for products where small size and long battery life are priorities. If it does manage to succeed, it will be in larger devices that can accommodate larger batteries like Chromebooks and mobile devices specifically designed for mobile gamers.
  • palladin9479, August 23, 2014 10:49 AM
    Tegra 4 was actually pretty powerful graphics-wise. The problem is that it wasn't power efficient and thus got throttled when used in a smartphone. The Shield on the other hand actually lets it go full out, it's even got a small heatsink and air vents which do get hot after you've been using it for a while. The K1 is similar, it provides great visuals and is very powerful, but sucks power and generates heat doing so.
  • Bulat Ziganshin, August 23, 2014 11:51 AM
    everyone reports that 5433 will be 64-bit: http://www.droid-life.com/2014/08/20/galaxy-note-4-powered-by-64-bit-exynos-5433-benchmarked-only-beat-by-one-other-chipset/
  • slapshot136, August 25, 2014 12:30 PM
    Is 702p a typo, or is Apple going to break another standard?
  • somebodyspecial, August 25, 2014 2:06 PM
    Quote:
    Quote:
    the stark soc seems to have vanished from the latest soc roadmap... wonder what happened to it....

    The Starks have been dropping like flies. Maybe Nvidia got worried HBO would finish killing them off in the fifth season.


    It's not related to the HBO show. Rather, it's related to IRON MAN, as in Tony Stark. His movies are still doing very well (a billion well that is, and so is Downey's salary at about $80mil+ for Iron Man 3...LOL). I think it's just delayed after the Erista/Parker chips. These are superheroes, not HBO characters.

    I believe most of the moves are due to NV (and others) not being able to count on fabs to get what they wanted in their chips (based on previous history), so we have a few stopgap chips now as CYA stuff I guess. A few years ago they probably started wondering, will they get to 20nm ok or not, will they get to 16/14 or not, will finfet be in or out and at what node, will we be able to do 3d stacked ram etc etc. Tons of questions so they put more chips on the roadmap just in case. Wise IMHO, based on fab track records, even if they do seem to be getting their crap together finally for these last few big moves at TSMC/Samsung/GF. TSMC seems to be on schedule and GF/Samsung have swappable process etc now since working together with IBM.

    Maybe you guys should get on the right roadmap instead of the wrong show ;) 
  • somebodyspecial, August 25, 2014 2:17 PM
    Quote:
    Tegra 4 was actually pretty powerful graphics wise. The problem is that it wasn't power efficient and thus got throttled when used in a smartphone. The Shield on the other hand actually lets it go full out, it's even got a small heatsink and airvents which do get hot after you've been using it for awhile. The K1 is similiar, it provides great visuals and is very powerful, but sucks power and generates heat doing so.


    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.
    http://anandtech.com/show/8354/tegra-k1-lands-in-acers-newest-chromebook
    Very battery efficient at 11.5-13hrs in a chromebook.

    http://anandtech.com/show/8329/revisiting-shield-tablet-gaming-ux-and-battery-life
    "In addition to the low GPU clocks, we see that the skin temperatures never exceed 34C, which is completely acceptable."
    So once you drop the perf some instead of running the chip in a way NO GAME will run it (maxed permanently for a test), the temps drop and so does battery. Games don't do what their benchmark does as they clearly showed in the "revisiting shield" article. Which comically anandtech make not so easy to find...ROFL. K1 tag won't get it, you have to hit joshua's articles. AMD's checks are still coming I guess...ROFL. ***cough, AMD PORTAL, cough ***
  • InvalidError, August 25, 2014 7:53 PM
    Quote:
    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.

    If you cannot use the chip for more than 50% of what it is worth without murdering the battery, might as well make the chip 50% weaker in the first place and not have to bother with throttling to make battery life reasonable; it would still beat everything else currently on the market without having to bother with artificially capping its performance. The chip would be a buck or two cheaper to manufacture and yields would likely be better on top of that.
  • somebodyspecial, August 26, 2014 3:56 PM
    Quote:
    Quote:
    But you don't HAVE to run it full out as anandtech showed, you can drop it to a 30fps limit (still beating everyone else) and avoid the problems you're talking about with battery and power sucking. They also mentioned it isn't bad on heat.

    If you cannot use the chip for more than 50% of what it is worth without murdering the battery, might as well make the chip 50% weaker in the first place and not have to bother with throttling to make battery life reasonable; it would still beat everything else currently on the market without having to bother with artificially capping its performance. The chip would be a buck or two cheaper to manufacture and yields would likely be better on top of that.


    Why remove the OPTION to run full out if desired? That is ridiculous. I can plug in and use FULL power all day, which is how most would use it if hooked to a TV as noted in reviews (for gaming with a gamepad).

    You're arguing to limit choice of something they are giving you for free...LOL. Ok. Whatever. Manufacturers can put the chip at whatever they want, as Acer did running it at 2.1ghz instead of 2.3ghz, thus giving it massive battery life for their chromebook. They can also govern the clocks of the gpu any way they want (sure I can override whatever they do in most cases, but they can set it at whatever for sale). Should AMD sell all their gpus with less power because they use more watts than NV cards? That's dumb and they wouldn't be competitive then.

    The point of having the power in there is you can use it WHEN DESIRED (like next year or the year after when games using this kind of power actually land). Your way, would have you require a NEW device at that point because they chose to artificially limit the soc forcing a new purchase. Your argument is ridiculous and even at 750 you get better battery than 852mhz it's clocked at, no need to drop it to half.

    The user has no idea anything is happening anyway. Bother? Bother who? Logic in the device does all this for you, just like a desktop drops speed when not used etc. No difference here. The ONLY correct move is giving me full power that won't damage the device (if you get to damage levels THEN and only THEN is it giving me too much). I'd rather have a super-powered device I could plug in and use for an extra year or two in really intense gaming, than be forced to buy a new product because they limited me for ABSOLUTELY no reason. I want the fastest clocks my gpu in my pc can run at, unless it damages the unit. ALWAYS. I'll gladly turn it down if I don't want as much heat in my room etc (or too noisy), rather than NOT have the ability to use the free power.

    By your logic they should just start shipping all current laptops as half speeds (heck ship everything at half), for great battery...LOL. What? Whatever. I'll plugin when needed and run intense games IN the house where I have a power outlet.