Samsung Soon to Preview Quad-Core Exynos Processor

The Exynos 4412 is likely to attract most of the attention due to its four ARM Cortex-A9 cores, which are expected to run at 200 MHz to 1.5 GHz. Rumors also suggest that the chip will include four ARM Mali graphics cores, a 64-bit NEON media engine, and a dual-channel memory controller with support for LP-DDR2, DDR2, and DDR3 memory.

As a competitor to Nvidia's Tegra 3, the Exynos 4412 could make its way into tablets and high-end smartphones this year and pave the way for a category often referred to as "superphones". According to Samsung, a quad-core Exynos would be powerful enough to support a phone with an integrated projector and 1080p 3D display.

We also expect Samsung to show its Exynos 5250, which began sampling in November 2011 and is on target for mass production in Q2. The 5250 is a 32 nm, dual-core Cortex-A15 chip running at a clock speed of 2.0 GHz. Samsung claims that the chip can process 14 billion Dhrystone instructions per second, almost twice the performance of the current 1.5 GHz Cortex-A9-based model, which delivers 7.5 billion Dhrystone instructions per second.
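As a sanity check, those figures work out to per-core, per-clock ratings in line with ARM's commonly quoted numbers (roughly 2.5 DMIPS/MHz for the Cortex-A9 and 3.5 DMIPS/MHz for the Cortex-A15). A minimal sketch of the arithmetic, assuming the current 1.5 GHz A9 model is also a dual-core part (its core count isn't stated here):

    # Back-of-the-envelope check of the quoted Dhrystone figures.
    # The 5250 is a stated dual-core; treating the current 1.5 GHz
    # Cortex-A9 model as dual-core is an assumption.
    def dmips_per_mhz_per_core(total_dmips, cores, clock_mhz):
        return total_dmips / (cores * clock_mhz)

    print(dmips_per_mhz_per_core(14_000, 2, 2000))  # 3.5, in line with ARM's ~3.5 DMIPS/MHz for the A15
    print(dmips_per_mhz_per_core(7_500, 2, 1500))   # 2.5, in line with ARM's ~2.5 DMIPS/MHz for the A9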

The 5250 also provides twice the memory bandwidth (12.8 GB/s) and will be able to run displays with a resolution of up to 2560 x 1600 (WQXGA).
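Samsung hasn't said how the 12.8 GB/s is reached; one plausible configuration, assumed here purely for illustration, is two 32-bit channels at 1600 MT/s. The same arithmetic also shows why the headroom matters for a WQXGA panel:

    # Hypothetical memory setup: two 32-bit channels at 1600 MT/s (assumption).
    channels, bytes_per_transfer, transfers_per_sec = 2, 4, 1.6e9
    print(channels * bytes_per_transfer * transfers_per_sec / 1e9)  # 12.8 GB/s

    # Scan-out cost of a 2560 x 1600 panel at 60 Hz with 32-bit color:
    print(2560 * 1600 * 4 * 60 / 1e9)  # ~0.98 GB/s before any rendering work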

  • velocityg4
What I'd be interested in Tom's doing is posting how many gigaflops these CPUs are capable of and how many the GPUs are capable of. From what I've read, it seems to be between 1.5 and 3.6 GFLOPS for a single-core 1 GHz A9. But I don't know how valid those tests are, or whether the 1.5 is only the CPU while the 3.6 is the GPU or GPU + CPU.

Mainly I'm interested because I'd like to see exactly where one of these upcoming top-end A9 CPUs stacks up against a top-of-the-line upcoming 8-core LGA 2011 Xeon and GPU (like the upcoming Radeon HD 7990). Plus, see what year of desktop CPU and GPU are a match for these, to see how wide the actual gulf is between tablet PCs, smartphones, consumer-level notebooks and desktops, and professional/enthusiast desktops and laptops.
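    For what it's worth, the 1.5-3.6 GFLOPS range velocityg4 mentions can be bracketed with a simple peak-throughput formula; the FLOPs-per-cycle values below are illustrative assumptions (they depend heavily on NEON usage and the multiply/add mix), not measured figures:

        # Peak throughput estimate: cores x clock x FLOPs per cycle.
        def peak_gflops(cores, clock_ghz, flops_per_cycle):
            return cores * clock_ghz * flops_per_cycle

        # At 1 GHz, GFLOPS equals FLOPs/cycle, so 1.5-3.6 FLOPs/cycle
        # (assumed values) reproduces the quoted single-core A9 range.
        print(peak_gflops(1, 1.0, 1.5))  # 1.5 GFLOPS
        print(peak_gflops(1, 1.0, 3.6))  # 3.6 GFLOPS
        print(peak_gflops(4, 1.5, 3.6))  # ~21.6 GFLOPS for a hypothetical quad at 1.5 GHz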
  • burnley14
    It's simply amazing to me how far CPUs and GPUs have progressed in the last few years. This would trounce my laptop from just a few years ago.
  • jn77
My contract ends in 11 months. I would love to have a Galaxy Note specced out with 64 GB internal memory and a 64 GB card in the slot, with this processor and a 2560 x 1600 (WQXGA) 5.3-inch screen.
  • nhat11
velocityg4 wrote: ''I'd like to see exactly where one of these upcoming top-end A9 CPUs stacks up against a top-of-the-line upcoming 8-core LGA 2011 Xeon and GPU (like the upcoming Radeon HD 7990)...''
Not even close. ARM's technology sounds impressive, and it is for low-powered units, but when it comes to powering demanding software like Photoshop, rendering games and so on, ARM isn't powerful enough to do that as long as they try to keep power consumption down.

    The video processing in ARM chips is engineered specifically to handle playback, which is why they can't be used for anything high-end like gaming or Photoshop.
  • d-isdumb
    "be powerful enough to support a phone with an integrated projector and 1080p 3D display"
    "will be able to run displays with a resolution of up to 2560 x 1600 (WQXGA)" and we need this power in a phone for what reason?
    Battery life, when are they going to get serious about that. Give a thicker phone with 2-3 times the battery life any day over the Razor and such.
  • jn77
You know what is funny: at Best Buy, Samsung sells a 5-inch tablet marketed as an MP3 player for $249 that is basically the same thing as a Galaxy Note without all the phone parts in it, Wi-Fi only. But the Galaxy Note lists for $749? That is crazy; the 3G/4G electronics are the cheapest part of the phone, and they have perfected them over the last 8 years.

    Cell phones are such a scam... there's like 10,000% markup in them.
  • gregs101
Zingham wrote: ''Promises, promises and yet more promises! I have been waiting since the beginning of 2010 for a tablet that would be worth buying and not made by Apple! So the Tegras suck, Android wasn't optimized well and is too fragmented, screen resolutions are too low, tablets are too heavy with no user-replaceable batteries, tablets are difficult to find or way overpriced... no 3G, no LTE... or simply buggy/badly designed like the Transformer Prime. And just one tablet offers a keyboard dock, and they sell it as if it were made of platinum.
    Other brands like BlackBerry or the TouchPad: unfinished, unpolished and just failed...
    I would never ever pay more money for a tablet than I would pay for a real laptop! That's what manufacturers should understand!
    I would use that gadget primarily as a reading device: books, PDFs, web, because the Kindle doesn't suit me. But as of now Android tablets don't even have a proper PDF reader!''

To be honest, mate, I'm loving my Transformer Prime. I think the Tegra 3 is slick and performs beautifully. It can open PDFs without any additional application being downloaded from the Market. The keyboard dock is ace and so far seems true to its word in providing enormous battery life. If you can find a thin and light laptop in the £500 price range, I'd be very impressed. Not sure what your beef is at all.
  • nforce4max
velocityg4 wrote: ''From what I've read, it seems to be between 1.5 and 3.6 GFLOPS for a single-core 1 GHz A9...''
Well, if those numbers are true, then ARM is already much faster than any Intel Atom, and a quad core is already beating AMD's Bobcat platform in that regard. They've got a long way to go before they can compete with x86, but eventually they might catch up.
  • X-Nemesis
What's the purpose of these high-powered devices when they aren't backed up by improvements in battery life? I know I'm stating the obvious, but battery life and powering these suckers have to be addressed.
  • Cazalan
velocityg4 wrote: ''Mainly I'm interested because I'd like to see exactly where one of these upcoming top-end A9 CPUs stacks up against a top-of-the-line upcoming 8-core LGA 2011 Xeon and GPU (like the upcoming Radeon HD 7990)...''
They still have a way to go to catch up, but the ARM platforms seem to keep doubling in performance each year, whereas the x86 platform nets 10-20%. The bigger question is what the power requirements and price will look like once they do converge on a similar performance level (the catch-up math is sketched below).

Something tells me they won't be that far apart, and the winner at that stage will be whoever has the best development software, including compilers. Even today's x86 chips aren't taken full advantage of, because software lags so far behind the hardware. New instructions get added that take years to be used, with the exception of some really dedicated projects like F@H that jump on new instructions fairly quickly.
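    Cazalan's growth rates imply a concrete catch-up time. A quick sketch, where the 20x starting performance gap is an illustrative assumption rather than a measured figure:

        import math

        # Years until ARM catches x86 if ARM doubles yearly and x86 gains ~15%.
        gap = 20.0                        # assumed starting performance gap
        arm_growth, x86_growth = 2.0, 1.15
        years = math.log(gap) / math.log(arm_growth / x86_growth)
        print(round(years, 1))            # ~5.4 years under these assumptions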