Qualcomm Snapdragon 805 Performance Preview

The 805 Extends Snapdragon’s Performance Advantage

Snapdragon 805 turns out to be much more interesting than the 801, which we tested in Qualcomm Snapdragon 801: Performance Previewed. While Snapdragon 801 was a mere clock rate bump over the 800 and didn't offer any architectural changes, Snapdragon 805 introduces us to Qualcomm's next-generation Adreno GPU.

The Adreno 420 sees improvements along the entire length of the rendering pipeline, from an improved z-buffer to tuned ROPs. Qualcomm won't say for sure, but there are likely additional TMUs fed by copious amounts of memory bandwidth, and larger texture and L2 caches. All of those improvements lead to impressive performance gains over the Adreno 330 in Snapdragon 800/801, placing the 805 firmly ahead of the PowerVR G6430.
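For context on those bandwidth claims, peak theoretical DRAM bandwidth is simply bus width times effective transfer rate. Here's a minimal sketch in Python, assuming the widely reported dual-channel 64-bit LPDDR3-1600 configuration for Snapdragon 805 (those figures are assumptions for illustration, not numbers from this preview):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    """Peak theoretical bandwidth: bytes per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfer_rate_mt_s * 1e6 / 1e9

# Two 64-bit LPDDR3 channels at 1600 MT/s (assumed configuration):
print(peak_bandwidth_gb_s(128, 1600))  # 25.6 GB/s
```

The same formula explains why desktop graphics cards still dwarf mobile SoCs here: widen the bus and raise the transfer rate, and the product scales linearly.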

While Adreno 420’s benchmark performance is impressive, does it push the envelope far enough to outperform the PowerVR Series6XT GPU due to arrive later this year? And, equally important, does it compete with Nvidia’s Kepler-powered Tegra K1? If initial performance figures from Nvidia are to be believed, the Adreno 420’s benchmark dominance may be short-lived.

We'll have to wait until 2015 and Snapdragon 810 to see any significant changes to the CPU complex. For now, Krait 450's tuned circuit layer delivers a higher maximum frequency, at least on paper. While our single-core CPU benchmarks confirm performance gains commensurate with a clock rate increase, Snapdragon 805 struggles to achieve its peak frequency with all four cores active. We can’t blame thermal throttling, since the 805 we tested was housed in a large tablet with a cool-running chassis. Also, we spread the benchmarks over several of these reference platforms, which helped keep heat build-up at bay. Keep in mind though that these were development tablets, not shipping units. So, Qualcomm’s software stack may not be fully optimized, or perhaps the company is using a conservative frequency scaling algorithm to keep SoC temperatures under control. This is a topic to revisit once retail devices start shipping.

With the Krait family of CPUs, Qualcomm opted for clock frequency over pipeline width and complexity. This strategy still works, but I don't see it being viable long-term. We’ve already seen a similar strategy fail on the desktop. Remember Intel's Pentium 4? The CPU/SoC either runs into a power/thermal wall or the weakest link in the pipeline becomes a bottleneck that prevents further scaling. Getting more work done in the same amount of time through IPC improvements can have a detrimental impact on power consumption. However, racing to get as much of the SoC back to sleep as possible, along with clever power gating, helps mitigate some of that. I suspect that Snapdragon 810’s new 64-bit architecture will look more like Apple’s Cyclone CPU than Krait.
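The frequency-versus-width tradeoff above follows the textbook identity throughput = IPC × clock. The design points below are purely hypothetical illustrations of the two strategies, not measured figures for Krait or Cyclone:

```python
def throughput_gips(ipc: float, clock_ghz: float) -> float:
    # Billions of instructions retired per second: IPC times clock rate.
    return ipc * clock_ghz

# Hypothetical design points (illustrative numbers only):
narrow_fast = throughput_gips(ipc=1.5, clock_ghz=2.7)  # clocks over width
wide_slow = throughput_gips(ipc=3.0, clock_ghz=1.4)    # width over clocks
```

A wide core can finish the same work at a lower clock and race back to sleep sooner; the price is a larger, hungrier core while it's awake, which is why power gating matters to this strategy.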

With its more powerful GPU, Snapdragon 805 seems best suited to high-resolution tablets and smartphones with large screens. Larger form factors also provide more thermal headroom for Krait 450’s higher frequencies. It’s likely we’ll see Snapdragon 801 remain the more popular option for smartphones, while the 805 powers a new generation of tablets.

Follow Matt and Tom’s Hardware on Twitter

  • blackmagnum
    Trying not to be an Apple fanboy, but their A7 processor has supported 64-bit instructions since last year. They lead innovation because their clientele has a more open-ended budget for the device than Android users (can't remember the link to the study).
    Did I read that right? There won't be ANY 64-bit Android phones until 2015? It's going to take practically TWO YEARS to play catch-up to the iPhone? Mind you, most iPhone users don't know the difference between 32-bit and 64-bit anyway, but it's "better," and that's their logic for upgrading. Preying on stupid people has basically become (or always has been?) Apple's business model, and it's paid off. Sometimes, I wish I wasn't anti-Apple. C'mon, Android... get with the program!
  • rantoc
    Just find it tragically funny that more and more phone displays are at almost the same resolution as many general desktop PCs'.

    Also find it funny that their marketing team dares to call this "Ultra HD". Would be fun to see this running a benchmark at 4K resolution with any decent 3D detail settings =P

    "It actually approaches what a fairly modern desktop CPU's integrated memory controller can do. All of this extra memory bandwidth isn’t for the CPU, though. It's reserved for Qualcomm’s new Adreno 420 GPU."

    Yeah, it's mostly for the GPU, but a modern PC GPU alone pushes well over 300 GB/s. Close, no? =P
  • Memnarchon
    Damn! And I was hoping to see the K1 in these benchmarks too, for comparison. Oh well...
  • acasel
    NO Tegra K1 benchmarks here; it'll make the Snapdragon 805 look like a two-year-old CPU... lol
  • ta152h
    Is it too difficult for you guys to write a consistently good and accurate article? It's like you do the hard stuff, and then screw up details.

    For example, why do some charts run from 0 to somewhere above the max score, while others start at, for example, 2300 and go to 3000?

    I realize you guys aren't really computer people from this terrible lack of attention to detail (which someone who does more than write about computers has to have as a personality flaw), but can't you hire someone who can look over this stuff, and at least try to present it in a consistent way? Writers who aren't computer people make these types of mistakes, because their minds aren't ordered enough, but you guys really need someone like that, because all the articles suffer from imprecision and lack of clarity (and overuse of words like 'alacrity', which really implies emotion, and isn't a true synonym for speed. Again, precision...)

    For example, I'm looking at charts, and am shocked by some, then realize it's just because you guys screwed up the scaling, and can't stay consistent.

    Don't worry, a chart that shows very little difference because you used the full scale isn't bad. Because, if you really think about it, neither is the performance, and if you can't see a big difference in the chart, you aren't going to see a big difference in the performance. But, when you see one bar over three times longer than another, and the real difference is less than 20%, don't you think that gives the wrong impression?

    If you do all the hard work and then screw up details, it's just not as good as it could be. And yes, I've learned these sites like to say things in a way that is correct, but then present them in a manner that gives the opposite impression. Try writing without bias, and maybe this will go away. Charts are one way; comparing Kabinis with Haswells is another. Commenting more on charts that read the way you want, while just presenting the charts whose results you don't like, is another way. Or commenting on the part of a chart you like while ignoring the part you don't. It's not as subtle as you might think, or maybe it is, and you don't even realize your bias. But we do.

    I used to love this site, especially when Thomas Pabst used to write in his crazy way. But it's slowly and inexorably getting worse. There are better sites now. Maybe skip the really bad car reviews (do you really think your opinions even approximate professional sites like Car and Driver? At all?) and focus more on producing better-quality computer articles. It should be easy; you guys get a lot of good information and often do reviews that people want but other sites skip, but then you screw it up with a lack of attention to detail and consistency.
  • esrever
    I find the inconsistency makes most of this completely pointless until the software actually gets optimized and the drivers start working.
  • hannibal
    Well, I really expect a new article in the near future pitting the Tegra K1 and the 805 against each other. And then the 810. It will be interesting to see what 64-bit computing brings to mobile platforms... Mobile gaming is getting quite serious over the next few years!
  • edlivian
    You can compare the K1 benchmarks at AnandTech.
  • irish_adam
    I don't see why you're all so impressed with 64-bit. I mean, if you believe the A7 is super amazing because it's 64-bit, then you're an idiot. The fact that it's 64-bit adds minimal performance; it's a 100% gimmick, just so they can claim to be first.

    It reminds me of when AMD released their first 64-bit chips and Microsoft released XP 64-bit: you soon realized that unless you had 4 GB or more of RAM, there was no difference (well, except that none of your hardware drivers would work, grrrr).
    Why would Qualcomm rush out a 64-bit chip when there isn't any real improvement to be had? Surely it's better that they focus on things that will actually improve performance and battery life? They haven't even finished a 64-bit version of Android yet, so what would it even run on? Until we see the need for more than 4 GB of RAM on phones, I really don't see the point.