Qualcomm Snapdragon 820 Deep Learning SDK Nudges Local Neural Nets Closer To Mainstream
Qualcomm announced a deep learning software development kit (SDK) for the company’s Snapdragon 820 SoC. Called the Snapdragon Neural Processing Engine, the SDK is powered by Qualcomm’s Zeroth Machine Intelligence Platform, with optimizations designed to take advantage of the heterogeneous compute capabilities of the Snapdragon SoC. The Zeroth platform drives deep learning software tailored for mobile devices; among other things, it may be able to block malware on mobile devices more effectively.
Qualcomm said that the Snapdragon Neural Processing Engine SDK will give developers a “powerful, energy efficient platform” that will help them create mobile neural network tools that can be used for scene detection, text recognition, natural language processing and more.
The idea is that OEMs can run their own neural network applications locally, on-device, without having to communicate with the cloud. If that sounds strikingly similar to the work Movidius is doing with its vision processing, that's because it is. In fact, Movidius just announced its Fathom Neural Compute Stick, which lets a trained neural network run from a USB flash drive-like stick.
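What "locally, on-device" means in practice is simple: once a network has been trained, inference is just stored weights plus arithmetic, all of which can live in the phone's own memory. The sketch below is not the Snapdragon Neural Processing Engine API (the SDK had not yet shipped at the time of writing); it is only a minimal NumPy illustration, with made-up layer sizes and random stand-in weights, of a forward pass that involves no network traffic at all.

```python
# Minimal sketch of on-device inference: a tiny feedforward network's
# forward pass runs entirely in local memory -- no cloud round trip.
# This is NOT the Snapdragon Neural Processing Engine API; the layer
# sizes and weights here are made up purely for illustration.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Pretend these weights were trained offline and shipped with the app.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((128, 32)), np.zeros(32)   # input -> hidden
W2, b2 = rng.standard_normal((32, 10)), np.zeros(10)    # hidden -> 10 classes

def classify(features):
    """Run one forward pass locally and return class probabilities."""
    hidden = relu(features @ W1 + b1)
    return softmax(hidden @ W2 + b2)

scores = classify(rng.standard_normal(128))  # e.g. a 128-dim scene descriptor
print("predicted class:", int(scores.argmax()))
```

On a phone, the SDK's job would be to run the same kind of matrix math efficiently across the Snapdragon 820's heterogeneous compute blocks rather than in plain Python.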
Deep learning is exciting business, and the likes of Nvidia are certainly buying in. The prospect of machines understanding what they see will help usher in the era of artificial intelligence-based computing that has been building with the rise of Siri, Cortana and Google Now, as well as Microsoft's recently announced bot-focused "Conversations as a Platform" concept.
The Snapdragon Neural Processing Engine SDK will support any device with a Snapdragon 820 SoC, which includes flagship smartphones such as the Samsung Galaxy S7, the Galaxy S7 edge and the LG G5. Other compatible devices include security cameras, drones and even some automobiles.
The Snapdragon Neural Processing Engine SDK should be available in the second half of this year.
Kevin Carbotte is a contributing writer for Tom's Hardware who primarily covers VR and AR hardware. He has been writing for us for more than four years.
coolitic (quoting LordConrad: "Deep Learning and Neural Nets? Seriously? Didn't anyone watch 'The Terminator'?"): You obviously don't know what neural nets are. https://en.wikipedia.org/wiki/Artificial_neural_network
Skip27: I was under the impression that one needed some serious hardware to effectively implement such systems. I can understand Nvidia GPUs being up to the task, but not my mobile phone.
adgjlsfhk: Part of the reason such heavy GPUs are used when mobile chips can do similar things is versatility. Much of the speed gained in these neural net chips comes from using half precision (16 bits). While that works really well for neural nets (the inaccuracy from small samples outweighs any floating-point error), it isn't very useful for graphics. Basically, you are throwing very specialized hardware at a task to make it much more efficient. As a side note, these are probably meant to start with a base model and incorporate new data in real time, which means the device never builds a full model itself.
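To put that half-precision point in concrete terms, here is a small NumPy sketch (arbitrary layer shape, random data, nothing Snapdragon-specific) that runs the same fully connected layer in float32 and in float16 and prints the worst-case relative gap between the two results:

```python
# Compare a fully connected layer computed in float32 vs. float16.
# Shapes and values are arbitrary illustrations, not a benchmark of
# any particular mobile or desktop hardware.
import numpy as np

rng = np.random.default_rng(42)
x = rng.standard_normal((1, 512)).astype(np.float32)              # one input vector
W = (rng.standard_normal((512, 256)) * 0.05).astype(np.float32)   # layer weights

out32 = x @ W                                                      # full-precision reference
out16 = (x.astype(np.float16) @ W.astype(np.float16)).astype(np.float32)

rel_err = np.abs(out32 - out16).max() / np.abs(out32).max()
print(f"worst-case relative difference from float16: {rel_err:.4%}")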
XaveT: I'm far more excited about not needing the cloud and "hopefully ambivalent" services for voice recognition/natural language features. If that can be done on-device... well, a lot of voice-activated tech sounds much more private and appealing to me.
LordConrad (replying to coolitic's "You obviously don't know what neural nets are" and the linked Wikipedia article): Yes I do, and that article was exactly what I was referring to.
darth_adversor (quoting the same exchange): Hilarious. I read the Wikipedia article, and it sounds eerily similar to Skynet. You obviously have never seen Terminator or Battlestar Galactica. I'm with LordConrad. This stuff scares me a little.
XaveT: The prospect of decoupling natural language interaction from the cloud by itself is extremely exciting to me... I'd much rather not have to depend on network signal, low latency, and "hopefully benevolent" (not harvesting personal data) cloud services for something that can be done on-device.