Google today announced that it designed a custom system on a chip (SoC) to improve the machine learning and artificial intelligence performance of its next smartphones. The new chip, which the company dubbed Tensor, is set to debut in the Pixel 6 and Pixel 6 Pro "later this fall" alongside the Android 12 operating system.
The company uses the Tensor identifier for many of its ML- and AI-related projects. Its open source machine learning platform is called TensorFlow. The application-specific integrated circuits (ASICs) on which Google Cloud relies are called Tensor Processing Units. And now it's built a custom Tensor SoC for its own smartphones.
Unfortunately, the announcement was light on details: Google didn't offer any information about the Tensor SoC's tech specs or provide a specific release date for the Pixel 6 lineup. But it did offer some insight into what it set out to accomplish with its first mobile SoC and what that could mean for Android users once it debuts.
"AI is the future of our innovation work," Google said, "but the problem is we’ve run into computing limitations that prevented us from fully pursuing our mission. So we set about building a technology platform built for mobile that enabled us to bring our most innovative AI and machine learning (ML) to our Pixel users."
The company said it "thought about every piece of the chip and customized it to run Google's computational photography models" so it could deliver "entirely new features, plus improvements to existing ones," in the latest phones. It also added a security core to complement a next-gen Titan M chip debuting with the Pixel 6 line.
"You’ll see this in everything from the completely revamped camera system to speech recognition and much more," Google said. "So whether you're trying to capture that family photo when your kids won’t stand still, or communicate with a relative in another language, Pixel will be there — and it will be more helpful than ever."
If this sounds familiar, it might be because Apple often touts the ML and AI performance of its custom silicon as well. Its most recent processors, from the A14 Bionic found in the iPhone 12 line to the M1 chip at the core of its newest Macs and iPads, all feature a Neural Engine that serves the same general purpose as the Tensor SoC.
It seems the companies agree on at least one thing: ML and AI have become increasingly important to the day-to-day experience of interacting with smartphones, PCs, and web-based services. We'll see how their responses to this shift compare when the Pixel 6 and iPhone 13 lineups make their debuts sometime this fall.