Google Unveils 4th-Gen TPU Chips for Faster Machine Learning

At its I/O conference tomorrow, Google will unveil a preview of Google Cloud's latest machine-learning clusters, which not only aim for nine exaflops of peak performance but do so using 90% carbon-free energy. Google says it will be the world's largest publicly available machine-learning hub.

At the heart of the new clusters is the TPU v4 Pod. These tensor processing units were announced at Google I/O last year, and AI teams at the likes of Meta, LG, and Salesforce have already had access to the pods. The v4 TPUs let researchers use the framework of their choice, whether TensorFlow, JAX, or PyTorch, and have already enabled breakthroughs at Google Research in areas such as language understanding, computer vision, and speech recognition.
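The framework flexibility comes from XLA, which compiles the same program for whatever accelerator is attached. As a rough illustration (not Google's code, and the shapes here are arbitrary), a JAX program written on a laptop runs unchanged on a TPU pod slice:

```python
# Hedged sketch: the same jit-compiled JAX function runs on CPU, GPU, or TPU.
# On a Cloud TPU VM, jax.devices() reports TPU devices; elsewhere JAX falls
# back to whatever backend is available, with no code changes required.
import jax
import jax.numpy as jnp

@jax.jit  # compiled via XLA for the detected accelerator
def predict(w, x):
    return jnp.dot(x, w)

w = jnp.ones((4, 2))
x = jnp.ones((3, 4))
out = predict(w, x)

print(out.shape)                   # (3, 2)
print(jax.devices()[0].platform)   # 'tpu' on a TPU VM; 'cpu' or 'gpu' elsewhere
```

The same portability story applies to TensorFlow and PyTorch (via PyTorch/XLA), which is why teams can bring existing codebases to the pods.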

The clusters are based in Google’s Oklahoma data center, and their workloads are expected to be similar, chewing through data in fields such as natural language processing, computer vision, and recommendation systems.

Tensor Processing Units in a Google data center (Image credit: Google)
Ian Evenden
Freelance News Writer

Ian Evenden is a UK-based news writer for Tom’s Hardware US. He’ll write about anything, but stories about Raspberry Pi and DIY robots seem to find their way to him.