The First Frontier for Quantum: Data Center Accelerators

Quantum Computing (Image credit: Shutterstock)

Even as quantum computing develops at an increasingly fast pace, the technology is still far from mainstream adoption. Several factors account for that: the complexity of the physics and engineering involved, cost, and the relative immaturity of current implementations. Meanwhile, certain computing environments have long carried the torch for the most complex of so-called classical systems: High-Performance Computing (HPC), the domain of the world's data centers and supercomputers. There too, it seems, lies the first frontier for quantum.

The Pawsey Supercomputing Research Centre in Australia has claimed the world's first installation of a Quantum Processing Unit (QPU) in an HPC-first environment. Built around Quantum Brilliance's diamond-based qubits, the partnership aims to supercharge the pairing of quantum and classical systems through a hybrid research environment. The integration was made possible by the fact that Quantum Brilliance's QPU can operate at room temperature - something that other qubit types, such as IBM's superconducting transmon qubits, cannot.

In Munich, Germany, the Leibniz Supercomputing Centre already runs a quantum computing hub focused on creating the algorithms and tools that can bridge the quantum and classical realms via its Future Computing initiative. The hub is currently integrating one of the AI accelerator world's darlings, Cerebras' Wafer Scale Engine-based CS-2. Elsewhere, the UK government has also recently dipped its institutional toes into the world of quantum, acquiring a photonics-based quantum computing system from Orca Computing.

Another AI-forward chip designer, Ampere, has also entered into an HPC-integration partnership with Rigetti, which produces superconducting-qubit-based QPUs.

The hypersensitivity of quantum computers to their surroundings also means that most quantum processing offerings available today are accessible only through a cloud-enabled environment. This allows quantum systems to remain physically located in their designers' special-purpose installations while still permitting remote access. QPUs such as Xanadu's record-breaking Borealis are made available through the company's cloud environment. The same holds true for IBM's Qiskit and Nvidia's software-based cuQuantum quantum simulation platform. These stand as examples of the cloud-accessible quantum computing resources available to researchers worldwide today - the only requirement being an active internet connection.
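At bottom, quantum simulators of this kind track a vector of complex amplitudes and apply gates as linear transformations on it. As a rough, vendor-agnostic illustration (this is plain Python, not Qiskit's or cuQuantum's actual API), the following sketch prepares a two-qubit Bell state the way a statevector simulator would:

```python
import math

def apply_h(state, q):
    """Apply a Hadamard gate to qubit q of a statevector."""
    s = 1 / math.sqrt(2)
    mask = 1 << q
    out = state[:]
    for i in range(len(state)):
        if i & mask == 0:
            a, b = state[i], state[i | mask]
            out[i] = s * (a + b)          # |0> component
            out[i | mask] = s * (a - b)   # |1> component
    return out

def apply_cnot(state, control, target):
    """Flip the target qubit wherever the control qubit is 1."""
    cm, tm = 1 << control, 1 << target
    return [state[i ^ tm] if i & cm else state[i]
            for i in range(len(state))]

# Start in |00>, apply H to qubit 0, then CNOT(0 -> 1):
# the result is the Bell state (|00> + |11>) / sqrt(2).
state = [1 + 0j, 0j, 0j, 0j]
state = apply_h(state, 0)
state = apply_cnot(state, 0, 1)
probs = [abs(a) ** 2 for a in state]
print(probs)  # ~[0.5, 0.0, 0.0, 0.5]
```

Real platforms perform the same algebra at vastly larger scale (cuQuantum on GPUs, for instance), which is why simulating more than a few dozen qubits quickly exhausts classical hardware.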

Amazon, which offers its own cloud-based supercomputing services, has also extended its offering into the quantum computing realm by partnering with a number of quantum-forward companies. Amazon Braket, for example, gives customers cloud access to several quantum topologies: quantum annealing systems from D-Wave, ion-trap quantum processors from IonQ, and superconducting qubit systems from Rigetti.

Philippe Notton, CEO of SiPearl, envisions the future of QPUs as co-processors alongside the CPUs and GPU accelerators of classical computing. The France-based company is one of the leading chipmakers for European exascale systems and is currently developing its Arm-based Rhea CPUs for integration as early as 2023. According to Notton, classical systems will be an indispensable part of quantum, serving as mediators for quantum accelerators.

It will take long development cycles before mainstream quantum computing solutions are available off the shelf - and some may never be. Until then, HPC centers' secure, leading-edge infrastructure, cooling, and power-delivery designs stand as essential elements in enabling and democratizing access to quantum computing.

Francisco Pires
Freelance News Writer

Francisco Pires is a freelance news writer for Tom's Hardware with a soft side for quantum computing.