Researchers build world’s first “microwave brain” chip that can think like AI and talk like a radio — all at gigahertz speeds

The low-power microchip, which researchers call a “microwave brain,” is the first processor to compute on both ultrafast data signals and wireless communication signals by harnessing the physics of microwaves.
(Image credit: Cornell University)

In a quiet Cornell University lab, researchers have taken a hammer to decades of digital circuit convention. The result is a silicon chip that thinks less like a clock-driven processor and more like a living brain—only instead of neurons, it uses controlled bursts of microwave energy.

Dubbed the “microwave brain,” this experimental processor can juggle two jobs at once: crunching ultrafast data streams and talking wirelessly, all inside a footprint small enough for a smartwatch. And it does so while drawing just 200 milliwatts, a fraction of the power a comparable digital neural network would consume.

How it works

The secret lies in abandoning the step-by-step digital approach. Traditional chips march data through binary logic gates in sync with a clock. Cornell’s design instead pushes information through tunable microwave waveguides, letting patterns emerge and be recognized in real time at tens of gigahertz with no waiting or bottlenecks.

Each waveguide acts like a “physical neuron,” where the microwave signal’s amplitude, phase, and frequency can be shaped to represent data features. These features interact and interfere with each other in the analog domain, producing a rich set of patterns before the signal is ever digitized. This physical mixing and propagation essentially perform the feature extraction and transformation that digital networks usually achieve through multiple software layers.
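
To make that idea concrete, here is a minimal, purely illustrative sketch in Python, not the Cornell design itself: a handful of "paths" each apply their own amplitude, phase, and small frequency shift to an input waveform, pairs of paths interfere, and the detected power of each mixture becomes a feature that a simple readout could classify. The path count, parameter ranges, and the function name `analog_features` are all assumptions made for the example.

```python
# Toy sketch of analog feature extraction by interference (illustrative only).
# Each "physical neuron" is a path with fixed amplitude, phase, and frequency
# shaping; pairs of paths interfere, and the mean detected power of each
# mixture is one feature.
import numpy as np

rng = np.random.default_rng(0)
N_PATHS = 8
GAINS  = rng.uniform(0.5, 1.5, N_PATHS)        # amplitude shaping per path
PHASES = rng.uniform(0.0, 2 * np.pi, N_PATHS)  # phase shaping per path
SHIFTS = rng.uniform(-50e6, 50e6, N_PATHS)     # small frequency offsets (Hz)

def analog_features(signal: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Return one feature (mean detected power) per pair of interfering paths."""
    shaped = [g * signal * np.exp(1j * (2 * np.pi * f * t + p))
              for g, p, f in zip(GAINS, PHASES, SHIFTS)]
    feats = [np.mean(np.abs(shaped[i] + shaped[j]) ** 2)   # interference + detection
             for i in range(N_PATHS) for j in range(i + 1, N_PATHS)]
    return np.asarray(feats)

# Example: a clean 1 GHz tone and a noisy copy land in different places in
# feature space -- the kind of separation a simple linear readout can learn.
t = np.linspace(0.0, 100e-9, 4000)
clean = np.exp(1j * 2 * np.pi * 1e9 * t)
noisy = clean + 0.5 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))
print(analog_features(clean, t)[:3])
print(analog_features(noisy, t)[:3])
```

The point of the sketch is that the "layers" are physical: the shaping and interference happen as the wave propagates, and only the lightweight readout needs to run in the digital domain.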

A scientist examining a brain's neural pathways and fiber tracts, akin to those that inspire physical neural models in computing research. (Image credit: Getty Images / Westend61)

The chip’s design builds a type of AI framework directly into the hardware, using the natural behavior of microwaves to process incoming data streams. Instead of storing values in memory and repeatedly performing huge numbers of calculations, it lets the microwave network itself handle the heavy lifting. Small adjustable components—such as electronic tuners and signal shifters—can change the pathways inside the chip on the fly, allowing it to switch between different AI tasks without having to retrain from scratch.
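
A rough way to picture that reconfigurability, under assumptions of my own (the mixing matrix, the task names, and the `run_chip` helper below are hypothetical, not from the paper): the fixed waveguide network does the heavy analog mixing, while a small set of tuner settings selects which task the same hardware performs.

```python
# Hypothetical sketch of on-the-fly reconfiguration: a fixed "waveguide
# network" does the heavy mixing, while a handful of tuner settings
# (phase shifters / attenuators) choose the task. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N_IN, N_OUT = 16, 16
FIXED_NETWORK = rng.standard_normal((N_OUT, N_IN)) + 1j * rng.standard_normal((N_OUT, N_IN))

# Each task is just a different, small set of per-output phase/gain settings.
TASK_TUNINGS = {
    "modulation_id": {"phase": rng.uniform(0, 2 * np.pi, N_OUT), "gain": rng.uniform(0.8, 1.2, N_OUT)},
    "anomaly_watch": {"phase": rng.uniform(0, 2 * np.pi, N_OUT), "gain": rng.uniform(0.8, 1.2, N_OUT)},
}

def run_chip(x: np.ndarray, task: str) -> np.ndarray:
    """Propagate an input through the fixed network, then apply the task's tuners."""
    tuning = TASK_TUNINGS[task]
    mixed = FIXED_NETWORK @ x                                      # heavy lifting: fixed analog mixing
    return tuning["gain"] * np.exp(1j * tuning["phase"]) * mixed   # cheap, adjustable part

x = rng.standard_normal(N_IN) + 1j * rng.standard_normal(N_IN)
print(np.round(np.abs(run_chip(x, "modulation_id"))[:4], 2))
print(np.round(np.abs(run_chip(x, "anomaly_watch"))[:4], 2))  # same input, different task response
```

Switching tasks here means loading a few tuner values rather than retraining or reloading a full model, which is the behavior the Cornell team describes.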

Why it matters

In tests, the chip classified wireless signals with 88% accuracy or better, matching the performance of far bulkier digital models. Crucially, that accuracy held steady across both simple and complex jobs, without the extra circuitry and error correction digital systems typically require.

Because the hardware is naturally sensitive to changes in signal behavior, its uses extend beyond AI computation. It could watch for anomalies in wireless traffic, track radar targets, or decode crowded radio channels. With further refinements, the team believes it could even sit inside personal devices, running local AI models without leaning on cloud servers.

The "microwave brain" is still in prototype form. Still, backed by DARPA and the National Science Foundation, the Cornell team is already working on scaling and integrating it into existing microwave and digital systems. If they succeed, the line between computing and communication hardware may soon blur—ushering in an era where your phone's processor is also its antenna, and your watch thinks without ever calling home.

Hassam Nasir
Contributing Writer

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.

  • acadia11
    The age of the thinking machine is nigh on the horizon. Positronic Brain here we come! Humanity you will be replaced! Not sure if it will be Cybertron or the Matrix … a better model is on the way.
    Reply
  • bit_user
    Okay, so yet another analog neural network chip, but with a twist.

    Since each neuron is a transmitter/receiver, I wonder how well this scales. Furthermore, what about EMI - both on the susceptibility front, and interference with other devices?
    Reply
  • chaz_music
    bit_user said:
    Okay, so yet another analog neural network chip, but with a twist.

    Since each neuron is a transmitter/receiver, I wonder how well this scales. Furthermore, what about EMI - both on the susceptibility front, and interference with other devices?

    Oh yes, EMI. Especially if the configuration creates upper harmonics. There goes the WiFi.
    Reply
  • acadia11
    chaz_music said:
    Oh yes, EMI. Especially if the configuration creates upper harmonics. There goes the WIFI
    Could you imagine if Data from Star Trek was stopped because of the Xfinity router in the living room! Now that’s comedy.
    Reply