China builds brain-mimicking AI server the size of a mini-fridge, claims 90% power reduction — BI Explorer 1 packs in 1,152 CPU cores and 4.8TB of memory, runs on a household power outlet
Chinese scientists have unveiled what they call the "world's first brain-inspired computing entity," an AI "supercomputer" called the BIE-1, modeled on the workings of the human brain. While the claim to be the world's first isn't entirely accurate, the BIE-1 still offers exciting neuromorphic performance numbers: it reportedly features 1,152 CPU cores and 4.8 TB of DDR5 memory, and is claimed to consume 90% less power than standard AI datacenter servers.
As reported by the South China Morning Post, the BI Explorer 1 was revealed by the Guangdong Institute of Intelligent Science and Technology (GDIIST) at a presentation in Macao. Presenters likened the server's size to that of a mini fridge, an apt comparison in more ways than one, as the machine reportedly never exceeds a 70°C CPU temperature at maximum load.
The field of brain-like computing is not new, despite what the BIE-1 hype would have you believe. Intel's Hala Point is perhaps the best-known neuromorphic system, powered by 1,152 Loihi 2 processors, each capable of simulating one million neurons. SpiNNaker 2 is another machine cut from the same mold, designed to run brain-inspired AI models and store its data without any SSDs, HDDs, or GPUs.
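For readers new to the concept, neuromorphic hardware is typically built around event-driven, spiking neurons rather than the dense matrix math that GPUs chew through. The snippet below is a minimal, purely illustrative sketch of a textbook leaky integrate-and-fire neuron in Python; it is not GDIIST's or Intel's actual implementation, and the parameter values are arbitrary.

```python
# Purely illustrative sketch of the kind of computation neuromorphic hardware
# is built around: a leaky integrate-and-fire (LIF) spiking neuron. This is a
# generic textbook model, not GDIIST's or Intel's actual design.

def lif_neuron(input_currents, threshold=1.0, leak=0.9):
    """Accumulate input over time and emit a spike (1) when the membrane
    potential crosses the threshold, then reset."""
    potential = 0.0
    spikes = []
    for current in input_currents:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire
            potential = 0.0    # reset after spiking
        else:
            spikes.append(0)
    return spikes

# Example: a steady small input eventually drives the neuron to spike.
print(lif_neuron([0.3] * 10))
# -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Because information travels as sparse spikes rather than continuous activations, such systems can sit idle (and draw little power) whenever nothing is firing, which is where the big efficiency claims for brain-inspired hardware generally come from.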
Where the BIE-1 breaks from the pack of today's brain-inspired computers is in its size and its performance claims. While Hala Point and SpiNNaker 2 are traditional server-sized hardware, each occupying multiple rack units for a full deployment, the BIE-1 is a standalone machine that runs off a standard wall outlet and is claimed to draw 90% less power than "a room-sized supercomputer" while offering the same performance. Of course, "supercomputer" is such a loosely defined term that this claim is difficult to quantify.
The BIE-1 "has low power consumption and low noise, and can be called a miniaturised supercomputer, making high-end intelligent computing capabilities within reach," said GDIIST on its website. "It can be easily deployed in homes, small offices and even mobile environments."
The BIE-1 contains 1,152 CPU cores (matching the 1,152 Loihi 2 processors in Intel's Hala Point), 4.8 TB of DDR5 memory, and 204 TB of storage. Both the CPU hardware and the software are independently developed, unique systems: the CPU cores are designed to imitate brain neurons, while the software acts as an AI neural network, performing standard AI training and inference tasks in a decidedly non-standard way.
Researchers claim the machine reaches training speeds of 100,000 tokens per second and inference speeds of 500,000 tokens per second. To put these huge numbers into context, Nvidia's flagship Blackwell GB200 NVL72 AI server advertises inference speeds of 1.5 million tokens per second.
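For a quick sanity check on how those claims stack up, here is a trivial back-of-the-envelope calculation in Python using only the figures quoted above (the variable names are ours); as a commenter notes below, token rates depend heavily on the specific model being run, so treat the comparison as rough at best.

```python
# Rough comparison of the claimed/advertised inference throughput figures.
# Caveat: token rates depend heavily on the model (layer count, sizes, types),
# so identical workloads are not guaranteed; this is only a ballpark check.

BIE1_CLAIMED_INFERENCE_TPS = 500_000       # GDIIST's claimed figure
GB200_NVL72_ADVERTISED_TPS = 1_500_000     # Nvidia's advertised figure

ratio = BIE1_CLAIMED_INFERENCE_TPS / GB200_NVL72_ADVERTISED_TPS
print(f"BIE-1's claimed inference rate is {ratio:.0%} of the GB200 NVL72's")
# Prints: BIE-1's claimed inference rate is 33% of the GB200 NVL72's
```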
Sold as a single unit rather than a rack-mounted system or part of a larger datacenter ecosystem, the BIE-1 could go a long way toward making neuromorphic computing more accessible to new engineers. GDIIST hopes the BIE-1 will be used for some of the more "important" AI use cases, such as medical research.
Pricing and availability of the BIE-1 are currently unknown, not least because few firsthand sources for the device exist on the Western internet, but any cutting-edge machine packing over 200 TB of high-speed storage is sure to cost an arm and a leg. If the BIE-1 lives up to even half of its claims, it will be a major win for neuromorphic computing and for Chinese AI hardware.

Sunny Grimm is a contributing writer for Tom's Hardware. He has been building and breaking computers since 2017, serving as the resident youngster at Tom's. From APUs to RGB, Sunny has a handle on all the latest tech news.
Comments

jp7189: What's so special with the number 1152 that both this machine and the Intel machine before it have the exact same number of cores?

bit_user (quoting the article's token-rate comparison): Thanks, but the token rate is very specific to the model. Specifically, its makeup in terms of the number of layers, their sizes, and types. That determines the amounts and types of computations needed to process a single token.

bit_user (replying to jp7189): Either coincidence, or maybe the Chinese team set a goal to equal or exceed Intel's number. I wouldn't read too much into it, though. It's very superficial and really doesn't tell us a whole lot.