Neural Chip Plays Doom Using a Thousandth of a Watt
BFG, tiny TDP.
Doom is the game that became a benchmark. From its humble beginnings on a 386 PC, it has been ported to run on everything, even the diminutive Raspberry Pi Pico. "Big deal," you say, but this story from IEEE Spectrum isn't about running Doom on ever lower-spec hardware. Instead, it shows how an ultra-low-power chip has learned to play Doom using just one milliwatt of power!
Let's quantify 1 milliwatt of power. It is 1/1000 of a watt, but even that low level of power consumption is hard to comprehend. Take, for example, Nvidia's RTX 4090, a card that can consume around 400-450 watts. That is roughly 400,000 times more power than Syntiant's NDP200 uses. Sure, the NDP200 won't make it onto our list of Best GPUs; it is more about making decisions from data based on its training. The Doom slaying is just for fun.
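To sanity-check that ratio, here is a quick back-of-the-envelope calculation using the figures quoted above (400 W for the GPU, 1 mW for the NDP200); the variable names are our own, not Syntiant's:

```python
# Rough power comparison from the figures in the article.
rtx_4090_watts = 400    # lower end of a typical RTX 4090 power draw
ndp200_watts = 0.001    # 1 milliwatt = 1/1000 of a watt

ratio = rtx_4090_watts / ndp200_watts
print(f"The RTX 4090 draws roughly {ratio:,.0f}x the power of the NDP200")
```

Using the 450 W upper bound instead gives a factor of 450,000, so "roughly 400,000 times" is in the right ballpark either way.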
Syntiant's NDP200 (Neural Decision Processor) is an ultra-low power chip for neural networks. It is primarily used to monitor video and audio to trigger events that other systems will react to. The NDP200 can run at up to 100 MHz and even has 26 GPIO pins, just like the original Raspberry Pi.
Syntiant trained the NDP200's neural network using VizDoom, a version of Doom used for AI research and reinforcement learning from raw visual information. Training required understanding what the neural network was seeing, primarily identifying the enemy, and ultimately defining a response. In this case: see demon, shoot demon. The "player" is tasked with defending a circular room that is under constant attack. The neural network had to learn how to play Doom, which also meant learning how to conserve ammunition. The network consists of approximately 600,000 parameters, all of which were squeezed into the NDP200's 640Kb of RAM, with the neural core sustaining close to 9 gigabytes per second of bandwidth.
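How does a 600,000-parameter network fit in 640K of on-chip RAM? A minimal sketch of the arithmetic, assuming the weights are stored as 8-bit integers (one byte per parameter) — a common choice for chips in this class, though the article doesn't confirm the datatype:

```python
# Memory-footprint estimate (assumption: int8 weights, 1 byte each).
params = 600_000            # approximate parameter count from the article
bytes_per_param = 1         # 8-bit quantization assumption, not confirmed
ram_bytes = 640 * 1024      # 640K of on-chip RAM

model_bytes = params * bytes_per_param
print(f"Model: {model_bytes / 1024:.0f} KB of {ram_bytes / 1024:.0f} KB RAM")
assert model_bytes <= ram_bytes  # fits, with roughly 54 KB to spare
```

At 16-bit weights the model would need about 1.17 MB and would not fit, which is why a sub-byte or 8-bit representation is the plausible reading.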
The purpose of the demo is not to show how well the chip can play Doom, but to demonstrate how efficiently the NDP200 handles "bounding-box person detection," a task that would normally require a much more powerful processor. Using just 1 milliwatt of power to scan six frames of video for this task, the NDP200 could easily be integrated into vehicle and home security systems. Syntiant claims this is 1/100 of the power that an Arm Cortex-A53, the same Arm chip that powers the Raspberry Pi 3, would need.
For now, the NDP200's AI is limiting its carnage to demons. We just hope that it doesn't start talking to Bing's chatbot.
Les Pounder is an associate editor at Tom's Hardware. He is a creative technologist and for seven years has created projects to educate and inspire minds both young and old. He has worked with the Raspberry Pi Foundation to write and deliver their teacher training program "Picademy".
PlaneInTheSky
It's not surprising that it uses just 1 milliwatt; it's a 5-line algorithm with 3 inputs. It turns left and right and randomly fires until it hits something. It has no sense of 3D space.
Endymio
PlaneInTheSky said:
It's not surprising that it uses just 1 milliwatt; it's a 5-line algorithm with 3 inputs. It turns left and right and randomly fires until it hits something.
Did you not read the article? It's not doing that at all. It had to learn to scan a video frame, detect a "demon" within the frame, and kill it, while conserving ammunition in the process: a neural network of approximately 600,000 parameters.
Low-power chips like this are the future of computing. Once you get into the sub-milliwatt range, you can harvest ambient energy from the environment, allowing IoT devices to operate without batteries or wires. The home of the far future might have hundreds of thousands of such devices in it, some small and light enough to literally float in the air.
Mandark
Ugh. I want the home of the past. No gizmos, pls.
The more they push AI, the more I think of Battlestar Galactica; we're gonna need to revert to the old ways if we're not careful.
I don't want anything in my house having Internet capabilities unless I choose what it is, like my modem, PC, and 📱
Endymio
Mandark said:
Ugh. I want the home of the past. No gizmos, pls. The more they push AI, the more I think of Battlestar Galactica; we're gonna need to revert to the old ways if we're not careful. I don't want anything in my house having Internet capabilities unless I choose what it is, like my modem, PC, and 📱
You should avoid those horseless carriages also. They can kill people outright!
USAFRet
Endymio said:
The home of the far future might have hundreds of thousands of such devices in it, some small and light enough to literally float in the air.
The devices are one thing. How the manufacturers and hosts monetize them, and you, is a whole other thing.
A BIG pile of 1980s electronics has been replaced by the little cellphone in your pocket, with a LOT of monetization and tracking to go along with it.
bit_user
Endymio said:
Did you not read the article? It's not doing that at all.
He doesn't care. His brand of trolling is to dismiss everything as derivative, unremarkable, or downright bad. He talks like he's been there and done that, but he's obviously never attempted anything like this. If he had, he might actually appreciate some of the challenges.
Endymio said:
a neural network of approximately 600,000 parameters.
Which is actually quite small, for object detection.
Endymio said:
Low-power chips like this are the future of computing. Once you get into the sub-milliwatt range, you can harvest ambient energy from the environment, allowing IoT devices to operate without batteries or wires.
Yeah, the key point is that it's low-power enough to embed object detectors in everyday electronics. That's a game-changer, since it means you could potentially have something like a doggie door that unlocks only for your dog and not raccoons, squirrels, or even a neighbor's nosy cat that tries to enter. Just to give one example.
bit_user
all of which were squeezed into the NDP200's 640Kb of RAM
😬 ...struggling not to make a 640K joke. (And was that KB, or really Kb?)
Anyway, the article is somewhat annoying in that it makes bizarre comparisons with high-end GPUs. The better approach would be to compare it with other embedded processors and neural IP blocks.
Also, it could've used a few details about the chip. I'm sure it's holding the entire model on-die (probably in SRAM); that power budget leaves no headroom for external DRAM. It's also almost certainly using integer or fixed-point datatypes, and probably has a very limited amount of programmability.