In a development that must have Disney’s, if not Stark Industries’, lawyers on standby, British robotics guru and part-time Iron Man cosplayer James Bruton is building a prosthetic arm that uses machine learning and a Raspberry Pi Zero W to move on its own. What could possibly go wrong?
Bruton wasn’t keen on embedding electrodes in his own skull, so he instead looked to his other limbs as sources of input for the arm. He created a wearable motion capture suit to gather the data, outfitting it with various microcontrollers, including a Teensy 4.1, and an Adafruit MPU-6050 inertial measurement unit (IMU). A similar conglomeration was mounted to a headband to keep track of the user’s head.
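The suit has to shuttle each sensor’s readings from its microcontroller to the Pi somehow. The sketch below shows one plausible way to do that: packing a single IMU sample (accelerometer and gyroscope axes plus a node ID) into a fixed-size binary frame. The frame layout, node IDs, and function names here are illustrative assumptions, not Bruton’s actual protocol.

```python
import struct

# Hypothetical frame layout (not Bruton's actual wire format):
# a 1-byte node ID followed by six little-endian float32 values,
# accel (x, y, z) then gyro (x, y, z).
FRAME_FMT = "<B6f"

def pack_frame(node_id, accel, gyro):
    """Serialize one IMU sample into a fixed-size binary frame."""
    return struct.pack(FRAME_FMT, node_id, *accel, *gyro)

def unpack_frame(frame):
    """Recover (node_id, accel, gyro) from a binary frame."""
    vals = struct.unpack(FRAME_FMT, frame)
    return vals[0], vals[1:4], vals[4:7]

# Round-trip one sample from a hypothetical "left leg" node (ID 3),
# with the accelerometer reading roughly 1 g on the z axis.
frame = pack_frame(3, (0.1, 0.0, 9.8), (0.02, -0.01, 0.0))
node_id, accel, gyro = unpack_frame(frame)
```

A fixed 25-byte frame like this is easy to emit from a Teensy over serial and cheap to parse on a Pi Zero, which matters when several sensor nodes are streaming at once.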
By feeding data from the sensors to a Raspberry Pi Zero, Bruton was able to train with repetitive motions until the arm, which he designed and 3D printed, was able to correctly predict what it should do from incoming sensor data.
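The training loop described above boils down to mapping windows of sensor readings to arm commands. As a toy illustration of the idea, here is a nearest-centroid classifier over made-up two-axis leg features; the feature names, labels, and numbers are invented for the example and say nothing about the model Bruton actually trained.

```python
# Toy features: (left_leg_pitch, right_leg_pitch), averaged over a
# short window of IMU samples. Labels are arm commands.
def train(samples):
    """samples: list of (features, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for feats, label in samples:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, v in enumerate(feats):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def predict(centroids, feats):
    """Pick the command whose centroid is closest to the features."""
    def dist2(lab):
        return sum((a - b) ** 2 for a, b in zip(feats, centroids[lab]))
    return min(centroids, key=dist2)

# Invented training data mimicking the repetitive-motion sessions:
training = [
    ((0.9, 0.1), "raise_arm"),   # left leg lifted
    ((0.8, 0.0), "raise_arm"),
    ((0.1, 0.9), "lower_arm"),   # right leg lifted
    ((0.0, 0.8), "lower_arm"),
    ((0.1, 0.1), "hold"),        # both legs down
]
centroids = train(training)
command = predict(centroids, (0.85, 0.05))  # a fresh "left leg up" window
```

A model this small could comfortably run on a Pi Zero at sensor rate, which is presumably why lightweight learned mappings suit a wearable like this.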
Mounted on a backpack, the arm works well for simple tasks, such as raising when he lifts his left leg and lowering again when he raises his right leg. Future plans include additional sensors so more actions can be detected, and possibly electroencephalography (EEG) to sense brainwaves through a headband rather than sub-dermal probes, allowing the arm to read its user’s mind.
Anyone keen to follow in Bruton’s clanking footsteps can find his CAD files and code on GitHub.