Microsoft recently showed off a muscle-controlled interface that it was developing with the help of engineers at the University of Washington and the University of Toronto. Now the company has teamed up with a Carnegie Mellon student to develop Skinput: a technology that turns your skin into an input device.
Skinput is described as a bio-acoustic sensing technique that allows the body to be used as an input surface. Redmond and Carnegie Mellon developed a special armband, worn on the user's bicep, that senses impact or pressure on the skin and measures the acoustic signals that impact creates. Variations in bone density, size, and mass, as well as the different acoustics produced by soft tissues and joints, make different locations on the body acoustically distinct. Software listens for impacts on the skin and classifies each one.
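The classification step described above can be sketched in miniature. The toy example below is not Microsoft's implementation (the actual system trains a more sophisticated machine-learning classifier on bio-acoustic features); it simply illustrates the idea of mapping an acoustic feature vector to a tap location with a nearest-neighbor rule. The feature values and location names are invented for illustration.

```python
# Toy illustration of tap-location classification (hypothetical data,
# not the real Skinput pipeline): a 1-nearest-neighbor rule over
# invented acoustic feature vectors.
import math

# Hypothetical training examples: (feature vector, tap location).
# Each vector could stand for, say, band-filtered amplitude averages.
training = [
    ([0.9, 0.2, 0.1], "forearm"),
    ([0.3, 0.8, 0.2], "wrist"),
    ([0.1, 0.3, 0.9], "finger"),
]

def classify(features):
    """Return the label of the closest training example (1-NN)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training, key=lambda ex: dist(ex[0], features))[1]

print(classify([0.85, 0.25, 0.15]))  # nearest to the "forearm" example
```

Because each body location produces a distinct acoustic signature, even a simple distance-based rule like this can separate a handful of tap sites; the real challenge is extracting features that stay distinct across users and arm positions.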
The video below demonstrates several ways to use your skin as a controller: playing Tetris with your hands as a control pad, or controlling your iPod with taps on your fingers. The engineers also incorporated a pico projector into the armband, meaning virtual 'buttons' can be beamed onto the user's forearm or hands.
Check out the video to see the technology in action.
[Update] Thanks to Carnegie Mellon's Chris Harrison for pointing us in the direction of a clearer video. A third-year Ph.D. student in the Human-Computer Interaction Institute at CMU, Chris worked with Desney Tan and Dan Morris (both of Microsoft Research) on Skinput. His research focuses on novel interaction techniques and input technologies that allow for better control of smaller devices. For more on Skinput, and Harrison's other projects, check out his website.
*Image via Chris Harrison @ chrisharrison.net