When we last saw Elliptic Labs back in early 2015, we witnessed some impressive device gesture control using ultrasonic signals. This week at Mobile World Congress, we learned that the company is now technically shipping the product, and representatives said we should expect to see it in smartphones later this summer and into early fall.
Piecing together some clues that I heard in my discussions, it sounds very likely that the next iPhone (presumably the iPhone 7) will include the product.
A few reminders about how the technology works, snipped from Matt Humrick’s piece from CES 2015:
This touchless technique requires a minimum of three microphones and one transducer to detect the motion of a hand and its distance from the screen. The demo unit in the video uses four microphones, located in each front corner, and two transducers, located above and below the screen, to improve gesture recognition and precision. The signals from the sensors are fed to a DSP (digital signal processor) in the onboard SoC, which interprets the data without the need for any additional chips. Elliptic Labs provides the DSP driver and gesture software for OEMs to integrate, along with an SDK for developers.
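To get a feel for the physics at work, here is a minimal sketch (my own, not Elliptic Labs' code) of how an ultrasonic echo's round-trip time maps to distance, and how delay differences between corner microphones could hint at the hand's lateral position:

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C

def echo_distance_m(round_trip_s):
    """Distance to the reflecting hand, from the ultrasonic round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2

def hand_side(left_echo_s, right_echo_s, tol_s=50e-6):
    """Crude lateral cue: an earlier echo at one corner microphone
    suggests the hand is nearer that side of the screen."""
    if abs(left_echo_s - right_echo_s) < tol_s:
        return "center"
    return "left" if left_echo_s < right_echo_s else "right"

# A hand about 30 cm from the screen echoes back in roughly 1.75 ms:
print(round(echo_distance_m(0.00175), 2))  # 0.3
print(hand_side(0.00170, 0.00190))         # left
```

Even at the 15-foot (roughly 4.6 m) upper range mentioned below, the round trip takes only about 27 milliseconds, so the sensing loop can still update many times per second.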
A few other details and changes: First, Guenael Strutt, the company’s product development VP, said that gestures can be detected from as far as 15 feet away, depending on the implementation.
Also, while the company is looking at detailed hand tracking for the future, the current implementation is simpler. Most of the demonstrations I saw used “air” hand taps rather than swipes or more fine-grained controls. One simple but effective demonstration had the phone display only a picture or other wallpaper until a hand approached; the closer the hand got to the screen, the more clearly the normal app icons faded into view. Another showed a completely blank screen that revealed an alarm clock as a hand came toward the phone.
The technology isn’t just for smartphones and tablets. All sorts of gadgets — lamps, speakers — that have IoT (Internet of Things) hardware can use the software to enable gesture controls. “If the DSP can accept C code, we can do it,” Strutt said.
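On devices like those, the integration could be as simple as wiring gesture events to device actions. A hypothetical sketch of that pattern (the event names and callback shape are mine for illustration; the real SDK may look quite different):

```python
def make_gesture_dispatcher(actions):
    """Return a handler that maps gesture-event names to device actions."""
    def on_gesture(event):
        handler = actions.get(event)
        if handler:
            handler()
    return on_gesture

# Example: an "air tap" toggles a connected lamp.
lamp = {"on": False}
dispatch = make_gesture_dispatcher({
    "air_tap": lambda: lamp.update(on=not lamp["on"]),
})

dispatch("air_tap")
print(lamp["on"])  # True
```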