uSens Hand Tracking Tech Mixes AR And VR

uSens doesn’t give a flip about hardware, at least not really. That approach may seem at odds with a company that makes a device that enables hand tracking in VR, but as uSens CTO Yue Fei explained to us at GDC 2017, the company’s true focus is software. “Hardware is the easy part,” he told us.

uSens does make hardware, which we first encountered in late 2016, and it’s continued to iterate on that technology. More or less, uSens does the same thing that Leap Motion does--hand tracking for HMDs using small add-on hardware--and Fei openly admitted that it would probably be unfair to claim that one is necessarily better than the other. He did, however, note that they are different in that each company has its own proprietary algorithms. Thus, some differentiation will likely emerge as the two companies build out their respective IP.

(Badly) Juggling Spheres

When we visited uSens’ booth at GDC, we were presented with a demo. We entered a mostly empty virtual space; in it were several shapes--cubes, pyramids, and spheres--and we could pick them up and play with them. Although the tracking wasn’t perfect--sometimes we had to try a couple of times to grab a shape--it was strong enough that we could toss them gingerly from hand to hand and even throw them. Getting the right angle, and timing the release on a throw, was tough, but the system did respond accurately to the velocity of our throws. (I tried to juggle two shapes at once, but that seemed too much for the tracking. It probably didn’t help that I’m not very good at juggling to begin with.)

We could also create new shapes, grab the edges and make them larger or smaller, and toss them around and watch them interact with each other. This mechanic worked remarkably well.
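For context, here’s a minimal sketch of how a throw mechanic like this is often implemented: sample the tracked palm position over the last few frames, then hand the averaged velocity to the physics engine on release. The class, window size, and names below are our own assumptions, not uSens’ implementation.

```python
# Hypothetical sketch: estimating release velocity from tracked hand positions.
# The class and parameter names are our own; uSens' actual pipeline is proprietary.

from collections import deque

class ThrowEstimator:
    def __init__(self, window=5):
        # Keep the last few (timestamp, position) samples of the palm joint.
        self.samples = deque(maxlen=window)

    def update(self, timestamp, palm_position):
        self.samples.append((timestamp, palm_position))

    def release_velocity(self):
        # Average velocity across the window: a short window keeps the throw
        # responsive, a longer one smooths out tracking jitter.
        if len(self.samples) < 2:
            return (0.0, 0.0, 0.0)
        (t0, p0), (t1, p1) = self.samples[0], self.samples[-1]
        dt = t1 - t0
        return tuple((b - a) / dt for a, b in zip(p0, p1))

# On a "grab released" event, the returned vector would be applied
# directly to the virtual object's physics body.
```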

But uSens offered a further twist by recognizing hand and finger gestures and triggering actions based on them. For example, you could make a fist to return to a home screen, or give a thumbs up to signal that you’re ready for a game, or--you get the idea. uSens currently has four gestures programmed in--a fist, a gun (thumb up, forefinger extended), a peace sign, and a thumbs up--but Fei said that the company plans to develop 20 of them in total.

uSens figured that four is enough to get developers rolling for now. The gestures are built into the SDK, which should make devs’ lives easier, too.
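uSens hasn’t published its SDK details here, so the following is a hypothetical sketch of what gesture events baked into an SDK might look like to a developer. The point is that the app only maps recognized gestures to actions, rather than classifying hand poses itself; the API and function names are our invention.

```python
# Hypothetical sketch of gesture callbacks built into a hand-tracking SDK.
# Gesture names mirror the four uSens ships; the API itself is our invention.

from enum import Enum, auto

class Gesture(Enum):
    FIST = auto()        # e.g., return to home screen
    GUN = auto()         # thumb up, forefinger extended
    PEACE_SIGN = auto()
    THUMBS_UP = auto()   # e.g., signal ready for a game

def go_home() -> None:
    print("Returning to home screen")

def start_game() -> None:
    print("Starting game")

def on_gesture(gesture: Gesture) -> None:
    # Because recognition lives in the SDK, application code only maps
    # recognized gestures to actions.
    actions = {
        Gesture.FIST: go_home,
        Gesture.THUMBS_UP: start_game,
    }
    handler = actions.get(gesture)
    if handler:
        handler()
```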

We found the gesture input to be about as fluid and accurate as the shape-grabbing portion of the demo, which is to say, it works fairly well but has room for improvement.

How It Works

Currently, all of uSens’ tasty technology is wrapped up in a slim little package called the Fingo (you’re correct, it sounds like the writing team from Futurama named it). You can slap the Fingo on the front of a variety of HMDs so it can do its thing.

It’s capable of 6DoF inside-out tracking thanks to the sensor’s SLAM (simultaneous localization and mapping) optimization, and it does skeletal tracking on all of your finger joints (26DoF). We noticed that the Fingo still tracked fairly well even when we put one hand behind the other, which we found surprising; Fei confirmed that the tech can track up to about 70% hand overlap.
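uSens doesn’t spell out its parameterization, but a common way to arrive at 26DoF for a hand is a 6DoF wrist/palm pose plus four joint angles for each of the five fingers (6 + 5 x 4 = 26). The structure below is a hypothetical sketch of that breakdown, not uSens’ actual data format.

```python
# Hypothetical sketch of a 26DoF hand pose: a 6DoF wrist/palm pose plus
# 4 joint angles per finger (6 + 5 * 4 = 26). This is one common breakdown;
# uSens hasn't published its exact parameterization.

from dataclasses import dataclass, field
from typing import List

@dataclass
class HandPose:
    palm_position: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])
    palm_rotation: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])
    # finger_joints[finger][joint]: flexion/abduction angles in radians,
    # ordered thumb through pinky, 4 angles per finger.
    finger_joints: List[List[float]] = field(
        default_factory=lambda: [[0.0] * 4 for _ in range(5)]
    )

    def dof_count(self) -> int:
        return 6 + sum(len(f) for f in self.finger_joints)  # 26

assert HandPose().dof_count() == 26
```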

The latency is noticeable, and uSens has it pegged at about 15-20ms. The sensor can do about a 140-degree FoV, although the software is limited to about 120 degrees horizontal by 100 degrees vertical.

One of the questions we had for Fei was this: If SLAM works on smartphones (as we’ve seen), and the Fingo also uses SLAM, why does the Fingo exist? He replied that smartphone cameras are optimized for taking pictures, have too low a frame rate, and have too narrow a field of view; he further noted that you need a higher frame rate to reduce latency.
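The frame rate point is simple arithmetic: the camera adds at least one frame interval of delay before tracking can even begin, so a slow sensor blows the latency budget on its own.

```python
# Why frame rate bounds latency: the camera alone adds at least one frame
# interval of delay before tracking can even begin.
for fps in (30, 60, 120):
    frame_interval_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> at least {frame_interval_ms:.1f} ms of capture delay")
# 30 fps adds at least 33.3 ms -- already past the 15-20 ms budget quoted above.
# 120 fps adds at least 8.3 ms, leaving room for detection and rendering.
```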

Even so, one wonders if uSens’ tech couldn’t be moved from an external assembly to an embedded camera that has those necessary features. Fei noted in passing that Qualcomm’s reference design has a pretty good camera for this purpose.

He did not indicate which reference design he was referring to--the VR820 or the VR835--but in either case, the comment made us prick up our ears. Is uSens participating in Qualcomm’s HMD accelerator/dev kit program? Fei wouldn’t say either way, but we gleaned enough to believe that it is, or at the very least, that it’s worked with Qualcomm to some extent.

Toggling Between VR And MR

Although we didn’t see this particular feature in our GDC demo, one of the most notable capabilities of the Fingo is that it can toggle between virtual reality and mixed reality. For the most part, you’re in a fully occluded VR experience with your hands tracked; but you can also engage the passthrough camera to get a “portal” into the real world.

You can see this in the video below. Skip to 2:50 to see the mixed reality magic happen. (You can see another video demo here.)

This portends some intriguing possibilities. For one thing, the portal lets you interact with virtual objects even as the real world is your backdrop. Your hands are, of course, still tracked, but instead of a rendered facsimile of your hands, you see the real thing. This could also be used as a proximity alert safety system, not unlike what Sensics CEO Yuval Boger suggested when we spoke to him at GDC: When you get too close to an object or barrier, the HMD could kick on the passthrough camera so you can safely avoid peril.
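As a rough illustration of how that safety system might work, here’s a hypothetical sketch: measure the headset’s distance to the nearest mapped obstacle (something the SLAM map could supply) and flip on passthrough below a threshold. The function names and threshold are our own assumptions.

```python
# Hypothetical sketch of the proximity-alert idea: when the tracked headset
# gets within a threshold of a known boundary, switch on the passthrough
# camera. Function names and the threshold are our own assumptions.

SAFETY_THRESHOLD_M = 0.5  # distance at which passthrough kicks in

def update_passthrough(headset_position, boundary_distance_fn,
                       set_passthrough_enabled):
    """Enable passthrough when the user drifts too close to a real obstacle.

    boundary_distance_fn: returns distance (meters) from a position to the
    nearest mapped obstacle -- e.g. derived from the SLAM map.
    set_passthrough_enabled: toggles the HMD's camera feed on or off.
    """
    distance = boundary_distance_fn(headset_position)
    set_passthrough_enabled(distance < SAFETY_THRESHOLD_M)

# Example wiring with stand-in callables:
update_passthrough(
    headset_position=(0.0, 1.6, 0.2),
    boundary_distance_fn=lambda pos: 0.4,          # pretend a wall is 0.4 m away
    set_passthrough_enabled=lambda on: print(f"passthrough: {on}"),
)
```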

What’s Next: Getting Smaller, Adding Machine Learning

There’s another Fingo coming that offers a color camera, but it seems that’s just a waypoint in the process. The next piece of hardware will be smaller than the Fingo is now; Fei said that it might be as small as a pen (1 x 6cm or so) and will have its own IMU.

Again, though, uSens is ultimately focused on software more than hardware, and to that end, it’s taken a machine learning approach to its hand tracking technology. Fei noted that uSens has been working on AI and machine learning since 2014, and the tech could be optimized for deep learning chips.

Fingo works with Windows 7 and later and Android 4.4 and later. Compatible devices currently include the HTC Vive, Oculus Rift, Gear VR, Google Cardboard, and Daydream. The device is not yet available to the public, but developers can dive in now.