
Mobile, Inside-Out Roomscale 6DoF Tracking With Any Decent Smartphone?

We were plenty excited to see what Oculus has cooking with Project Santa Cruz, its untethered Rift prototype with inside-out roomscale mobile tracking. But just the night before, we saw the same clever technique--using regular smartphone cameras and software-based sensor fusion to enable 6DoF tracking--employed by a company called Dacuda.

All You Need Is A Smartphone

In a hotel lobby across the street from Oculus Connect 3 (which was in the San Jose Convention Center), we met with Dacuda’s Erik Fonseka and Lukas Schleuniger. Fonseka pulled out his smartphone and showed us real-time, roomscale 6DoF tracking on it. What was surprising about this demo was the fact that he was using just his smartphone. It had no extra sensors strapped to it, and there were no external markers at play.

Dacuda’s inside-out mobile tracking software is called SLAM Scan, and it employs a technique known as sensor fusion. The software uses data from the smartphone’s existing camera and onboard sensors (such as the gyroscope and accelerometer) to calculate where the device is in space, translating physical location and movement into movement within a virtual environment.
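As a rough illustration of what sensor fusion means here, a complementary filter blends fast-but-drifty gyro integration with noisy-but-drift-free accelerometer angles. The sketch below is purely illustrative--the function, constants, and sample values are ours, not Dacuda's, and full 6DoF tracking also fuses camera-based pose estimates (the SLAM part):

```python
# Complementary filter: fuse a gyro rate with an accelerometer-derived
# angle on a single axis. Illustrative only.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (fast, but drifts) with the
    accelerometer angle (noisy, but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
# Simulated samples: (gyro rate in deg/s, accelerometer angle in deg)
samples = [(10.0, 0.5), (10.0, 1.2), (10.0, 1.9), (0.0, 3.0)]
for gyro, accel in samples:
    angle = complementary_filter(angle, gyro, accel, dt=0.01)
print(round(angle, 3))  # → 0.412
```

The `alpha` weight decides how much you trust the gyro versus the accelerometer; real systems tune it (or replace the whole thing with a Kalman filter) per device.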

This is a big deal, because almost all mobile VR solutions currently on the market--including the excellent Samsung Gear VR and Google’s impending Daydream--have only 3DoF tracking. That is, you can enjoy your virtual experience by strapping on a headset and sitting quietly while you look around. Essentially, 3DoF tracks only your head’s rotation; if you stand up and walk around, the virtual world will not move with you. To get that positional tracking, you need 6DoF.

One mobile XR product that does offer 6DoF is Google’s Project Tango. Shipping soon via Lenovo’s Phab2 Pro smartphone, Project Tango offers a world of (primarily) augmented reality capabilities, but it requires special hardware, such as a motion-tracking camera, a structured-light projector, and an IR camera.

Looming Competition?

Oculus’ Project Santa Cruz also uses standard smartphone cameras (although it uses four of them to Dacuda’s one) and software sensor fusion. But instead of being nervous about this sudden, Facebook-funded competition, Dacuda was glad to see it.

“It really confirms what we are doing in the space of inside-out tracking and how relevant this is for roomscale VR,” Dacuda’s Lukas Schleuniger told Tom’s Hardware later in an email. “This is fundamentally different from the Oculus approach, which is doing the inside-out tracking with a separate and specialized hardware headset, which results in a much higher price.”

In other words, yes, it’s essentially the same tech, but Oculus is aiming for a highly proprietary, tightly integrated implementation. By contrast, Dacuda’s SLAM Scan tech can be implemented widely, on essentially any smartphone (iOS, Android, and even specialized hardware) that has the requisite onboard sensors. And because those smartphones serve as the engine and display of most mobile VR--working in tandem with “dumb” HMDs for viewing, such as the aforementioned Gear VR and Daydream--Dacuda has an extremely wide and growing potential install base.

“Our goal is to bring true room-scale VR to as many people as possible, and for that you need a low price point and a broad range of platforms,” said Schleuniger. He further noted that because new smartphones bring better displays and technology every year--in other words, merely because of existing and predictable market forces--the Dacuda-based experience would continue to improve.

Things are moving fast for Dacuda. Just last week, it suddenly forged a collaboration with MindMaze to get a version of its technology into the company’s healthcare-oriented “MMI” platform. Next week, Dacuda CEO Dr. Peter Weigand will be at Qualcomm’s 4G/5G Summit in Hong Kong, presenting a talk about how software can turn standard smartphones into 3D scanners.

  • Joao_Pedro
    When they say Oculus will be higher priced... we'll probably need a $400 to $700 phone to use this application. I wonder how expensive a dedicated system will be. I'm sure it will be around the same price, or even cheaper with much better performance. Phones are developed with "phone" in mind, not as VR equipment. But developing and optimizing is the way to go, and this could be a cheap way for developers to have equipment to test.
  • grimfox
    Frankly, I'm surprised it took this long for someone to figure all this out.
  • hdmark
    18728691 said:
    Frankly, I'm surprised it took this long for someone to figure all this out.

    I agree with you. Granted... I have VERY limited programming knowledge/experience, but I feel like this would have been somewhat simple to do a while ago.

    Can anyone who knows jump in and explain what is complicated about this? Or potentially details on how it works?
  • Jeff Fx
    18728691 said:
    Frankly, I'm surprised it took this long for someone to figure all this out.

    The basic concept is obvious, but tricky to get working well enough to be usable without barcode posters on your walls, ceiling, and maybe even the floor, considering the poor FOV on a cellphone camera.

    Getting it to work on a phone while VR apps are running would also be a challenge. Phones already overheat and have poor battery life when using them for VR. Add more processing requirements, and that gets worse. Phones will get faster and better at dissipating heat, but games will also get more demanding at a similar rate.
  • grimfox
    I figured doing edge detection would be relatively easy, and that tracking the changes in position, placement, and scale of those edges would be the key part of tracking. Basically: take the image, blow out the contrast, find edges, and compare like edges to the last image. There are limits to how fast a head can move, and ideally the image is relatively low-res, so blur isn't as much of an issue for a high-end smartphone camera. But like you said: simple concept, clearly difficult implementation.

    As far as heating I'd bet you could make a buck or two selling a passive heatsink back shell/cover for VR use. Especially if this take on mobile VR is as good as it seems to be and takes off.
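grimfox's recipe--boost the contrast, find edges, compare to the last frame--can be sketched in a few lines of NumPy. This is a toy illustration of the concept only, not how SLAM Scan actually works; a real tracker would match features between frames and solve for camera pose:

```python
import numpy as np

# Toy version of the recipe: normalize ("blow out") contrast, threshold
# gradient magnitude to get an edge map, then compare maps between frames.

def edge_map(frame, threshold=0.4):
    """Crude edge detector: contrast-normalize, then threshold gradients."""
    f = (frame - frame.min()) / (np.ptp(frame) + 1e-9)
    gy, gx = np.gradient(f)
    return np.hypot(gx, gy) > threshold

# Two tiny synthetic frames: a bright square that shifts one pixel right.
frame_a = np.zeros((8, 8)); frame_a[2:6, 2:6] = 1.0
frame_b = np.zeros((8, 8)); frame_b[2:6, 3:7] = 1.0

edges_a, edges_b = edge_map(frame_a), edge_map(frame_b)
# The edge maps overlap only partially after the shift -- that change in
# edge positions is what a tracker would turn into a motion estimate.
print(int(edges_a.sum()), int((edges_a & edges_b).sum()))
```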
  • Elektrobomb
    The hard thing is accounting for sensor drift. I tried making something similar to this with an Arduino (absolute positional tracking using accelerometers, not for VR), and the main problem is that practically all reasonably priced accelerometers today have a small but significant amount of drift. For example, if I moved the sensor 10 cm left and then 10 cm right with the setup I was working with, where it should have displayed 0 cm total moved, it displayed about +/- 1 or 2 cm. That's a significant amount of drift, especially in VR, where it accumulates over time and can make people sick.
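Elektrobomb's drift is inherent to double-integrating acceleration: a constant sensor bias b grows into a position error of roughly 0.5·b·t² after t seconds, which is why accelerometers alone can't hold a position fix. A quick illustrative calculation (the bias value is made up):

```python
# Why accelerometer-only positioning drifts: a constant bias b (m/s^2),
# double-integrated over t seconds, becomes a position error of ~0.5*b*t^2.

def drift_after(bias, dt, steps):
    """Double-integrate a constant acceleration bias into position error."""
    velocity = position = 0.0
    for _ in range(steps):
        velocity += bias * dt
        position += velocity * dt
    return position

# A made-up 0.05 m/s^2 bias, integrated for 10 seconds:
error_m = drift_after(bias=0.05, dt=0.01, steps=1000)
print(error_m)  # roughly 0.5 * 0.05 * 10**2 = 2.5 meters
```

The error grows quadratically with time, which is why camera-based corrections (as in SLAM) are needed to keep it bounded.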
  • aivijay
    You should check out Impression PI on Kickstarter, which does inside-out tracking for mobile VR; shipping is going to start very soon. Dacuda's tech is even cheaper, since it just uses the smartphone camera and does SLAM and sensor fusion. Impression PI also uses SLAM, with hand tracking along with positional tracking in a single package.
  • kenyee
    Qualcomm had a demo at a VR hackathon I went to that did this as well, but it needed two cameras for calculating the room mesh. They said Oculus and HTC were moving too slowly so they wanted to push their own version :-)
    Next year should be a good year for mobile VR....
  • scolaner
    18728282 said:
    When they say Oculus will be higher priced... we'll probably need a $400 to $700 phone to use this application. I wonder how expensive a dedicated system will be. I'm sure it will be around the same price, or even cheaper with much better performance. Phones are developed with "phone" in mind, not as VR equipment. But developing and optimizing is the way to go, and this could be a cheap way for developers to have equipment to test.

    It's true that you need a good smartphone, but the value there is that almost everyone *already* has one. So buying the phone is not really part of the expense. (Maybe you'd be lured into buying a more expensive phone than you would otherwise, though.)

    With Rift/Vive, you need a burly PC and THEN buy the $800 VR components. With this sort of tech, all you need is an inexpensive HMD. Cardboard is just a few bucks.

    You raise a good point about dedicated VR hardware versus something that does double duty. In the next year or so, we're going to see how well that concept plays out. Google's Daydream is built on that concept, so at least one ginormous company is wagering that this sort of lower-end VR is going to make people happy and engaged, at least to an extent.
  • bit_user
    18729033 said:
    Can anyone who knows jump in and explain what is complicated about this? Or potentially details on how it works?
    People have demonstrated SLAM on smartphones for a while. Here's a video describing sensor fusion on Android, dated 6 years ago:

    https://www.youtube.com/watch?v=C7JQ7Rpwn2k

    I think what Dacuda isn't telling you is that the environment needs a certain amount of visual clutter. If you're standing in a small room with bare plain walls, then you'll need to hang some posters. This is why Oculus, MS, ... everyone else needs multiple cameras. They at least need to be able to see things like multiple corners, where the walls meet the ceiling or floor, in order to reliably extract pose information accurate enough for AR or VR.

    IMO, it's not surprising they're running this on a phone. The questions are: what is the platform requirement and how much of the CPU/GPU is left for your app? The mid-range/low-end phones have been improving at a pretty good pace, so most 64-bit phones will probably have adequate performance. As Jeff says, battery life & heat will be issues, if the app depends on their SLAM engine continuing to run at full frame-rate, for very long.
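bit_user's point about visual clutter can be made concrete with a toy "texture score": the fraction of pixels with strong local gradients, a crude stand-in for how much structure a visual tracker could lock onto. This is illustrative only--not any vendor's actual metric--but it shows why a bare wall is a worst case:

```python
import numpy as np

# Toy "texture score": fraction of pixels with strong local gradients.
# A feature-based tracker needs such structure; a blank wall has none.

def texture_score(img, threshold=0.1):
    gy, gx = np.gradient(img.astype(float))
    return float((np.hypot(gx, gy) > threshold).mean())

rng = np.random.default_rng(0)
blank_wall = np.full((64, 64), 0.5)   # featureless: nothing to track
cluttered = rng.random((64, 64))      # lots of visual structure
print(texture_score(blank_wall), texture_score(cluttered))
```

The blank wall scores exactly zero; the cluttered scene scores high. Real systems use corner detectors (FAST, Harris) rather than raw gradient counts, but the failure mode is the same.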