Adshir: Using Ray Tracing For Better AR Quality

LAS VEGAS, NV -- A small company called Adshir promises to bring ray tracing to AR, significantly increasing the quality of the images we see. With a software product called LocalRay, it promises movie studio-quality rendering, but in real time, all on mobile hardware.

Towards Total Realism

You can read a primer on ray tracing here, but for the purposes of AR, suffice it to say that it provides superior lighting, reflections, and shadows, which afford much more realism than rasterized images. It’s often used in rendering animations for films. Adshir reps described rasterized AR images as being disconnected from their environment--lively and lovely animations awkwardly crammed into the real world. By contrast, Adshir wants to use ray tracing to make an AR object part of the scene. The goal is total realism.
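To see why ray-traced shadows track the real lighting of a scene, consider the basic mechanism: for each visible surface point, the renderer casts a "shadow ray" toward the light and checks whether anything blocks it. The sketch below is our own illustration, not Adshir's code--the scene, numbers, and helper names are all invented--using a single sphere as an occluder:

```python
# Minimal sketch of a shadow-ray test: a surface point is in shadow if a ray
# cast from it toward the light hits any occluder first. Illustrative only.
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray origin + t*direction (t > 0) hits the sphere.
    Assumes direction is normalized, so the quadratic's 'a' term is 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return False            # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t > 1e-6             # small epsilon avoids self-intersection

def in_shadow(point, light_pos, occluders):
    """Cast a shadow ray from a surface point toward the light."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    return any(ray_hits_sphere(point, direction, c, r) for c, r in occluders)

light = (0.0, 10.0, 0.0)
occluder = ((0.0, 5.0, 0.0), 1.0)  # sphere hovering between light and floor
print(in_shadow((0.0, 0.0, 0.0), light, [occluder]))  # directly below it: True
print(in_shadow((5.0, 0.0, 0.0), light, [occluder]))  # off to the side: False
```

Move the light or the occluder and the shadow answer changes with them--which is the behavior the Surface demo showed when the dinosaur walked around the table.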

And indeed, the film industry tends to do a fabulous job of realism in animation, and ray tracing is a key technique to that end. The tricky part, for AR, is that movie studios use powerful machines for all that rendering, and it takes a long time to complete; Adshir wants to do it in real time.

Another Dino Demo

Seeing is believing, so Adshir showed us a demo of a dinosaur stomping around on a tabletop, running on a Microsoft Surface tablet. As you can see, the dinosaur reflects the real light in the room, and its shadow obeys it as well. When the dinosaur moves, it all changes accordingly. (Our favorite bit in the demo was when the dinosaur “walked” across a phone lying on the table--and left little dino footprints on the black touchscreen.)

The dinosaur itself was sometimes jagged and jittery, but the lighting, reflections, and shadows persisted, and there was certainly no observable lag otherwise. The image was running at 60 fps.

The Adshir reps readily admitted that the tabletop and lighting were set up explicitly to help this demo run--in that way it was sort of a lab environment--but even so, it was an effective demonstration.

Adshir also said that LocalRay is “battery power aware,” but claimed that performance isn’t affected by the feature.


So they can do it; the question is how. The company describes its secret sauce as follows:

LocalRay uses proprietary, patented algorithms designed from the bottom up for physically accurate VR/AR ray tracing. This new approach is based on the elimination of the acceleration structures (AS) that are a core component of every ray tracing system today. This elimination reduces expensive traversal time and avoids the repeated reconstruction of the AS for every major change in the scene. Both traversals and reconstruction, which are stoppages for real time, are now a thing of the past.

Simply put, it requires fewer rays. The AS replacement is called DAS (dynamically aligned structures); it’s proprietary to Adshir, but it runs on a conventional GPU pipeline.
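For context, the acceleration structure the quote refers to is typically a bounding volume hierarchy (BVH): a tree over the scene's bounding volumes that every ray must traverse, and that must be rebuilt when objects move. The sketch below is our own illustration of that conventional approach--Adshir's DAS is proprietary and not shown--using 1D intervals in place of 3D bounding boxes to make the two costs (per-ray traversal and per-change rebuild) concrete:

```python
# Toy BVH over 1D intervals, standing in for the 3D acceleration structures
# used in conventional ray tracers. Illustrative only; all names are invented.

class BVHNode:
    def __init__(self, lo, hi, left=None, right=None, leaf=None):
        self.lo, self.hi = lo, hi          # bounds enclosing this subtree
        self.left, self.right = left, right
        self.leaf = leaf                   # object id, if this is a leaf

def build(objects):
    """Build the tree from (id, lo, hi) triples. In a dynamic scene this
    reconstruction repeats for every major change -- one of the costs
    Adshir says DAS eliminates."""
    objects = sorted(objects, key=lambda o: o[1])
    if len(objects) == 1:
        oid, lo, hi = objects[0]
        return BVHNode(lo, hi, leaf=oid)
    mid = len(objects) // 2
    l, r = build(objects[:mid]), build(objects[mid:])
    return BVHNode(min(l.lo, r.lo), max(l.hi, r.hi), l, r)

def traverse(node, x, visited):
    """Find objects whose bounds contain x, counting nodes touched along
    the way -- the per-ray traversal cost."""
    visited[0] += 1
    if not (node.lo <= x <= node.hi):
        return []
    if node.leaf is not None:
        return [node.leaf]
    return traverse(node.left, x, visited) + traverse(node.right, x, visited)

scene = [("floor", 0, 4), ("table", 3, 6), ("dino", 8, 10)]
root = build(scene)
visited = [0]
print(traverse(root, 3.5, visited))  # objects whose bounds overlap x=3.5
print(visited[0])                    # nodes touched by this one query
# If the dino moves, build(scene) must run again -- the rebuild cost.
```

The traversal count here is tiny, but in a real scene it is paid per ray, for millions of rays per frame, on top of rebuild costs whenever anything moves--which is why eliminating the structure, if it works as claimed, matters for real-time mobile rendering.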

Adshir said it has 10 granted patents and another 11 pending around LocalRay. The company expects to have an SDK ready for licensing soon. The software is designed to be a plug-in for Unity, Unreal, ARKit, ARCore, Vuforia, and more, and as demonstrated, it can run on existing hardware. Adshir reps wouldn’t say precisely, but it’s apparent to us that they’re courting all the big names in XR.

This isn’t the first time that we’ve seen ray tracing appear on surprisingly light hardware; a year ago, Imagination demoed ray tracing on mobile hardware, too. Adshir, though, intends to use LocalRay on mobile XR hardware.

  • derekullo
    The better question is how is the 1/25 scale Allosaurus going to drink the water out of those narrow glasses?

    Maybe that is why he looks so angry
  • bit_user
    Cool, but why limit it to AR? I really want to see raytraced VR. And if their algorithm truly scales (big "if", there), it should fly on desktop hardware. Of course, given what they were saying about doing away with conventional scene graph traversal methods, it's quite possible that it won't scale to match conventional rasterization in VR environments.

    I think a key point is that they seem to be assuming the lighting and environment can be learned with sufficient precision and accuracy. IMO, this is possibly a harder problem to do well than the actual rendering part.

    I think AR needs to work pretty robustly, to gain acceptance. It's no use having graphics that are more realistic in tightly constrained scenarios, if they look glitchy and wrong in more real-world cases. Therefore, I expect most apps will opt for slightly less realism, in favor of fewer glitches and less flickering.
  • bit_user
    20576112 said:
    It’s often used in rendering animations for films.
    I'm not up-to-speed on the rendering techniques currently used in the film industry, but I know this is historically false. Indeed, the 2009 article to which you linked says this:
    First of all, many gamers think that ray tracing is intrinsically a better algorithm than rasterization because "it's what the movies use." That's false. The majority of films using synthesized images (and all of Pixar's) use an algorithm called REYES that's based on rasterization.
    Of course, it goes on to add a qualification, but at least I felt vindicated.

    That said, I like ray tracing. It has a certain elegance to it, even if most of its benefits can be hacked into polygon-based rasterizers.

    Before Apple scuttled their arrangement with Imagination, I was really hoping they'd bring it into the mainstream by including it in their next iPhone. With their direct control over the hardware & APIs for such a large phone market, they could've moved the industry, all by themselves. Even Google couldn't do that.