At the end of March, we tried the HoloLens augmented reality (AR) experience again at Microsoft BUILD. One demo in particular, called Destination: Mars, had our News Director, Seth Colaner, on the surface of Mars with a virtual avatar of Buzz Aldrin. Although the experience was targeted at tourists (it will be installed at the Kennedy Space Center Visitor Complex this summer), the demo used OnSight, an actual tool used by NASA scientists to explore the surface of Mars with HoloLens. Last week, I was invited to NASA’s Jet Propulsion Laboratory (JPL) in Pasadena to see how OnSight actually works in a scientific setting.
From Earth To Mars...In Pasadena
Unlike the small demo room in the Moscone Center, the OnSight experience at JPL utilized a massive room, with more than enough space (no pun intended) to fit one or even two small cars. There were multiple stations, each with the same OnSight experience, so that I could traverse the same surface with other journalists in the room--but more on that later.
The first wave of OnSight was a solo walkthrough of the Martian surface. As I walked around, Alex Menzies, the software lead for JPL’s augmented and virtual reality projects, tagged along next to me to explain OnSight’s various features.
When the program booted up, I was transported to Mars, specifically the Naukluft Plateau, an area visited by the Curiosity rover last month. In fact, the rover was the first thing I saw on the surface. Obviously, it wasn’t moving around in the virtual space, but nevertheless, it was astounding to see a 1:1 version of NASA’s latest rover on the Red Planet.
As for the surface itself, the area surrounding the rover was well-defined. I could see the various ridges of different rocks and even the shadows they cast on the ground. As we found out in an interview with Menzies and Jeff Norris, the Mission Operations Innovation lead at JPL, OnSight can provide this high level of detail because multiple photos of the same area were shot from different positions.
The result is quite astounding. I could actually crouch down on the rocky surface and see every nook and cranny up close and in high detail. Granted, I could still see some pixelation, as well as a rainbow streak pattern that started in my peripheral vision but eventually spread across the screen and distorted the color of the surface, but it was still quite the experience. (The rainbow effect appears to be related to the HMD's lenses, not the hologram.)
Menzies also showed me the main tool used by NASA’s scientists in OnSight. By tapping my finger in the air, I brought up a small toolbar that allowed me to place a custom point in the area, which would ping other users to show its location. I could also bring up a virtual ruler that easily allowed me to measure the distance between two points.
As I ventured farther away from the rover’s location, I noticed that the surface quality deteriorated. Soon it was hard to discern any details about the rocks around me. All I saw were pixelated shadows and ridges. These were areas that Curiosity hadn’t yet visited. Instead, the imagery in these pixelated areas came from the Mars Reconnaissance Orbiter (MRO), a satellite orbiting Mars. At its closest point, the periapsis, it’s 300 km from the surface, while its apoapsis, its farthest point in orbit, is around 45,000 km.
After a few minutes of alone time on Mars, I was brought into the same session with the other journalists in the room. From my perspective, they appeared as transparent avatars walking around the surface. Then, someone broke the silence in the room as another avatar appeared on Mars. It was Katie Stack Morgan, one of the research scientists on the Curiosity rover mission, and she actually uses OnSight on a regular basis to investigate the Martian surface. After she showed us the various points of interest on the plateau, I had a chance to talk to her about her current project.
Specifically, Stack Morgan is trying to find deposits of silica, a mineral that appears on the surface as bright rock and is a tell-tale sign that water traveled through the area. Based on the location of these deposits, she can determine the direction of the water flow, as well as that of the strong Martian winds. By following the path of silica deposits, she hopes to find a source of water on Mars.
Planning The Next Rover
After some time on Mars, we tried another demo that was more down to Earth; specifically, we saw a model of a Mars rover that’s slated to launch sometime in 2020. Even though this demo also used HoloLens, it wasn’t an OnSight experience--it was another program called Protospace.
The area for this demo was significantly smaller than the OnSight demo’s (think of a room about 15 feet in length), and we gathered around what seemed to be just an empty portion of the room. Here, Norris discussed the potential uses of Protospace for the team at JPL. In short, it’s a way for various groups to collaborate on a virtual model of any ongoing project. In the case of the Mars 2020 rover, various engineers can take a look at the prototype model and make suggestions or send feedback on how to improve it before production, which saves time and, more importantly, money.
With that in mind, he then revealed the rover in front of our very eyes. Even in its prototype phase, it was quite an awesome sight. Various parts were shaded in different colors, and you could crouch down to look at it from a different angle or walk around to see a specific part up close. If you happened to “walk” through the model, you would see some parts cut away, giving you a more detailed, cross-section view.
As a final twist, Norris grabbed a physical model of one of the rover’s wheel bases and compared it to the virtual model. It was an exact match, further showing that it’s possible to use HoloLens in lieu of a physical model as a way for engineers to collaborate on future projects.
In fact, Protospace was already used in a real-world scenario. Mechanical engineer Stephen Pakbaz told me that some of the rover’s technicians used Protospace to check on the size of its nuclear batteries, which are installed in the rear. Specifically, the technicians wanted to make sure that the batteries would fit while it was inside the rocket so that there wouldn’t be any problems during the flight to Mars.
Obviously, the rover has to be built in a traditional computer-aided design (CAD) program before it enters Protospace. According to Marcutte Vona, a producer on the Protospace project, the model takes up “several gigabytes” as a CAD file. However, the team takes a lower-detail version of the model from the program and shrinks the file by removing certain parts that technicians wouldn’t be looking at in the first place, such as the rover’s various bolts and screws. Then, it is put through a lossless compression program (Pied Piper, anyone?) before it enters Protospace. By the time it’s loaded onto the HoloLens, Vona said, the file size is about half a gigabyte.
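Vona's description boils down to two stages: strip out parts nobody will inspect, then losslessly compress what remains. A toy sketch of that idea in Python (the part names, JSON serialization, and gzip stand-in are all hypothetical; JPL's actual toolchain and formats were not disclosed):

```python
import gzip
import json

# Hypothetical part records standing in for a CAD assembly; the real
# Protospace pipeline starts from a multi-gigabyte CAD file, not JSON.
parts = [
    {"name": "chassis", "category": "structure", "triangles": 120000},
    {"name": "wheel_hub", "category": "structure", "triangles": 45000},
    {"name": "bolt_m5", "category": "fastener", "triangles": 800},
    {"name": "screw_m3", "category": "fastener", "triangles": 600},
]

def simplify(parts):
    """Drop parts viewers won't inspect up close, e.g. bolts and screws."""
    return [p for p in parts if p["category"] != "fastener"]

def compress(parts):
    """Losslessly compress the simplified model for loading on the headset."""
    return gzip.compress(json.dumps(parts).encode("utf-8"))

model = compress(simplify(parts))
# Lossless means the simplified model round-trips exactly:
restored = json.loads(gzip.decompress(model).decode("utf-8"))
```

The key property is in the second stage: because the compression is lossless, everything the team chose to keep is preserved bit-for-bit, so the only detail sacrificed is what was deliberately removed up front.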
For now, I could only mark various points of interest on the rover. However, the developers of Protospace are working on new features, such as the ability to rotate and reposition the model, and they are working closely with the actual users to figure out which features matter most. One thing Pakbaz mentioned was that he wanted to see the assembly of certain parts in AR.
The Wave Of The Future
The fact that both applications are functioning well with a device that’s still in development is quite astonishing. OnSight and Protospace proved that AR (and more specifically, HoloLens) is a viable option for NASA’s various endeavors such as planning a new rover for Mars exploration or learning more about its surface.
Still, nothing beats the real thing. It will take many years (and a lot more money) to get the first humans on Mars, but with these HoloLens applications, the team at JPL can get a head start gathering crucial data on its surface, as well as gain enough expertise to build the next spacecraft that will land humans on the Red Planet.