The Facebook Reality Labs research team today highlighted new advancements on the audio front of its augmented reality (AR) work. The ultimate goal is to create a pair of AR glasses, and these steps are part of the process of refining the end product.
Seeing Facebook bring more attention to AR is pretty exciting. Oculus has continued pushing virtual reality (VR) closer to the mainstream since being acquired by Facebook, making some of the best VR headsets on the market. That includes the standalone Oculus Quest, which works without tethering to a pricey gaming PC or phone. We've yet to see a pair of AR smart glasses achieve as much success as the Quest.
As Facebook Reality Labs progresses towards the final design of its AR smart glasses, the team has considerable work ahead to ensure the product is desirable and immersive. Recent developments fall into two areas: Audio Presence and Perceptual Superpowers (which sounds a lot more exciting than the actual technology but is still notably useful).
"The mission of the team is twofold: to create virtual sounds that are perceptually indistinguishable from reality and to redefine human hearing," Facebook's blog post says.
Audio Presence is centered around reconstructing audio for a virtual environment so the sound comes from the appropriate directions. This is essentially like perfecting surround sound for AR. Refining Audio Presence is a huge step in creating immersive environments.
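To give a rough sense of what "sound from the appropriate direction" involves, here is a toy spatialization sketch in Python. This is my own illustration, not Facebook's technique: real systems use measured head-related transfer functions, while this example assumes a simple constant-power pan and a crude fixed head width.

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo pan: -90 = hard left, +90 = hard right.

    Returns (left_gain, right_gain) with left**2 + right**2 == 1,
    so perceived loudness stays constant as the source moves.
    """
    angle = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(angle), math.sin(angle)

def interaural_delay(azimuth_deg, head_width_m=0.18, speed_of_sound=343.0):
    """Crude interaural time difference: how much later the far ear
    hears a source at the given azimuth (head width is an assumption)."""
    return (head_width_m / speed_of_sound) * math.sin(math.radians(azimuth_deg))
```

A source straight ahead (azimuth 0) gets equal gains of roughly 0.707 per ear and no interaural delay; as it moves to one side, the gains diverge and the far ear's signal arrives slightly later, which is one of the cues the brain uses to locate sound.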
When it comes to Perceptual Superpowers tech, the name may be a little generous. But don't let that discredit the seriously useful nature of what it does. This technology is designed to reduce distracting background noise so you can amplify the volume of your target audio source—like a conversation you're having at a table in a crowded restaurant.
Work like this is crucial to creating an optimized AR experience and directly tied to Facebook's AR glasses efforts, although incorporating the technology is still "a ways away," according to Facebook Research Scientist Manager Ravish Mehra.
"Imagine being able to hold a conversation in a crowded restaurant or bar without having to raise your voice to be heard or straining to understand what others are saying," Facebook's blog says.
"By using multiple microphones on your glasses, we can capture the sounds around you. Then, by using the pattern of your head and eye movements, we can figure out which of these sounds you’re most interested in hearing, without requiring you to robotically stare at it."
Facebook wants its AR smart glasses to be stylish and understand the visual and acoustic world around you in order to provide useful information.
"When you walk into a restaurant, for example, your AR glasses would be able to recognize different types of events happening around you: people having conversations, the air conditioning noise, dishes and silverware clanking," the blog explains. "Then, using contextualized AI, your AR glasses would be able to make smart decisions, like removing the distracting background noise — and you’d be no more aware of the assistance than of a prescription improving your vision.