Google announced a new entry in its ever-growing portfolio of immersive technology. The company already has its hands full with two VR platforms, Cardboard and Daydream, and it poured significant resources into realizing the Tango AR platform. Now, Google has another AR option that’s cheaper to adopt: Google ARCore.
Tango is impressive, but to date, only two smartphones (the Lenovo Phab 2 Pro and the Asus Zenfone AR) feature the cameras and sensors necessary to realize the potential of the platform, and it’s unclear if that will ever change. Without more Tango-ready smartphones on the market, few people will experience what Tango can do. Fortunately, Google is bringing similar technology to standard Android phones.
Tango offers motion tracking, depth perception, and area learning to create sophisticated AR experiences. ARCore leverages three similar technologies to understand your phone’s surroundings: motion tracking, environmental understanding, and light estimation. The primary difference between the two platforms comes down to the hardware needed to use them. Tango requires special hardware, whereas ARCore works with the components found on most modern smartphones.
Google’s ARCore technology relies on a process called “concurrent odometry and mapping” (COM for short) to determine the position of the phone relative to the environment around it. COM uses the phone’s camera to scan the area and identify “visually distinct features,” such as furniture. The visual information is then compared to IMU sensor data to estimate the position and orientation of the camera relative to the identified features.
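To make the idea of combining IMU and visual data concrete, here is a deliberately simplified sketch of sensor fusion in the same spirit: blend a drift-prone IMU position estimate with a camera-derived estimate of the same quantity. This is an illustrative complementary filter with an assumed weighting, not ARCore’s actual COM algorithm, and the class and constant names are invented for the example.

```java
// Illustrative sketch only -- NOT ARCore's internal algorithm. It shows the
// general idea of fusing an IMU estimate with a visual (camera) estimate.
public class PoseFusion {
    // Weight given to the visual estimate (an assumption for illustration).
    static final double VISUAL_WEIGHT = 0.75;

    // Fuse one axis of position: trust the camera more, but let the IMU
    // smooth over frames where feature tracking is poor.
    public static double fuse(double imuEstimate, double visualEstimate) {
        return VISUAL_WEIGHT * visualEstimate + (1 - VISUAL_WEIGHT) * imuEstimate;
    }

    public static void main(String[] args) {
        // IMU says 1.0 m, camera says 0.5 m -> fused estimate leans visual.
        System.out.println(PoseFusion.fuse(1.0, 0.5)); // 0.625
    }
}
```

In a real system the weighting would vary with tracking quality, but the sketch captures why the two sources are complementary: the IMU updates quickly but drifts, while the camera corrects that drift against fixed visual features.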
ARCore uses the IMU and visual information to track the motion of your phone, so you see digital assets from the correct perspective as you move around them. ARCore combines the information from clusters of feature points to build a better understanding of the environment. The technology can track horizontal surfaces, such as the floor or a tabletop, to help ground digital visual elements in the environment.
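A rough way to picture how clusters of feature points reveal a horizontal surface: if a group of tracked points all sit at nearly the same height, they plausibly belong to a floor or tabletop. The sketch below is a toy heuristic to illustrate that idea, not the ARCore API; the class name and tolerance value are assumptions.

```java
// Illustrative sketch only -- NOT the ARCore API. Decides whether a cluster
// of tracked feature points lies on a roughly horizontal plane (like a
// tabletop or floor) by checking the spread of their heights (y-values).
public class PlaneDetector {
    static final double TOLERANCE_M = 0.02; // 2 cm height spread (assumption)

    public static boolean isHorizontalPlane(double[] heights) {
        double min = Double.MAX_VALUE, max = -Double.MAX_VALUE;
        for (double y : heights) {
            min = Math.min(min, y);
            max = Math.max(max, y);
        }
        return (max - min) <= TOLERANCE_M;
    }

    public static void main(String[] args) {
        double[] tabletop = {0.740, 0.742, 0.741, 0.739}; // heights in meters
        double[] stairs = {0.0, 0.18, 0.36};
        System.out.println(isHorizontalPlane(tabletop)); // true
        System.out.println(isHorizontalPlane(stairs));   // false
    }
}
```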
ARCore also takes the light sources in your environment into account to create a higher sense of realism. ARCore detects the average light intensity in different areas and renders digital assets with the same lighting characteristics. In other words, if you place a virtual object in the shade, it too will be dimmed by the shadow.
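The light-estimation idea can be sketched in a few lines: average the intensity of the camera frame, then scale the virtual object’s brightness by that estimate so a shaded scene yields a dimmer object. Again, this is a conceptual illustration with invented names, not ARCore’s rendering pipeline.

```java
// Illustrative sketch only -- NOT the ARCore API. Estimates average scene
// light intensity and uses it to dim or brighten a virtual object.
public class LightEstimator {
    // Average grayscale intensity of a frame; pixel values in [0, 1].
    public static double averageIntensity(double[] pixels) {
        double sum = 0;
        for (double p : pixels) sum += p;
        return sum / pixels.length;
    }

    // Scale the object's base brightness by the estimated scene intensity,
    // so an object "in the shade" is rendered darker.
    public static double renderBrightness(double baseBrightness, double sceneIntensity) {
        return baseBrightness * sceneIntensity;
    }

    public static void main(String[] args) {
        double[] shadedFrame = {0.25, 0.25, 0.25, 0.25};
        double intensity = averageIntensity(shadedFrame);
        System.out.println(renderBrightness(1.0, intensity)); // 0.25
    }
}
```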
ARCore also allows you to anchor digital assets within a real environment so they hold their position over time. That is, when a virtual asset is anchored to the real world, you can leave the room and come back to the object where you left it.
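Conceptually, an anchor pins an object to a fixed world-space position; what changes as you move is only the object’s position relative to the camera. The sketch below models that idea with a simple map of world positions. It is an invented illustration, not ARCore’s `Anchor` API.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only -- NOT the ARCore API. An anchor pins a virtual
// object to a fixed world-space position; as the phone's camera moves, the
// camera-relative offset changes but the world position does not.
public class AnchorStore {
    private final Map<String, double[]> anchors = new HashMap<>();

    // Anchor an object at a fixed world-space position (x, y, z in meters).
    public void anchor(String id, double x, double y, double z) {
        anchors.put(id, new double[]{x, y, z});
    }

    // The anchored world position is constant; only the offset from the
    // current camera position changes as you move around.
    public double[] relativeToCamera(String id, double camX, double camY, double camZ) {
        double[] p = anchors.get(id);
        return new double[]{p[0] - camX, p[1] - camY, p[2] - camZ};
    }

    public static void main(String[] args) {
        AnchorStore store = new AnchorStore();
        store.anchor("vase", 2.0, 0.0, 3.0);
        // Leave the room and come back from a new camera position: the
        // anchor's world position is unchanged, so the vase stays put.
        double[] offset = store.relativeToCamera("vase", 1.0, 0.0, 1.0);
        System.out.println(offset[0] + ", " + offset[2]); // 1.0, 2.0
    }
}
```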
Only A Preview
A preview release of Google’s ARCore SDK is available now, but it presently works on only three devices. If you have a Google Pixel, Google Pixel XL, or Samsung Galaxy S8, you can get started with the ARCore SDK today. Google said that when it rolls out the full ARCore platform, “a wide variety of qualified Android phones running N or later” would support it. If you have the hardware to get started already, you have several choices of development environment: Google’s Android Studio, Unity, and Epic’s Unreal Engine all support Google’s ARCore platform.
Google didn’t give any indication as to when the ARCore platform would be available on a wider range of devices, and a disclaimer found on Google’s AR developer resource page gives us the impression that the platform is far from mature, which suggests that we may be waiting for a while.
Note: ARCore is being offered as an early preview so that you can start experimenting with building new AR experiences. It's also an opportunity for you to give feedback on an early version of the API. This preview is the first step in a journey to enabling AR capabilities across the Android ecosystem.