UPDATED 14:44 EDT / JUNE 25 2020

Google launches ARCore Depth API to enable more powerful AR apps on Android

Google LLC has released a new application programming interface for Android that will enable developers to create more realistic-looking and functional augmented reality applications.

The ARCore Depth API, launched today, will be available on hundreds of millions of Android devices worldwide. A number of Google partners, including Samsung Electronics Co. Ltd. and Snap Inc., have announced plans to use it.

AR apps use a phone’s rear camera to capture a real-world environment, say, the user’s living room, and insert virtual content into that environment. On Android, such apps were previously limited to displaying virtual objects in a mostly two-dimensional way: an interior design app could overlay a coffee table on the user’s living room but not embed the table directly into the scene.

The Depth API changes that. The same interior design app can now virtually place the table behind another piece of furniture, or display a painting around a corner. This depth awareness makes AR content appear more realistic while also enabling developers to create more true-to-life interactions between virtual and real-world objects.
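In the public ARCore Android SDK, depth is not on by default; an app opts in per session. The Kotlin sketch below shows roughly how that check and opt-in might look, assuming ARCore 1.18 or later on a supported device; exact names and behavior may differ between SDK versions.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Sketch: opt an ARCore session into automatic depth, which the Depth API
// needs before an app can occlude virtual objects behind real ones.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    config.depthMode =
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            Config.DepthMode.AUTOMATIC   // depth-from-motion on supported phones
        } else {
            Config.DepthMode.DISABLED    // fall back gracefully on older devices
        }
    session.configure(config)
}
```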

The Depth API has potentially broad applications in AR projects. Samsung plans to incorporate the API into a future version of its Quick Measure app that will enable users to point their camera at an object such as a box to view its physical dimensions. Snap, meanwhile, is working on a set of new depth-aware Lenses, a type of AR filter that the social media company uses to monetize its platform.

Video games are another use case the ARCore Depth API should help Google address. Mobile games are expected to become a $77.2 billion market in 2020, and a growing number of titles incorporate AR components, a trend Google should now be better positioned to accommodate.

Under the hood, the ARCore Depth API provides apps with depth data using a technique called depth-from-motion. It analyzes footage captured by the user’s smartphone camera from different angles to estimate the distance to each point in a given scene.
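As a rough illustration of what that looks like to a developer, the sketch below reads the estimated distance at one pixel of the depth image ARCore computes each frame. It assumes depth mode is already enabled on the session and that the image uses Android’s DEPTH16 layout, in which each sample is a 16-bit distance in millimeters; a real app would also map screen coordinates into the depth image’s own resolution.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Sketch: return the estimated distance in millimeters at pixel (x, y) of the
// current frame's depth image, or null if depth isn't ready yet.
fun depthAtPixelMm(frame: Frame, x: Int, y: Int): Int? {
    val depthImage = try {
        frame.acquireDepthImage()  // DEPTH16 image produced by depth-from-motion
    } catch (e: NotYetAvailableException) {
        return null  // depth needs a few frames of camera movement before it exists
    }
    try {
        val plane = depthImage.planes[0]
        val byteIndex = x * plane.pixelStride + y * plane.rowStride
        val buffer = plane.buffer.order(ByteOrder.nativeOrder())
        // Each DEPTH16 sample is an unsigned 16-bit distance in millimeters.
        return buffer.getShort(byteIndex).toInt() and 0xFFFF
    } finally {
        depthImage.close()  // release the image promptly so ARCore can reuse it
    }
}
```

Comparing that per-pixel distance with a virtual object’s distance from the camera is what lets an app decide, pixel by pixel, whether real-world surfaces should hide the virtual content.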

“In addition to gaming and self-expression, depth can also be used to unlock new utility use cases,” Rajat Paharia, Google’s AR platform product lead, wrote in a blog post. “For example, the TeamViewer Pilot app, a remote assistance solution that enables AR annotations on video calls, uses depth to better understand the environment so experts around the world can more precisely apply real time 3D AR annotations for remote support and maintenance.”

The timing of the launch is notable. At its virtual WWDC event this week, Apple Inc. introduced an upgraded version of ARKit, its AR development kit for iOS, which features a new Depth API as well as other enhancements such as motion capture. The ARCore Depth API may help Google better compete with its chief mobile rival in this area.

AR also plays a part in Google’s education business. The company offers mixed reality tools such as Expeditions specifically for educational institutions, and any enhancements to Android’s AR capabilities can ultimately help improve those offerings.

