UPDATED 10:30 EST / JUNE 30 2021

AI

Facebook makes its Habitat virtual world AI training environment more interactive

Facebook Inc.’s artificial intelligence research department today announced a major update to its AI Habitat simulator, a virtual environment that’s used to teach robots how to interact with the physical world.

The open-source Habitat simulator was first launched in 2019, giving AI researchers a better way to teach industrial robots how to interact safely and efficiently within the environment in which they’re designed to operate. Facebook’s AI team said at the time that it built the simulator because training robots virtually is far easier and more efficient than building physical training environments in the real world.

Habitat features various virtual environments, such as office spaces, two-story homes, warehouses and so on. Those virtual environments were created using an infrared depth capture system that records the exact shape of objects in the environment to make them as realistic as possible. Items such as books, chairs and windows can then be reconstructed faithfully within the virtual environment.

Habitat’s virtual environments can be used to train robots to perform complex, multistep tasks that require them not just to see but to understand their surroundings. So a domestic robot can be taught to handle a request such as “Check if my laptop is on the desk in the kitchen.”
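
As a rough sketch of what that looks like in code, the snippet below loads a scene and steps an agent through it using the open-source habitat-sim Python API. The scene path is a placeholder, and the exact attribute names can vary between releases.

```python
# Minimal sketch: load a scene in habitat-sim and step an agent around it.
# The scene path is a placeholder; attribute names follow the open-source
# habitat-sim Python API and may differ between versions.
import habitat_sim

sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "data/scene_datasets/example_apartment.glb"  # placeholder path

# Give the agent a single RGB camera so it can "see" its surroundings.
rgb_sensor = habitat_sim.CameraSensorSpec()
rgb_sensor.uuid = "rgb"
rgb_sensor.sensor_type = habitat_sim.SensorType.COLOR
rgb_sensor.resolution = [480, 640]
rgb_sensor.position = [0.0, 1.5, 0.0]  # roughly head height

agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_sensor]

sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Step the agent with discrete actions; a trained policy would choose these
# actions itself while searching for, say, the laptop on the kitchen desk.
for action in ["move_forward", "turn_left", "move_forward"]:
    observations = sim.step(action)
    print(action, observations["rgb"].shape)

sim.close()
```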

The initial release of Habitat did have some big limitations, though. The biggest one was that its training environments were not interactive, so it wasn’t possible to teach the same robot to actually fetch that laptop once it found it.

Facebook AI research scientist Dhruv Batra said in a blog post today that the Habitat 2.0 release addresses this limitation, adding the ability for robots to interact with objects in Habitat’s virtual environments, just as they would in a real-world setting.

“Habitat 2.0 builds on our original open-source release of AI Habitat with even faster speeds as well as interactivity, so AI agents can easily perform the equivalent of many years of real-world actions — a billion or more frames of experience — such as picking items up, opening and closing drawers and doors, and much more,” Batra wrote.

Key to enabling interactivity with objects is the new ReplicaCAD data set, a rebuilt version of the Replica data set created by Facebook Reality Labs that supports the movement and manipulation of objects, Batra explained. With ReplicaCAD, previously static 3D scans of objects have been converted into much more detailed 3D models, including physical parameters, collision proxy shapes and semantic annotations, with full attention paid to details such as the objects’ material composition, geometry and texture.

“The interactive recreations also incorporated information about size and friction, whether an object (such as a refrigerator or door) has compartments that could open or close, and how those mechanisms worked, among other considerations,” Batra added.

So the new, incredibly detailed virtual environments now support training virtual robots that can perform much more complex and useful tasks, such as taking food items from a bag to restock a fridge, loading a dishwasher with dirty dishes strewn across a dining room table, and fetching objects on command and then returning them later.
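
As a rough illustration of what that physics layer looks like in practice, the sketch below adds a single rigid object to a physics-enabled scene and lets it settle under gravity. The object-manager calls are based on the open-source habitat-sim API, and the asset paths are placeholders rather than actual ReplicaCAD file names.

```python
# Rough sketch: instantiate a rigid object in a physics-enabled scene and let
# it settle under gravity. Paths are placeholders, not real ReplicaCAD assets;
# the object-manager API follows the open-source habitat-sim project.
import habitat_sim
import magnum as mn

sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "data/scene_datasets/example_apartment.glb"  # placeholder path
sim_cfg.enable_physics = True  # required for rigid-body interaction

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Object templates bundle a 3D model with physical parameters such as mass,
# friction and a collision proxy shape.
obj_template_mgr = sim.get_object_template_manager()
template_ids = obj_template_mgr.load_configs("data/objects/example_objects")  # placeholder

# Instantiate one object above a surface and step the physics simulation.
rigid_obj_mgr = sim.get_rigid_object_manager()
obj = rigid_obj_mgr.add_object_by_template_id(template_ids[0])
obj.translation = mn.Vector3(1.0, 1.5, 0.5)

for _ in range(60):
    sim.step_physics(1.0 / 60.0)  # advance the world at 60 Hz

print("object settled at", obj.translation)
sim.close()
```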

Facebook is planning to share the ReplicaCAD data set under a Creative Commons license for noncommercial use with attribution.

The other big improvement seen in Habitat 2.0 has to do with its speed. Batra explained that his team prioritized speed and performance over a wider range of capabilities, because doing so would enable AI researchers to test more new approaches and iterate on them more effectively.

“For instance, rather than simulating wheel-ground contact, we use a navigation mesh to move the robot,” Batra wrote. “The platform also does not currently support non-rigid dynamics such as deformables, liquids, films, cloths, and ropes, as well as audio or tactile sensing. This streamlined focus makes the Habitat 2.0 simulator two orders of magnitude faster than most 3D simulators available to academics and industry professionals.”
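
The navigation-mesh style of movement Batra describes is visible in habitat-sim’s pathfinding interface. Below is a small sketch that queries the navmesh for a collision-free shortest path between two points; it assumes a simulator instance set up as in the earlier sketches, and the API names follow the open-source habitat-sim project.

```python
# Sketch of navmesh-based movement: rather than simulating wheel-ground
# contact, the agent is moved along waypoints on a precomputed navigation
# mesh. Assumes `sim` is a habitat_sim.Simulator created as in the earlier
# sketches, loading a scene that includes a navmesh.
import habitat_sim

start = sim.pathfinder.get_random_navigable_point()
goal = sim.pathfinder.get_random_navigable_point()

# Ask the navmesh for a collision-free shortest path between the two points.
path = habitat_sim.ShortestPath()
path.requested_start = start
path.requested_end = goal

if sim.pathfinder.find_path(path):
    print(f"geodesic distance: {path.geodesic_distance:.2f} m")
    for waypoint in path.points:
        print(waypoint)  # the agent would be translated along these waypoints
```

Moving the robot’s base kinematically along those waypoints, rather than simulating wheel-ground contact, is part of how Habitat 2.0 reaches the speeds Batra cites.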

In addition to the new capabilities of Habitat 2.0, Facebook has also worked closely with Matterport Inc., a creator of immersive 3D virtual spaces, to create a new data set of virtual environments for AI researchers to work with. The Habitat-Matterport 3D Research Dataset, or HM3D, is a collection of 1,000 open-source and Habitat-compatible 3D scans of various spaces, including dozens of apartments, multifamily and single-family homes, plus commercial spaces such as offices, warehouses and retail stores.

Batra said he believes the HM3D dataset will play a big part in advancing what he calls “embodied AI.” That refers to robots that can be trained to navigate rooms, understand their surroundings and interact with objects within the environment.

Constellation Research Inc. analyst Holger Mueller told SiliconANGLE that he was encouraged to see the new capabilities of Habitat 2.0 as a framework, working in concert with ReplicaCAD and the interactivity it enables.

“AI and machine learning in real-world scenarios that include physical activity is still very much at the vanguard of practical AI applications,” Mueller said. “But the potential upside is massive as it will fundamentally change the way people live, similar to how electricity gave us refrigerators and washing machines. It’s nice to see the partnership between Facebook and Matterport too, as that will ensure there are plenty of environments available for researchers to test their virtual robots in.”

Looking ahead, Batra promised that Facebook’s AI researchers will keep working to make Habitat even more useful. One of the next areas of focus is modeling living spaces that are more representative of different parts of the world, taking into account cultural- and region-specific differences in things such as furniture layout and the types of furniture and objects found in homes.

“We acknowledge these representational challenges and are working to improve the diversity and geographic inclusion of the 3D environments currently available for research,” Batra said.

The Habitat 2.0 simulator is open-sourced under the MIT license and available now.

Image: Facebook
