June 16, 2019 weblog
Facebook research focuses on lifelike environments for AI-powered assistants
Virtual robots have moved up to an elite platform dedicated to stepping up their game. The platform is dubbed AI Habitat.
Credit goes to researchers from Facebook, who recognize that agents need more lifelike environments if they are to navigate well from bedrooms to hallways, through museum corridors, and around shops.
Tech watchers, especially those covering the vast topic of artificial intelligence, have been following the progress of something called AI Habitat, an open platform for embodied AI research.
In the Facebook AI blog, the authors said the platform is part of Facebook AI's effort to create systems "less reliant on large annotated data sets used for supervised training. As more researchers adopt the platform, we can collectively develop embodied AI techniques more quickly, as well as realize the larger benefits of replacing yesterday's training data sets with active environments that better reflect the world we're preparing machine assistants to operate in."
Will Knight in MIT Technology Review said that "Whereas other simulation engines run at around 50 to 100 frames per second, Facebook says AI Habitat runs at over 10,000 frames per second, which makes it possible to test AI agents rapidly."
Its purpose is to enable the training of embodied AI agents (virtual robots) in a photorealistic 3D simulator before transferring the learned skills to reality.
But wait, step back a bit and pause. Do we really understand what they mean by the phrase "embodied AI"? A fuller explanation in the Facebook AI blog comes from two research scientists and two research engineers, who clarify what they set out to do with Habitat.
From a robot asked to grab a phone from the desk upstairs to a device that helps the visually impaired navigate an unfamiliar subway system, the next generation of AI-powered assistants will need to demonstrate a range of abilities. Many researchers believe the most effective way to develop these skills is to focus on embodied AI, which uses interactive environments to ground systems training in the real world, rather than relying on static data sets.
This Habitat team also talked about the Habitat Challenge. In the Challenge, uploaded agents are evaluated in unseen environments to test for generalization.
"Unlike traditional challenges where people upload predictions based on a task related to a given benchmark such as ImageNet or VQA," according to a blog post, "this one required participants to upload code. The code was run on new environments that their agents had not seen before."
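That evaluate-submitted-code-on-unseen-environments setup can be sketched in miniature. Everything below (the corridor episodes, the `submitted_policy` function, and the success metric) is a hypothetical illustration of the idea, not the challenge's actual harness:

```python
import random

def make_unseen_episode(seed):
    # Each held-out episode: a random start cell and goal cell on a 1-D corridor.
    rng = random.Random(seed)
    return rng.randint(0, 9), rng.randint(0, 9)

def submitted_policy(pos, goal):
    # Stand-in for the "uploaded code": a simple hand-written navigation policy.
    return 1 if pos < goal else -1

def run_episode(start, goal, max_steps=20):
    # The harness, not the participant, controls the environment and step budget.
    pos = start
    for _ in range(max_steps):
        if pos == goal:
            return True
        pos += submitted_policy(pos, goal)
    return pos == goal

# Score the policy on episodes its author never saw, as the challenge does.
results = [run_episode(*make_unseen_episode(seed)) for seed in range(100)]
success_rate = sum(results) / len(results)
print(success_rate)  # 1.0 for this trivially correct policy
```

The point of the design is that a submission cannot overfit to memorized scenes: it only ever receives observations from environments generated after the code is handed over.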
The Habitat-API is described on GitHub as a "modular high-level library to train embodied AI agents across a variety of tasks, environments, and simulators."
Its companion simulator, Habitat-Sim, is described as a 3D simulator "with configurable agents, multiple sensors, and generic 3D dataset handling."
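The library's role can be illustrated with the reset/step interaction loop that embodied-AI simulators typically expose to agents. The `GridEnv` class and the greedy policy below are hypothetical stand-ins for illustration, not Habitat's actual API:

```python
class GridEnv:
    """Hypothetical stand-in for a 3D simulator: a 5x5 grid with a goal cell."""

    def __init__(self, size=5, max_steps=50):
        self.size, self.max_steps = size, max_steps

    def reset(self):
        self.pos, self.goal, self.steps = (0, 0), (self.size - 1, self.size - 1), 0
        return self._observe()

    def _observe(self):
        # Observation: agent and goal positions (a real simulator would
        # instead return RGB-D frames and other sensor readings).
        return {"position": self.pos, "goal": self.goal}

    @property
    def episode_over(self):
        return self.pos == self.goal or self.steps >= self.max_steps

    def step(self, action):
        dx, dy = {"up": (0, -1), "down": (0, 1),
                  "left": (-1, 0), "right": (1, 0)}[action]
        x, y = self.pos
        # Clamp movement to the grid, then advance the step counter.
        self.pos = (min(max(x + dx, 0), self.size - 1),
                    min(max(y + dy, 0), self.size - 1))
        self.steps += 1
        return self._observe()

def greedy_agent(obs):
    # Trivial policy: move toward the goal one axis at a time.
    (x, y), (gx, gy) = obs["position"], obs["goal"]
    if x < gx:
        return "right"
    if x > gx:
        return "left"
    return "down" if y < gy else "up"

env = GridEnv()
obs = env.reset()
while not env.episode_over:
    obs = env.step(greedy_agent(obs))
print(obs["position"])  # → (4, 4)
```

In a real setup the hand-written policy would be replaced by a learned one, and the environment by a photorealistic scene; the reset/step loop itself stays the same.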
Amrita Khalid of Engadget was struck by the photo-realistic 3D simulation of a living room, with details sharp down to the velour-texture throw on the sofa and the reflective wall mirror: "Replica simulation of a living room is meant to capture all the subtle details one might find in a real living room."
Khalid reported that Facebook Reality Labs released the dataset of photorealistic sample spaces dubbed Replica. Some researchers have already taken Replica and AI Habitat for a test-drive, said Khalid. "Facebook AI recently hosted an autonomous navigation challenge on the platform."
Replica is a research project created by Facebook Reality Labs. Replica is described as a photo-realistic re-creation of 18 sample spaces, such as an office conference room and a two-story home, set up by researchers. Replica can be loaded up in AI Habitat. "By training an AI bot to respond to a command like 'bring my keys' in a Replica 3D simulation of a living room, researchers hope someday it can do the same with physical robots in a real-life living room."
As Will Knight said in MIT Technology Review, the desired outcome is "so that their AI algorithms can learn how the real world works." In theory, this could make robots and chatbots smarter.
Facebook's team has sound reasons for why all this matters: training these virtual robots in virtual spaces empowers a shift from "internet AI" based on static datasets to "embodied AI where agents act within realistic environments, bringing to the fore active perception, long-term planning, learning from interaction, and holding a dialog grounded in an environment."
One area that could benefit from these research efforts might be domestic robots that adapt to new homes and personalized tasks without being retrained.
Knight made a point about AI's bigger picture: "A lack of common sense is a glaring problem for today's AI systems. Unlike a person, a chatbot or robot cannot rely on an understanding of the world—things like physics, logic, and social norms—to figure out the intent of an ambiguous command."
These virtual spaces can be loaded into the new AI Habitat, inside which AI programs can explore and learn. The algorithms will first be trained to recognize objects in different settings. But over time, Knight said, they should build some common-sense understanding about the conventions of the physical world—like the fact that tables typically support other objects.
ai.facebook.com/blog/open-sour … embodied-ai-research
© 2019 Science X Network