New program takes us one step closer to autonomous robots

Feasibility verification for push door with recoil behaviors. Credit: Science Robotics (2023). DOI: 10.1126/scirobotics.adg5014

We've watched the remarkable evolution of robotics over the past decade: models that can walk, talk and gesture like humans; undertake tasks from moving heavy machinery to delicately manipulating tiny objects; and keep their balance on two or four legs over rough and hostile terrain.

As impressive as the latest robots are, their accomplishments are largely the result of task-specific programming or remote instruction from humans.

Researchers at ETH Zurich have developed a program that helps robots tackle activities that do not rely on "prerecorded expert demonstrations," as the developers put it, or "densely engineered rewards."

Instead, they designed an approach in which the robot can "rapidly discover a feasible and near optimal multi-modal sequence that solves the task." In other words, they provide an environment in which robots can achieve objectives with minimal guidance from human operators.

The research was reported in the Aug. 16 edition of Science Robotics. The paper, "Versatile multicontact planning and control for legged loco-manipulation," was prepared by Jean-Pierre Sleiman, Farbod Farshidian and Marco Hutter of the Robotic Systems Lab at ETH Zurich, the Swiss public research university.

"Given high-level descriptions of the robot and object, along with a task specification encoded through a sparse objective," Sleiman said, "our planner holistically discovers how the robot should move, what forces it should exert, what limbs it should use, as well as when and where it should establish or break contact with the object."


Demonstration videos show ANYbotics' quadrupedal ANYmal robot opening a dishwasher door, and deftly pushing open a weighted door and holding it open with a leg while maneuvering through.

"The framework can be readily adapted to different kinds of mobile manipulators," Sleiman said.

The last several years have seen great strides in robotic development. Boston Dynamics, a leading player in the field, introduced Atlas in 2013; with stereo vision and fine motor abilities, it could keep its balance in hostile environments, and it was eventually upgraded to get in and out of vehicles, open doors and handle power equipment. In 2016, Agility Robotics' Cassie demonstrated advanced walking and running capabilities.

In 2017, the lifelike Sophia, which smoothly mimicked human gestures and behavior, was dispatched to assist the elderly in nursing facilities and play with children. And highly advanced tactile manipulation arrived in 2019 with OpenAI's Dactyl: after training that its developers estimated would take a human 13,000 years to complete, the one-handed Dactyl could manipulate and solve a Rubik's cube in just four minutes, a 3D combination puzzle that has stymied millions of users since its invention in 1974.

Planning and control architecture for multicontact loco-manipulation. Credit: Science Robotics (2023). DOI: 10.1126/scirobotics.adg5014

More recently, Boston Dynamics' four-legged Spot has been able to walk three miles, climb hills, conquer obstacles and perform specialized tasks. And Ameca, considered one of the most lifelike robots, if not the most, engages in smooth conversation and generates facial expressions and hand gestures that are remarkably humanlike.

Now ETH Zurich has taken a key step toward the next stage of development: building on the grand accomplishments of these predecessors while eliminating, or at least greatly reducing, the need for humans to control robots from behind the scenes.

More information: Jean-Pierre Sleiman et al, Versatile multicontact planning and control for legged loco-manipulation, Science Robotics (2023). DOI: 10.1126/scirobotics.adg5014

