From remote tourism to the metaverse, a new robotic avatar made in Italy
Feeling and moving in a place without being there is the main goal of the new iCub robot advanced telexistence system, also called the iCub3 avatar system, developed by researchers at IIT-Istituto Italiano di Tecnologia (Italian Institute of Technology) in Genova, Italy. The system was tested in a live demonstration connecting a human operator based at IIT in Genova with a new version of the humanoid robot, the iCub3, visiting the Italian Pavilion at the 17th International Architecture Exhibition—La Biennale di Venezia; the two sites are 300 km apart and communicated over a standard optical-fiber connection. The researchers demonstrated that the system transfers the operator's locomotion, manipulation, voice and facial expressions to the robotic avatar, while returning visual, auditory, haptic and touch feedback. This is the first test of a legged humanoid robot used for remote tourism, conveying the on-site experience to a remote human operator. The system is a prototype and may be further developed for other scenarios, including disaster response, healthcare and metaverse applications.
This result was obtained by the research team coordinated by Daniele Pucci, principal investigator of the Artificial and Mechanical Intelligence (AMI) Lab at IIT in Genova. One of the team's research goals is to develop humanoid robots that play the role of avatars: robotic bodies that act on behalf of humans without replacing them, allowing people to be present where they otherwise could not.
"We believe that this research direction has tremendous potential in many fields," explains Daniele Pucci. "On the one hand, the recent pandemic taught us that advanced telepresence systems might become necessary very quickly across different fields, like healthcare and logistics. On the other hand, avatars may allow people with severe physical disabilities to work and accomplish tasks in the real world via the robotic body. This may be an evolution of rehabilitation and prosthetics technologies."
The system integration was possible thanks to previously developed IIT technologies. The robot iCub3 is currently being developed at IIT and represents a new version of the iCub robot; the wearable technologies, named iFeel, were developed in the EU-funded project AnDy in collaboration with the National Institute for Insurance against Accidents at Work (INAIL).
An advanced software architecture, designed by the IIT researchers, controls and manages the interconnection between the iCub3 robot and the iFeel system. The same infrastructure also integrates the commercial wearable technologies that complete the iCub3 avatar system: for instance, the remote user walks in place on a virtual-reality locomotion platform that leaves the upper body free to move.
The iCub3 robot is 25 cm taller than previous iCub versions, measuring 1.25 m, which makes it a more suitable platform for interacting in human environments. Its balance and locomotion are more robust and better able to emulate human movements and physical interaction. The robot is accordingly bigger (weighing 52 kg vs 33 kg) and has more powerful leg motors for faster locomotion. The iCub3 also differs from the previous platform in its actuation mechanics, which is no longer based on cable-driven joints. On the sensing side, it adds a depth camera and latest-generation force sensors that withstand the robot's higher weight. Lastly, iCub3 has a higher-capacity battery, located within the torso assembly rather than in a rigidly attached backpack.
In the demonstration, which connected Genova and Venice in both directions, the IIT wearable iFeel suit tracks the operator's body motions, and the avatar system transfers them onto the iCub3 in Venice, which then moves as the user does in Genova. The user also wears a headset that tracks facial expressions, eyelid and eye motions; these head features are projected onto the avatar, which reproduces them with high fidelity, so that avatar and human share similar facial expressions. The user wears sensorized gloves that track hand motions while also providing haptic feedback.
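The round trip described above (track the operator's state, stream it to the avatar, return touch feedback) can be sketched as a single cycle. Everything below is illustrative: the message format, field names and values are assumptions for the sake of the example, not IIT's actual protocol or software.

```python
import json

# Hypothetical sketch of one teleoperation cycle: wearable sensors on the
# operator produce joint readings and an expression label, which are
# serialized, sent over the network, and replayed on the avatar; contact
# readings travel the other way as haptic feedback.

def encode_operator_state(joints, expression):
    """Pack tracked joint angles (radians) and a facial-expression label."""
    return json.dumps({"joints": joints, "expression": expression})

def apply_to_avatar(message):
    """Decode an operator message into avatar joint targets and expression."""
    state = json.loads(message)
    return state["joints"], state["expression"]

def encode_haptic_feedback(contact_forces):
    """Pack contact forces (N) measured on the avatar's skin."""
    return json.dumps({"forces": contact_forces})

# One cycle: operator -> avatar, then avatar -> operator.
msg = encode_operator_state({"l_shoulder": 0.4, "r_elbow": 1.1}, "smile")
targets, face = apply_to_avatar(msg)
feedback = json.loads(encode_haptic_feedback({"torso": 12.5}))
print(targets, face, feedback["forces"])
```

In the real system this loop runs continuously and carries far richer state (full-body kinematics, gaze, voice, tactile arrays), but the structure, namely symmetric encode/decode over a network link, is the same idea.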
Thanks to the avatar system, the remote user can smile, talk and shake hands with the guide in Venice. Analogously, when the guide hugs the avatar in Venice, the operator in Genova feels the hug, thanks to IIT's iFeel suit, which also provides upper-body haptics. The conversation between the remote user in Genova and the guide in Venice is made possible by systems that record and transmit the operator's voice, which the avatar reproduces in Venice.
The transmission was streamed over a standard optical-fiber internet connection, with only a few milliseconds of delay.
"Our iCub3 avatar system is validated on a legged humanoid robot allowing remote verbal, non-verbal and physical interaction, which makes it an ideal starting point when looking for platforms that emulate humans in all aspects of interaction," says Daniele Pucci. "What I also see in our near future is the application of this system to the so-called metaverse, which is actually based on immersive and remote human avatars."
The director-general of contemporary creativity of the Ministry of Culture and Commissioner of the Italian Pavilion, Onofrio Cutaia, said, "With great pleasure, we received the proposal to collaborate with the Italian Institute of Technology in Genova on this project. We really would like to thank Arch. Alessandro Melis, curator of the Italian Pavilion at the 17th International Architecture Exhibition—La Biennale di Venezia, for sharing our enthusiasm and interacting with iCub3 in a surprising dialog between man and robot. It is a unique opportunity to promote contemporary cultural heritage through new forms of communication. We firmly believe that interdisciplinarity and interaction between languages are the real challenge to be faced, and this is what the Directorate-General will focus on in the coming years."