HuggieBot 2.0 hugging a user. Haptic sizing enables the robot to embrace the user securely and safely, and redundant sensors let it know when the user is ready for the hug to end. Credit: Block et al.

Researchers at the Max Planck Institute for Intelligent Systems (MPI-IS) and ETH Zürich have recently created HuggieBot 2.0, a robot that can hug users at their request. This robot, set to be presented at the ACM/IEEE International Conference on Human-Robot Interaction (HRI) in March, builds on a previous robotic system created by Alexis E. Block, one of the authors, during her Master's degree.

"HuggieBot is a research project that I first worked on with Katherine J. Kuchenbecker in the fall of 2016, as part of my master's thesis in robotics at the University of Pennsylvania," Block said. "Both of us had family members who were far away whom we wanted to hug and this was the main inspiration behind the project."

The previous version of the robot, dubbed HuggieBot 1.0, was based on the Personal Robot 2 (PR2), a robotic platform created by Willow Garage. The PR2's hardware and software were customized so that it could easily embrace users.

In July 2017, Block started her Ph.D. studies with the MPI ETH Center for Learning Systems, with Kuchenbecker as her primary supervisor at MPI-IS in Germany, and Otmar Hilliges and Roger Gassert as additional co-supervisors at ETH Zürich in Switzerland. Together with her supervisors, she decided to continue the HuggieBot project and create a completely new robot for hugging.

"After I completed my master's and created the first HuggieBot, Katherine and I were still interested in the topic of a hugging robot and felt there was more to discover," Block said. "Feedback from users taught us that the PR2 was too bulky to be a really good hugging robot, and it also couldn't feel the user well because it had only one small touch sensor on its back."

Before they started working on HuggieBot 2.0, the researchers reviewed previous robotics literature and examined similar robots created in the past. They then worked on a new design for HuggieBot, also involving Sammy Christen, a Ph.D. student in Hilliges' lab at ETH Zürich, who worked specifically on the robot's computer vision capabilities.

"Our new robotic platform was built according to our six design tenets, or 'commandments' for natural and enjoyable robotic hugging," Block said. "Namely, we felt that a hugging robot should be soft, be warm, be human sized, visually perceive its user, adjust its embrace to the user's size and position and reliably release when the user wants to end the hug. By following these commandments, HuggieBot 2.0 gives excellent hugs."

To build HuggieBot 2.0, Block and her colleagues mounted two Kinova JACO arms (i.e., commercially available robotic arms that are typically attached to wheelchairs) on a custom metal frame. Unlike its predecessor, this robot has a soft, inflatable torso that can sense a user's contact regardless of where they place their hands. The robot's body is covered with heating pads, a purple robe and a gray sweatshirt, while its hands are covered in padded mittens.

HuggieBot 2.0's head, which was created using 3-D printing, consists of a computer, a screen that serves as its face, a depth-sensing camera, a speaker and a micro-controller. The screen shows different animated facial expressions on a purple background, which give a sense that the robot is smiling and blinking.

"As part our recent study, we tested the influence of three different binary factors: visual hug initiation, haptic hug sizing, and haptic hug release," Block said. "For visual hug initiation at the time of the study conducted in the paper, the robot's camera would detect a user in its field of vision. Once the camera sensed the user was walking towards it, the robot would lift its arms and ask the user 'can I have a hug, please?" For haptic hug sizing, we model hugging as a form of grasping."

HuggieBot 2.0's JACO arms have torque sensors at every joint. Using a torque-thresholding grasping method, the researchers were able to make the robot's grasp more adaptive and secure, ensuring that it matches the bodies of individual users and does not hold them too tightly or too loosely.
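A minimal sketch of such torque-thresholded closure might look like the following; read_joint_torques and command_joint_step are hypothetical stand-ins for the arm interfaces, and the torque limit is an illustrative value rather than one reported in the paper.

```python
# Sketch of torque-thresholded hug closure (assumed interfaces and values).
import numpy as np

TORQUE_LIMIT_NM = 1.5   # assumed per-joint contact threshold, in newton-metres

def close_arms_until_contact(read_joint_torques, command_joint_step, max_steps=200):
    """Close both arms in small increments, stopping each joint once its
    measured torque indicates it is resting against the user's body."""
    for _ in range(max_steps):
        torques = np.abs(np.asarray(read_joint_torques()))   # one reading per joint
        still_free = torques < TORQUE_LIMIT_NM                # joints not yet in contact
        if not still_free.any():
            break                                             # every joint rests on the user
        command_joint_step(still_free)                        # advance only the free joints
```

Because each joint stops independently when it meets resistance, a routine like this yields a snug embrace for users of different sizes without squeezing anyone too hard.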

"For haptic release, we used two different methods," Block explained. "First, we used the torque sensors on the robot's arms to detect when a user wishes to leave the embrace. Then we used the inflatable sensing torso to detect when a user has removed his/her arms from the robot's back, thus indicating their desire to end the hug. These features make HuggieBot 2.0 a more natural and intuitive hugging robot."

HuggieBot 2.0 waiting for a hug. It wears heating pads, a purple robe, and a grey sweatshirt over its novel inflatable sensing torso so that its hugs are warm, soft and responsive. Credit: Block et al.

Block and her colleagues evaluated HuggieBot 2.0 in two different studies. In the first, 117 people were shown videos and images of HuggieBot 1.0 and HuggieBot 2.0 and then asked to share their feedback and opinions. The majority of participants preferred the newer robot, both in terms of its appearance and its movements.

Subsequently, the researchers asked 32 people to test the robot in person and share their feedback. They specifically asked participants how well the robot's arms adapted to their body (i.e., haptic sizing), how naturally the robot initiated a hug and how effectively it released them from the embrace.

Overall, participants felt that haptic sizing (i.e., how the robot's embrace adapted to their body) made the robot seem more natural in its movements, more intelligent and friendlier. This 'hug adaptability' feature appeared to enable more enjoyable interactions between HuggieBot 2.0 and its users.

"Users provided many positive comments about visual hug initiation, haptic sizing and haptic release," Block said. "In addition to validating our custom robot platform, this paper validates our six design tenets. These findings show that we are moving in the right direction for creating more natural and enjoyable robot hugs, but there is still some room for improvement."

The feedback that Block and her colleagues collected in their user studies helped them refine the robot further, ultimately leading to a newer version of the system, HuggieBot 3.0. This robot will be presented in a paper that is currently under peer review.

"In addition to showcasing hardware and software improvements, our new paper about HuggieBot 3.0 centers on enabling the robot to detect, classify and respond to intra-hug gestures like rubs, pats and squeezes," Block said. "Being squeezed by a hugging robot is surprisingly enjoyable!"

To further explore the potential of the system they developed, the researchers are currently designing a new experiment aimed at assessing the physiological effects of receiving hugs from HuggieBot. While physical contact with other humans or even animals is known to have several health benefits (e.g., lowering blood pressure and cortisol levels, alleviating stress and anxiety, strengthening social bonds and the immune system, and increasing oxytocin levels), the effects of physical contact with robots are still poorly understood. Block and her colleagues would ultimately like to find out whether a robot's hug can alleviate stress and improve physical wellbeing as much as hugs from humans or animals do.

"Evidence of physiological benefits would complement and help explain all the positive comments and ratings HuggieBot has gotten from users while also illustrating that interacting with robots in this way could improve human health," Block said. "We think soft, warm, responsive robot hugs could help support many people who don't regularly receive hugs from other humans."

In addition to evaluating the physiological effects of their robot's hugs on users, the researchers are developing HuggieApp, an application that would allow users to remotely send each other customized hugs via the HuggieBot robot. Through the app, users could replace the animated faces on the robot's integrated screen with customized videos sent by their loved ones.

"On the HuggieApp, hug senders can also determine the duration of a hug and which intra-hug gestures the robot should perform and at what time," Block added. "After they receive a notification saying that they've been sent a hug through the application, users will just need to approach their HuggieBot and scan a QR code to redeem their customized hug. We also hope to be able to run a study to test whether HuggieBot could help strengthen personal relationships between people who are physically separated by a distance."

Although HuggieBot 2.0 and HuggieBot 3.0 are still prototypes, Block hopes eventually to commercialize them. Before these systems can be implemented on a large scale and become widely available, however, the researchers will need to perfect the quality of their hugs further and ascertain their overall safety and reliability.

"I see the next step towards commercialization (after improving the robot) as targeting larger-scale institutions, where many people could benefit from a single HuggieBot," she said. "Such places might be universities, hospitals, or nursing homes. While our studies have shown interest exists for a hugging robot on its own, I believe the customizable app will be a necessary component of any commercialized hugging robot."

More information: The Six Hug Commandments: Design and Evaluation of a Human-Sized Hugging Robot with Visual and Haptic Perception. arXiv:2101.07679 [cs.RO]. arxiv.org/abs/2101.07679