
Social robot or digital avatar, users interact with this AI technology as if it's real

Credit: Unsplash/CC0 Public Domain

Humans are interacting more than ever with artificial intelligence (AI)—from the first "social robots" (robots with physical bodies programmed to interact and engage with humans), such as Kismet in the 1990s, to smart speakers such as Amazon's Alexa.

But this technology is changing how humans relate to it, and to each other.

Our new research looked at how humans experience interacting with digital avatars: AI virtual chatbots designed to look and interact like a human on a device, and to increase users' engagement with them.

Social robots such as ElliQ and Pepper are popular in Europe, Japan and the United States, particularly as aids for the elderly. New Zealand has been slower to adopt these technologies.

Since the pandemic, social robots and digital avatars have been used to address issues such as loneliness and mental health. In one Scottish experiment during the pandemic, people were introduced to the social robot "Pepper" over regular video chats. The researchers found the interactions lifted the mood of the participants.

Given the uncertainties around the long-term usage of these types of technologies, researchers and policymakers have a responsibility to question how these will affect humans, individually and in wider society.

Human responses to AI

Research has already established these types of technology are playing a greater role in human social relations, leading to changes in how people form connections and relationships.

Our research involved detailed interviews with 15 participants from New Zealand, Australia and Europe, coupled with broader data analysis. We found that when people interact with AI social robots or digital avatars, two things happen at the same time.

Firstly, users had physical reactions and feelings towards the AI technology. These responses were largely unconscious.

One user, for example, said they "unconsciously reached out, wanting to touch the [AI's] hair" on the screen. This was an instinctive response—the participant wanted to use their senses (such as touch) to engage with the digital avatar. Another participant unconsciously smiled in response to a smile from a social robot.

Secondly, users also derived meaning from their interaction with the AI technology through the use of shared language, concepts and non-verbal communication. For example, when one participant frowned, the digital avatar responded by getting "glassy eyes" as if it was upset by the participant's expression.

These shared non-verbal forms of communication allowed the participants to have meaningful interactions with the technology.

Participants also developed a level of trust in the AI social robot or digital avatar. When the conversation flowed, users would forget they were relating to a machine.

The more human the AI social robots and digital avatars looked, the more alive and believable they seemed. This resulted in participants forgetting they were engaging with technology because the technology felt "real."

As one participant said, "Even cynical people forget where they are and what they are doing. Somewhere between suspending disbelief that a system could have such a sophisticated conversation and enjoying the feeling of being in relationship with an 'other.'"

AI social robots and digital avatars are increasingly sharing the same spaces online and "in-person" with humans. And people are trying to physically interact with the technology as if it were human.

Another participant said, "I've got a bit of a spiritual connection (with the AI digital avatar) because I spent a lot of time with her."

In this way, the function of the technology has changed from being an aid in connecting humans to being the subject of affection itself.

Navigating the future of AI

While acknowledging the benefits of AI social technologies such as addressing loneliness and health issues, it is important to understand the broader implications of their use.

The COVID-19 pandemic showed how easily people were able to shift from in-person interactions to online communications. It is easy to imagine how this might change further, for example where humans become more comfortable developing relationships with AI social technology. There are already cases of people seeking romantic relationships with digital avatars.

The tendency of people to forget they are engaging with AI social technologies, and to feel as if those technologies are "real," raises concerns around unsustainable or unhealthy attachments.

As AI becomes more entrenched in everyday life, international organizations are acknowledging the need for guardrails to guide the development and use of AI. It is clear governments and policymakers need to understand and respond to the implications of AI social technologies for society.

The European Union's recently passed AI Act offers a way forward for other governments. The AI Act provides clear regulations and obligations regarding specific uses of AI.

It is important to recognize the unique characteristics of human relationships as something that should be protected. At the same time, we need to examine the probable impact of AI on how we engage and interact with others. By asking these questions we can better navigate the unknown.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

