
Advice from artificial intelligence (AI) experts may be just as influential as from human experts, according to a team of Penn State researchers. However, both human and robotic bearers of bad news may find that they lose influence when their negative opinions run contrary to a positive crowd.

In a study, researchers found that machines that generate recommendations—or AI experts—were as influential as human experts when the AI experts recommended which photo a user should add to their online business profile. However, both AI and human experts failed to budge opinions if their feedback was negative and went against popular opinion among other users, said S. Shyam Sundar, James P. Jimirro Professor of Media Effects in the Donald P. Bellisario College of Communications and co-director of the Media Effects Research Laboratory.

Sundar, who is also an affiliate of Penn State's Institute for Computational and Data Sciences (ICDS), said the findings may show that there are times when the opinion of the crowd—also called the bandwagon effect—can beat out the opinions of experts, whether they are AI or human. He added that both AI-powered and human experts who offered a positive evaluation of a business profile picture were able to influence users' own assessment of the photo. However, if experts did not like the photograph and the crowd offered a positive evaluation of it, the experts' influence waned.

Because people are increasingly using social media to look for feedback, cues that suggest expert opinions and the bandwagon effect may be important factors in influencing decisions, according to Jinping Wang, a doctoral candidate in mass communication and first author of the study.

"Nowadays, we often turn to for opinions from other people—like our peers and experts—before making a decision," said Wang. "For example, we may turn to those sources when we want to know what movies to watch, or what photos to upload to social media platforms."

AI experts are often less expensive than human experts and they can also work 24 hours a day, which, Wang suggests, might make them appealing to online businesses.

The researchers also found that the AI's group status—designated in this case by national origin—did not seem to affect a person's acceptance of its recommendation. Among human experts, however, an expert from a similar national origin who offered a negative assessment of a photograph tended to be more influential than a human expert from an unknown country who offered a similar negative rating of a photograph.

While the finding that group status may not affect whether a person values the judgment of AI experts sounds like good news, Sundar suggests that the same cultural biases might still be at work in the AI expert, only hidden in its programming and training data.

"It can be both good—and bad—because it all depends on what you feed the AI," said Sundar. "While it is good to have faith in AI's ability to transcend cultural biases, we have to keep in mind that if you train the AI on pictures from one culture, they could give misleading recommendations on pictures meant for use in other cultural contexts."

The researchers, who report their findings in an upcoming issue of Computers in Human Behavior, recruited 353 people through an online crowdsourcing service to take part in the study. The participants were randomly assigned to view a screenshot of a website that offered users recommendations for their business profile photos. Participants were also told that the website allowed feedback from other users of the platform, in addition to expert raters. The screenshots represented the various conditions studied by the researchers, including whether the expert raters were human or AI, whether their feedback was positive or negative, and whether the rater's national identity was similar, different or unknown.
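To make that between-subjects setup concrete, the sketch below enumerates the screenshot conditions implied by the description above and randomly assigns participants to them. It is a hypothetical illustration, not the researchers' code: the factor names, the assign_conditions helper, and the omission of the separate crowd-feedback cue are all assumptions made for readability.

```python
import random

# Illustrative sketch only: factor names and the assignment helper are assumptions
# drawn from the article's description, not from the study's actual materials.
EXPERT_TYPES = ["AI", "human"]                            # who issues the recommendation
FEEDBACK_VALENCE = ["positive", "negative"]               # expert rating of the photo
NATIONAL_IDENTITY = ["similar", "different", "unknown"]   # rater's national-origin cue

# Full factorial crossing of the three factors: 2 x 2 x 3 = 12 screenshot conditions.
CONDITIONS = [
    {"expert": e, "valence": v, "identity": n}
    for e in EXPERT_TYPES
    for v in FEEDBACK_VALENCE
    for n in NATIONAL_IDENTITY
]

def assign_conditions(n_participants, seed=0):
    """Randomly assign each participant to one screenshot condition (between-subjects)."""
    rng = random.Random(seed)
    return [rng.choice(CONDITIONS) for _ in range(n_participants)]

if __name__ == "__main__":
    assignments = assign_conditions(353)  # 353 participants, as recruited in the study
    print(f"{len(CONDITIONS)} conditions; participant 0 sees: {assignments[0]}")
```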

In the future, the researchers plan to investigate the group dynamics of influence more deeply and examine whether the expert's gender plays a role in influencing users.

More information: Jinping Wang et al. When expert recommendation contradicts peer opinion: Relative social influence of valence, group identity and artificial intelligence, Computers in Human Behavior (2020). DOI: 10.1016/j.chb.2020.106278
