Survey shows most people think LLMs such as ChatGPT can experience feelings and memories


Two-thirds of people surveyed think that artificial intelligence (AI) tools like ChatGPT have some degree of consciousness and can have subjective experiences such as feelings and memories, according to a new study from the University of Waterloo.

Large language models (LLMs) like ChatGPT often display a conversational style in their output. These human-like abilities have spurred debate over whether AI is conscious.

According to the researchers, if people believe that AI has some level of consciousness, it could ultimately affect how they interact with AI tools, potentially strengthening social bonds and increasing trust. On the other hand, excessive trust can also lead to emotional dependence and over-reliance on AI to make critical decisions.

The article, "Folk psychological attributions of consciousness to large language models," was published in Neuroscience of Consciousness.

"While most experts deny that current AI could be conscious, our research shows that for most of the general public, AI consciousness is already a reality," said Dr. Clara Colombatto, professor of psychology at Waterloo's Arts faculty.

To understand public attitudes about AI consciousness, Colombatto and her colleague Dr. Steve Fleming at University College London surveyed a stratified sample of 300 people in the U.S. and asked if they thought ChatGPT could have the capacity for consciousness, as well as a variety of other mental states—such as the ability to make plans, reason, and feel emotions—and how often they used the tool.

The research found that the more people used ChatGPT, the more likely they were to attribute consciousness to it—an important consideration as AI tools are increasingly becoming part of our daily lives.

"These results demonstrate the power of language because a conversation alone can lead us to think that an agent that looks and works very differently from us can have a mind," said Colombatto.

"Alongside emotions, consciousness is related to intellectual abilities that are essential for moral responsibility: the capacity to formulate plans, act intentionally, and exercise self-control are tenets of our ethical and legal systems. These attitudes should thus be a key consideration in designing and regulating AI for safe use, alongside expert consensus."

Future research will explore the specific factors driving these consciousness attributions and their consequences for trust and social bonding, as well as how such attributions vary within the same people over time and across people in other countries and cultures.

More information: Clara Colombatto et al, Folk psychological attributions of consciousness to large language models, Neuroscience of Consciousness (2024). DOI: 10.1093/nc/niae013

Citation: Survey shows most people think LLMs such as ChatGPT can experience feelings and memories (2024, July 2), retrieved 19 July 2024.
