
Interdisciplinary group suggests guidelines for the use of AI in science

Urs Gasser is Dean of the TUM School of Social Sciences and Technology and Rector of the School of Public Policy. Together with an international working group, he has drawn up rules for the use of AI in science. Credit: Technical University of Munich

Artificial intelligence (AI) generates texts, videos and images that can hardly be distinguished from those created by humans, with the result that we often no longer know what is real. Researchers, too, are increasingly being supported by AI. An international task force has therefore developed principles for the use of AI in research to safeguard trust in science.

Science thrives on reproducibility, transparency, and accountability, and trust in research stems in particular from the fact that results are valid regardless of the institution where they were produced. Furthermore, the underlying data of a study must be published, and researchers must take responsibility for their publications.

But what if AI is involved in the research itself? Experts have long used AI tools to design new molecules, evaluate complex data, and even generate research questions or prove a mathematical conjecture. AI is changing the face of research, and experts are debating whether the results can still be trusted.

Five principles should continue to ensure human responsibility in research, according to an interdisciplinary working group with members from politics, business, and academia, which published an editorial in the latest issue of the journal Proceedings of the National Academy of Sciences. Urs Gasser, Professor of Public Policy, Governance and Innovative Technology at the Technical University of Munich (TUM), was one of the experts.

The recommendations in brief:

  • Researchers should disclose the tools and algorithms they used and clearly identify the contributions of machines and humans.
  • Researchers remain responsible for the accuracy of the data and the conclusions they draw from it, even if they have used AI analysis tools.
  • AI-generated data must be labeled so that it cannot be confused with real-world data and observations.
  • Experts must ensure that their findings are scientifically sound and do no harm. For example, the risk of the AI being "biased" by the training data used must be kept to a minimum.
  • Finally, researchers, together with policymakers and business, should monitor the impact of AI and adapt methods and rules as necessary.

"Previous AI principles were primarily concerned with the development of AI. The principles that have now been developed focus on scientific applications and come at the right time. They have a signal effect for researchers across disciplines and sectors," explains Gasser.

The working group suggests that a new strategy council, based at the US National Academies of Sciences, Engineering, and Medicine, should advise the scientific community.

"I hope that science academies in other countries—especially here in Europe—will take this up to further intensify the discussion on the responsible use of AI in research," says Gasser.

More information: Wolfgang Blau et al, Protecting scientific integrity in an age of generative AI, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2407886121

Citation: Interdisciplinary group suggests guidelines for the use of AI in science (2024, May 23) retrieved 16 June 2024 from https://techxplore.com/news/2024-05-interdisciplinary-group-guidelines-ai-science.html
