Legal expert questions the human rights implications for future mind-reading technologies

Credit: Pixabay/CC0 Public Domain

Recent advancements in using artificial intelligence to extract meaningful thoughts from brain waves have concerned human rights and privacy advocates, who say the technology is developing faster than the law. A new paper is the first to consider whether Australia is prepared for the potential applications of neurotechnologies.

Neurotechnological advancements have attracted the attention of scholars, national legislatures and organizations such as the United Nations Human Rights Council, prompting intense debate about whether current domestic and international legal frameworks require modification to address emerging issues such as human rights and privacy.

In Australia, however, the impact of neurotechnology on human rights has not been addressed; ethical questions have been considered, but there has yet to be a focus on human rights law.

Prominent neurotech law expert Dr. Allan McCay from the Sydney Law School said the Universal Declaration of Human Rights, drafted before the onset of neurotechnology, may not fully address these technological capacities.

He is calling for neurotechnology to be on the agenda for legal scholars, law reform bodies, human rights organizations and ultimately parliaments in Australia.

This was outlined in the first paper to question whether we are prepared for the potential applications of neurotechnologies and what Australia should do about the human rights challenges. Authored by Dr. McCay, "Neurotechnology and Human Rights: Developments Overseas and the Challenge for Australia" is published in the Australian Journal of Human Rights.

"While there must be recognition of the positive impacts of neurotechnology—such as assisting those with a disability and treating chronic health conditions—the profound possible human rights violations must be addressed. Given the pace of technological progress, it may be that legislatures should proactively shape the law rather than somewhat passively waiting for the courts to deal with issues," says Dr. Allan McCay.

Dr. McCay said that the whole field is "under-theorized" in Australia and "lacks a response from regulatory/human rights institutions."

"As humans continue to merge with machines, it is important to consider the downside of postponing debate about neurotechnology."

Direct monitoring of neural activity raises a range of issues, the most glaring being privacy. While we have forfeited much of our privacy online, direct neural access is more troubling than privacy issues connected with data gathered from social media behavior.

As demonstrated recently in a study from the University of Texas, participants in a functional magnetic resonance imaging (fMRI) scanner listened to podcasts, generating data that was used to train a model aimed at decoding their brain activity into language.

After the model was trained, participants went back under the scanner and listened to a new story—one that hadn't been used to generate training data. As they listened, the fMRI scanner recorded the blood oxygenation levels in parts of their brains.

The researchers then used a large language model—such as OpenAI's GPT-4 or Google's Bard—to match patterns in the recorded brain activity to the words and phrases that the participants had heard.

Dr. Shinji Nishimoto, a neuroscientist at Osaka University who was not involved in the Texas research, put it simply: "Brain activity is a kind of encrypted signal, and language models provide ways to decipher it."
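To make the "matching" idea concrete, here is a minimal, illustrative sketch—not the Texas team's actual pipeline—of how an encoding model can be fit to predict brain responses from text embeddings, after which candidate phrases are scored by how well their predicted response matches a new recording. Everything here is synthetic and hypothetical: the embed_text() function stands in for a language model, and the data sizes and noise levels are invented for illustration.

```python
# Illustrative sketch only: synthetic data, hypothetical embed_text() stand-in
# for a language model. Fit an encoding model (embedding -> simulated voxels),
# then decode by picking the candidate phrase whose predicted response best
# matches the "recorded" response.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
EMB_DIM, VOXELS = 32, 200          # toy sizes; real studies use far more voxels


def embed_text(phrase: str) -> np.ndarray:
    """Hypothetical text embedder: deterministic pseudo-embedding per phrase."""
    seed = abs(hash(phrase)) % (2**32)
    return np.random.default_rng(seed).standard_normal(EMB_DIM)


# "Training": phrases the participant heard, plus simulated fMRI responses.
train_phrases = [f"training phrase {i}" for i in range(300)]
X_train = np.stack([embed_text(p) for p in train_phrases])
true_map = rng.standard_normal((EMB_DIM, VOXELS))           # unknown brain mapping
Y_train = X_train @ true_map + 0.1 * rng.standard_normal((300, VOXELS))

encoder = Ridge(alpha=1.0).fit(X_train, Y_train)             # embedding -> voxels

# "Decoding": given a new recorded response, score candidate phrases.
heard = "a new story phrase"
recorded = embed_text(heard) @ true_map                       # simulated new scan
candidates = ["a new story phrase", "an unrelated sentence", "something else"]


def score(phrase: str) -> float:
    predicted = encoder.predict(embed_text(phrase)[None, :])[0]
    return -float(np.linalg.norm(predicted - recorded))       # closer is better


print("decoded guess:", max(candidates, key=score))
```

In the published research the candidate phrases are proposed by a generative language model and the comparison is far more sophisticated, but the underlying logic is the same: the model never "reads" words directly from the brain; it ranks guesses by how well they explain the recorded activity.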

Dr. McCay said we need to consider brain monitoring and direct brain intervention in criminal justice, political, workplace and consumer contexts. For example, would it be a human rights infringement to monitor a suspect's brain during a police interview?

US company Brainwave Science already markets a neurotechnological interrogation product.

Dr. McCay said some would argue that using AI neurotechnologies and implantable brain devices, even ones that intervene in criminal offenders' brains to change their behavior, is a positive thing. But this is disconcerting from a human rights perspective.

Overseas there have been moves to address these issues with the formation of human rights groups, including the Neurorights Foundation and the Minding Rights Network. The groups doubt the capacity of the international human rights framework to meet the challenges of neurotechnology.

The Neurorights Foundation is pushing to have companies, governments and the United Nations recognize the rights to mental privacy, personal identity, free will, fair access to mental augmentation and protection from bias.

Dr. McCay said, "Public debate might be useful to put those who produce on notice that legal change is coming. As realized by several institutions overseas, discussion needs to be had now."

More information: Allan McCay, Neurotechnology and human rights: developments overseas and the challenge for Australia, Australian Journal of Human Rights (2023). DOI: 10.1080/1323238X.2023.2221487

