UN rights council calls for AI transparency
The UN Human Rights Council on Friday called for transparency on the risks of artificial intelligence and for the data harvested by AI to be used responsibly.
An explosion in generative AI content since ChatGPT launched late last year has left authorities scrambling to figure out how to regulate such chatbots and ensure the technology does not endanger humanity.
In its first look at the development of AI, the UN's top rights body adopted a resolution that called for the "adequate explainability" of AI-supported decisions, taking into account "human rights risks arising from these technologies".
It also called for the use of data in AI systems to be in line with international human rights law.
The resolution—co-sponsored by Austria, Brazil, Denmark, Morocco, Singapore and South Korea—was adopted by consensus in the 47-country council.
China and India disassociated themselves from the consensus but did not demand a vote—a stance countries sometimes take when they have reservations but do not want to rock the boat.
China's representative told the council that the resolution had some "controversial content".
AI and privacy
South Korean ambassador Yun Seong-deok said the resolution underlined the importance of "ensuring, promoting and protecting human rights throughout the life-cycle of artificial intelligence systems".
US ambassador Michele Taylor called it a step forward for the council.
"This resolution recognizes both the harms and benefits that new emerging digital technologies, especially artificial intelligence, can bring to the field of human rights."
ChatGPT has become a global sensation since its launch late last year, owing to its ability to produce human-like content, including essays, poems and conversations, from simple prompts.
British ambassador Simon Manley said London was "deeply concerned by the use of technology to curtail human rights, including freedom of expression, association, and peaceful assembly", as well as the right to privacy.
© 2023 AFP