Meta urged to update rules after fake Biden post

Credit: Unsplash/CC0 Public Domain

With major elections looming, Meta's policy on deepfake content is in urgent need of updating, an oversight body said on Monday in a decision about a manipulated video of US President Joe Biden.

A video of Biden voting with his adult granddaughter, manipulated to falsely appear that he inappropriately touched her chest, went viral last year.

It was reported to Meta and later escalated to the company's oversight board.

The tech giant's oversight board, which independently reviews Meta's content moderation decisions, said the platform was technically correct to leave the video online.

But it also insisted that the company's rules on manipulated content were no longer fit for purpose.

The board's warning came amid fears of rampant misuse of artificial intelligence-powered applications for disinformation in a pivotal election year, not only in the United States but worldwide, as huge portions of the global population head to the polls.

The board said that Meta's policy in its current form was "incoherent, lacking in persuasive justification and inappropriately focused on how content has been created."

Instead, the policy should focus on the "specific harms it aims to prevent (for example, to electoral processes)," the board added.

Meta in a response said it was "reviewing the Oversight Board's guidance and will respond publicly to their recommendations within 60 days in accordance with the bylaws."

According to the board, the rules were not violated in the Biden case because the video was not manipulated using artificial intelligence, nor did it depict Biden saying something he did not say.

But the board insisted that "non-AI-altered content is prevalent and not necessarily any less misleading."

For example, most smartphones have simple-to-use features to edit content into disinformation, sometimes referred to as "cheap fakes," it noted.

The board also underlined that altered audio content, unlike video, fell outside the policy's current scope, even though deepfake audio can be highly effective at deceiving users.

Already, one US robocall impersonating Biden urged New Hampshire residents not to cast ballots in the Democratic primary, prompting state authorities to launch a probe into possible voter suppression.

The oversight board urged Meta to reconsider the manipulated media policy "quickly, given the number of elections in 2024."

© 2024 AFP

