YouTube continues to push dangerous videos to users susceptible to extremism, white supremacy, report finds


Google's YouTube is still recommending extremist and white supremacist videos to viewers already susceptible to racial hatred, a new report found.

Though the nation's most popular social media platform has removed large amounts of extremist content under political pressure, exposure to harmful videos is still common, and users who view extremist videos are still being recommended new clips in the same vein, according to a report that the ADL (Anti-Defamation League) released Friday, an advance copy of which was shared exclusively with USA TODAY.

One in 10 study participants viewed at least one video from an extremist channel and 2 in 10 viewed at least one from an "alternative" channel, according to the study, which examined the viewing habits of 915 respondents. The study's authors defined extremist and alternative by drawing from published research on online radicalization.

The main culprit? YouTube's recommendation algorithm. When users watched these videos, they were more likely to see and follow recommendations to similar videos, the study found.

The researchers discovered, for example, that users who already viewed extremist videos on YouTube were recommended other extremist videos to watch almost 30% of the time.

People who weren't already watching extremist YouTube videos were very unlikely to be channeled toward that type of content, showing that some of the company's efforts to limit hate speech are working. Recommendations to potentially harmful videos after viewing other types of videos were also rare.

The ADL says the findings underscore the need for platforms to remove violent extremist groups and content that fuel real-world violence like the Jan. 6 siege on the U.S. Capitol.

"Despite the recent changes that YouTube has made, our findings indicate that far too many people are still being exposed to extremist ideas on the platform," Brendan Nyhan, a report author and professor of government at Dartmouth College, said in a statement.

"We welcome more research on this front, but views this type of content get from recommendations has dropped by over 70% in the U.S., and as other researchers have noted, our systems often point to authoritative content," YouTube spokesman Alex Joseph said in a statement.

Still, experts say YouTube could do much more.

"The fact is that they have not solved this and they're still serving up more and more extremist content to people who are already consuming extremist content, which is a problem," said Bridget Todd, a writer and host of the podcast "There are No Girls on the Internet." "What they really need to do is get serious about keeping this kind of stuff off their platform, and really doing some work on how they can keep from further radicalizing people on YouTube."

"Red pill' moment often on YouTube

For years, study after study has shown that YouTube serves as a megaphone for white supremacists and other hate groups and a pipeline for recruits.

YouTube says it has vastly reduced views of supremacist videos and continues to develop countermeasures against hate speech.

"We have clear policies that prohibit hate speech and harassment on YouTube and terminated over 235,000 channels in the last quarter for violating those policies," YouTube's Joseph said. "Beyond removing content, since 2019 we've also limited the reach of content that does not violate our policies but brushes up against the line, by making sure our systems are not widely recommending it to those not seeking it."

But why it has taken one of the world's largest companies so long to react to the growing problem of homegrown extremism perplexes researchers.

"When you talk to folks who were in the (white supremacist) movement, or when you read in the chat rooms these people talk in, it's almost all about YouTube," Megan Squire, a computer science professor at Elon University who studies online extremism, told U.S. TODAY in December.

"Their 'red pill' moment is almost always on YouTube," Squire said, referring to a term popular with the far right to describe when people suddenly realize white supremacists and other conspiracy theorists have been correct all along.

In 2019, a group of academic researchers from Brazil and Europe published a groundbreaking study that examined radicalization on YouTube.

By analyzing more than 72 million YouTube comments, the researchers were able to track users and observe them migrating to more hateful content on the platform. They concluded that the long-hypothesized "radicalization pipeline" on YouTube exists, and that its algorithm sped up radicalization.

However, another academic study concluded that while extremist "echo chambers" exist on YouTube, there was no evidence they were being caused by the platform's recommendation algorithm.

YouTube made changes after outcry

For years, YouTube executives ignored staff's warnings that its recommendation feature, which aimed to boost time people spend online and generate more advertising revenue, ignited the spread of extremist content, according to published reports.

After an outcry from advertisers in 2017, YouTube banned ads from appearing alongside content that promotes hate or discrimination or disparages protected groups.

YouTube limited recommendations on those videos and disabled features such as commenting and sharing. But it didn't remove them. The company said the crackdown reduced views of supremacist videos by 80%.

In 2019, YouTube made changes to its recommendation feature to reduce the visibility of what it calls "borderline content," videos that brush up against its terms of service but do not break them.

Also in 2019, it removed thousands of channels and tightened its hate speech policy to ban videos claiming any group is superior "in order to justify discrimination, segregation, or exclusion based on qualities like race, religion or sexual orientation."

But the ADL study shows that such content is still easily accessible on the site, and Todd wondered why a massive company like Google can't simply eradicate such content from YouTube altogether.

"Other platforms have figured this out," Todd said. "I do not believe that this is something that is out of their control."

(c)2021 USA Today
Distributed by Tribune Content Agency, LLC.

