Google think tank's report on white supremacy says little about YouTube's role in driving people to extremism


A Google-funded report examines the relationship between white supremacists and the internet, but it makes scant reference—all of it positive—to YouTube, the company's platform that many experts blame more than any other for driving people to extremism.

The report, by Jigsaw, a "tech incubator" that has operated within Google for the past decade, draws from interviews with dozens of former extremists and describes how the internet is a breeding ground for hate groups.

Study after study has shown that YouTube serves as a megaphone for white supremacists and other extremists, and as a pipeline for recruits. YouTube's algorithm has been found to direct users to extreme content, sucking them into violent ideologies.

"They're underemphasizing the role that their own technology and their own platforms have in pushing people towards extremism," said Bridget Todd, a writer and host of the podcast "There are No Girls on the Internet."

"Individuals certainly have a responsibility to not allow themselves to be engulfed in extremist content," Todd said. "But if you're a platform like Google, you can't just emphasize the individual's responsibility and completely obscure the fact that your massive platform has allowed online extremist content to fester and become so popular."

YouTube's 'red pill' videos

Like other major tech platforms, YouTube has recently steered more resources toward content moderation. The company says it has vastly reduced views of supremacist videos and continues to develop countermeasures against hate speech.

But researchers who have spent years watching people become radicalized via YouTube ask why it has taken one of the world's largest companies so long to react to the growing problem of homegrown extremism.

"When you talk to folks who were in the (white supremacist) movement, or when you read in the chat rooms these people talk in, it's almost all about YouTube," said Megan Squire, a computer science professor at Elon University who studies online extremism.

"Their 'red pill' moment is almost always on YouTube," Squire said, referring to a term popular with the far right to describe when people suddenly realize white supremacists and other conspiracy theorists have been correct all along.

Squire and others suggested several steps Google could immediately take to address the problems outlined in the Jigsaw report. It could provide funding for some of the anti-extremist nonprofits lauded there. Google could drastically ramp up moderation—Squire said it should be multiplied by 10. And it could fund academic research into how people are radicalized online.

The tech giant also could open up its data so academics can fully study platforms like YouTube and their role in spreading extremist content, several experts said.

The Jigsaw report comes as bipartisan scrutiny of the nation's leading tech companies is intensifying in Washington, D.C. Google has joined Twitter and Facebook in the spotlight, defending its policies and its record on everything from misinformation to hate speech.

In October, the Justice Department accused Google of violating antitrust laws by stifling competition and harming consumers in online search and advertising.

Google's white supremacy study offers little new

The Jigsaw report, titled "The Current: The White Supremacy Issue," makes a few key points about how hate metastasizes online.

"Lone wolves"—people who have carried out mass shootings and other violent hate crimes—are not alone at all, the report says. They are often connected via online platforms and communities.

The report outlines the growing "alt-tech ecosystem," in which new social media platforms like Gab and Parler attract white supremacists kicked off Facebook and Twitter.

Jigsaw's researchers detail how supremacists ensnare vulnerable people online with softer versions of their hateful worldview before introducing more extreme concepts.

None of this is new to those who monitor and study extremism.

"It feels very derivative and facile," Squire said. "I learned nothing from reading this, and that's disappointing."

The Jigsaw report addresses such criticism, saying its conclusions won't be new to victims of discrimination and hate crimes, but "we hope that it may still offer insightful nuance into the evolving tactics of white supremacists online that advance efforts to counter white supremacy."

YouTube radicalization: How it works

Late in 2019, a group of academic researchers from Brazil and Europe published a groundbreaking study that examined radicalization on YouTube.

By analyzing more than 72 million YouTube comments, the researchers were able to track users and observe them migrating to more hateful content on the platform. They concluded that the long-hypothesized "radicalization pipeline" on YouTube does exist, and that the platform's algorithm sped up radicalization.

"We found a very strong effect," said Manoel Horta Ribeiro, one of the main authors of the study. "People who were commenting on alt-right channels had previously commented on some of the more gateway channels. It was a pipeline."

For years, YouTube executives ignored staff warnings that its recommendation feature, which aimed to boost the time people spend online and generate more advertising revenue, fueled the spread of extremist content, according to published reports.

After an outcry from advertisers in 2017, YouTube banned ads from appearing alongside content that promotes hate or discrimination or disparages protected groups. YouTube limited recommendations on those videos and disabled features such as commenting and sharing. But it didn't remove them. The company said the crackdown reduced views of supremacist videos by 80%.

Last year, YouTube made changes to its recommendation feature to reduce the visibility of what it calls "borderline content," videos that brush up against its terms of service but do not break them. Also in 2019, it removed thousands of channels and tightened its hate speech policy to ban videos claiming any group is superior "in order to justify discrimination, segregation, or exclusion based on qualities like race, religion or sexual orientation."

"Over the last several years we've taken steps to ensure that those who aim to spread supremacist ideology cannot do so on YouTube," Alex Joseph, a YouTube spokesperson, said in a statement. "These interventions have had a significant impact, and our work here is ongoing."

But YouTube still has its issues, and the company is being roundly criticized for not doing enough, soon enough.

"The barn door isn't just open, the horse is already out and it's trampling babies," said Talia Lavin, a writer and expert on white supremacists. "Now they want credit for shutting the barn door? I don't think any credit is due."

A 792-page report from New Zealand's Royal Commission, released last week, says the Australian terrorist who killed 51 people at two mosques in Christchurch, New Zealand, last year was radicalized on YouTube.

"What particularly stood out was the statement that the terrorist made that he was 'not a frequent commentator on extreme right-wing sites and YouTube was a significant source of information and inspiration,'" said Jacinda Ardern, New Zealand's prime minister, according to The Guardian.

"This is a point I plan to make directly to the leadership of YouTube."

©2020 USA Today
Distributed by Tribune Content Agency, LLC

