Extremist communities continue to rely on YouTube for hosting, but most videos are viewed off-site, research finds

Research finds that extremist and alternative content on YouTube is predominantly viewed off-platform via embedded clips. Credit: Matthew Modoono/Northeastern University

It's easy to fall down the rabbit hole of the online video-sharing platform YouTube. After all, is there really a limit to cute pet videos?

But what about the platform's more sinister side? After the 2016 U.S. presidential election, YouTube was so heavily criticized for radicalizing users by recommending increasingly extreme and fringe content that it changed its recommendation algorithm.

Research conducted four years later by Northeastern University computer scientist Christo Wilson found that, while extremist content remained on YouTube, it was subscriptions and external referrals, not the recommendation algorithm, that drove disaffected users to that content.

"We didn't see this kind of 'rabbit-holing effect,'" says Wilson, an associate professor at Khoury College of Computer Sciences at Northeastern. "There was in fact a lot of problematic content that was still on YouTube and still had a fairly significant audience of people who were watching it. It's just that they haven't been radicalized on YouTube itself."

So if not on YouTube, where was this audience being radicalized?

In new research presented at the 2024 ACM Web Science Conference, Wilson finds that extremist communities continue to rely on YouTube for hosting; it's just that off-site is where the "rabbit-holing" begins.

"If you're already a political partisan and you're going to websites with a particular leaning, that's then leading you to YouTube channels and videos with the same kind of lean," Wilson says. "If you started in a place where you're being exposed to bad stuff, you end up in a place where you're being exposed to more bad stuff."

YouTube is an online video-sharing platform owned by Google. Following criticism for its role in hosting and elevating fringe conspiracy content, particularly through its recommendation algorithm, the platform changed that algorithm in 2019.

But the extremist content never fully disappeared.

Much of it migrated.

"YouTube is not just YouTube itself—you can embed the videos into any website," Wilson says. "This is the first study where we're looking at all this stuff that happens off-platform."

Wilson looked at over 1,000 U.S. residents from three cohorts: demographically representative users, heavy YouTube users, and users with high racial resentment. He analyzed all YouTube videos encountered by the users over a period of six months. All users accessed the web via desktop, rather than through mobile devices.

The research resulted in several interesting conclusions.

Wilson found that users saw more YouTube videos embedded on other websites than on YouTube itself.

He also found that politically right-leaning websites tend to embed more videos from "problematic" YouTube channels than centrist or left-leaning websites. Channels were considered problematic if they were classified as either alternative or extremist, using grades assigned by professional fact checkers or other academics.

Wilson says an alternative channel would be, for example, Steven Crowder—a personality who interviews both mainstream scientists and vaccine deniers and is "sort of intellectually open." Extremist channels, Wilson said, would be "openly hateful"—something like former Ku Klux Klan Grand Wizard David Duke's old YouTube channel.

Most notably, users exposed to off-platform videos from problematic channels are significantly more inclined to browse toward on-platform videos from problematic channels.

"Your off-platform activity very quickly becomes on-platform activity," Wilson says.

So, what can YouTube do? After all, Wilson admits, the platform can't control what people do on other sites.

Wilson recommends stronger content-moderation policies.

"YouTube can tell where videos are being embedded off platform," Wilson notes. "If they see a particular channel being embedded in a website that is a known purveyor of misinformation, that should probably be scrutinized."

Plus, Wilson notes that YouTube still hosts the videos, even if they appear on other sites.

"They are aiding and abetting these fringe communities out there on the web by hosting videos for them," Wilson says. "If they had stronger content-moderation policies, that would definitely help address this."

More information: Desheng Hu et al, U.S. Users' Exposure to YouTube Videos On- and Off-platform, ACM Web Science Conference (2024). DOI: 10.1145/3614419.3644027

This story is republished courtesy of Northeastern Global News news.northeastern.edu.

Citation: Extremist communities continue to rely on YouTube for hosting, but most videos are viewed off-site, research finds (2024, May 22) retrieved 15 June 2024 from https://techxplore.com/news/2024-05-extremist-communities-youtube-hosting-videos.html
