Sites like Facebook, Google and Twitter allowed white supremacists to flourish. Now what?


Before walking into a Norwegian mosque with a pair of shotguns earlier this month, Philip Manshaus called for a race war in a statement he posted on the dark reaches of social media.

He couldn't go to 8chan, the renegade message board where suspects in three recent mass shootings had uploaded white nationalist screeds. That board had been booted days earlier by its internet provider, after the man suspected of killing 22 people in an El Paso Walmart posted his own hate-filled manifesto.

It wasn't hard for Manshaus to find a megaphone, though. The 21-year-old—whose Aug. 10 attack was foiled when a worshiper tackled him—posted on a little-known board called endchan.

Much attention since the El Paso shooting on Aug. 3 has focused on 8chan. But white supremacists remain active all across the web—including on the biggest social media sites, where they proselytize in plain sight. Attempts to curb racist and violent views on the internet have become serious only recently, with limited success.

In March, a white nationalist charged with killing 51 people in two mosques in Christchurch, New Zealand, livestreamed his gruesome shooting spree on Facebook.

David Duke, the former grand wizard of the Ku Klux Klan, still maintains his own YouTube channel, as do other prominent white nationalist groups.

Although 8chan took down the El Paso shooter's manifesto, it turned up soon afterward on many other sites, including mainstream Reddit and Facebook.

One person who posted the manifesto on the popular message board before the news media had even disclosed its existence said gleefully, "whoooooo WE ARE DOING THIS!!!!!" The post drew a quick response: "Race war, baby!"

Moderators at 4chan never removed that post.

The presence of racist ideology on popular social media sites has helped fuel the rise of white nationalism, experts say—far more so than on the niche sites, which tend to cater to those already deep in the movement.

"You had a decade-long period in which social media, meaning virtually all of it, Facebook, Twitter, YouTube, etc., was unregulated when it came to speech," said Heidi Beirich, an expert on extremism at the hate-watch group Southern Poverty Law Center. "Hate groups were completely active there. They were spreading propaganda like wildfire."

Social media companies' efforts often fall short

Efforts to clean up violent extremism on social media began roughly five years ago with attempts to scrub jihadist propaganda originating from the Islamic State, said Peter Neumann, founding director of the London-based International Centre for the Study of Radicalisation and Political Violence.

White nationalist posts didn't get much attention until two years ago, after the Unite the Right rally in Charlottesville, Virginia.

"After the Charlottesville rallies, most of the big tech companies ... came to understand that this was a very dangerous situation and began to tighten up their community standards, so less and less of this is found on the mainstream sites," Beirich said.

Twitter and YouTube stepped up their efforts to remove content and accounts that promoted violence and terrorism in 2017. Earlier this year, Facebook and YouTube announced they would start taking down accounts that espouse white supremacy.

An executive at Twitter testified before Congress recently that it had taken action on 184 violent extremist groups and "permanently suspended 2,182 unique accounts." Of those accounts, Twitter said, 93 "advocate violence against civilians alongside some form of extremist white supremacist ideology."

A Twitter spokeswoman told USA TODAY, "We're proactively removing content that violates our policies and are engaged with law enforcement, as appropriate."

YouTube reported it has taken down thousands of videos that embrace white supremacy. Videos don't have to call for violence to be banned. Suggesting that the white race is superior, Jews control the world or Muslims are inferior merits removal, the company said.

And yet, white supremacist groups such as American Renaissance retain channels on YouTube. Dylann Roof, who killed nine African American worshipers in a South Carolina church, acknowledged he was inspired by that group's obsession with black-on-white murder.

Blocking such propaganda poses real challenges. Silicon Valley companies have historically relied on algorithms to decide what content to show users; from their earliest days, they took no responsibility for the actual content they displayed.

Government censorship runs afoul of free speech protections, although there has been talk of giving the FBI more power to monitor social media sites for domestic terrorists.

Not every tech company has fully committed, either. Popular search engines Google, Yahoo and Bing still link directly to sites where white nationalists converge, such as 4chan, the Daily Stormer, Gab and others.

Hate-filled messages ricochet across the web

The internet is so immense that even if one tech company shuts down one site, posters quickly find another. And hate-filled messages ricochet among social media sites, which makes it difficult to snuff them out.

The video Facebook took down of the New Zealand massacre still can be found with a simple Google search. The same is true of the lengthy El Paso manifesto originally pulled from 8chan that popped up on 4chan.

Posters quickly learn to surf for a site that won't block them, and followers know to repost content to keep it alive.

Philip Manshaus turned to endchan, where the home page says "Welcome 8ch refugees" and the politics page features images of swastikas and Adolf Hitler.

His post included a cartoon meme honoring three men charged in white-nationalist-inspired killings. It calls the New Zealand killer of 51 mosque worshipers a "saint" and depicts two other mass shooters as his "disciples," including the El Paso man, using an ethnic slur to praise him for killing Hispanics.

Endchan, started in 2015, deleted the post and said on Twitter that it opposes violence. It is clearly a fringe site, but USA TODAY found the meme Manshaus used also had been posted on mainstream Reddit, a popular discussion board with 330 million users.

The message thread, or subreddit, where the cartoon lived draws more than 92,000 members devoted to mocking male virgins with memes.

In a statement to USA TODAY, Reddit said its policies "prohibit content that encourages, glorifies, incites, or calls for violence." The company said that in recent years it has expanded the teams that enforce its policies and is looking for technological solutions to block prohibited posts. Subreddits are primarily policed by volunteer moderators.

Yet Reddit removed the cartoon meme only after USA TODAY alerted the company to it. And the subreddit where it appeared remains replete with racist and violent memes.

Hatred festers on sites like 8chan, 4chan

Even though 4chan and 8chan are unrelated, their histories are closely intertwined, offering insights into where some internet hatred festers.

Users of both sites often have seemed indistinguishable. Both sites are full of vile language, with frequent use of the N-word and slurs against Jews. Both are "imageboards," where users embed images in their posts.

On 4chan, posters often talk nonchalantly about violence. Last week one asked, "Why won't Generation X and Millennials rise up together and kill all the baby boomers? It's the one real way to fix all our problems." Another replied, "the enemy is the Jews, not boomers."

Christopher Poole started 4chan in 2003, when he was a high school student, for fans obsessed with anime. But 4chan was purist in its approach to free speech and over time attracted a lot of young white males with extremist right-wing political views.

By contrast, 8chan was founded as an obscure copycat board in 2013 by Fredrick Brennan, a 4chan fan.

Users on 4chan don't have to register and almost never reveal their true names. The same was true on 8chan. Unlike most social media sites, there's no way to track a user's history. Posters' identities are unknown even to the message board.

Volunteers manage 4chan, as they did 8chan. The only sitewide rule on 8chan was to not break U.S. laws; 4chan's politics board bans racist posts, a rule routinely ignored.

While 8chan is now gone, 4chan is ubiquitous on social media. It has a Twitter account, a Facebook page and a Reddit group with more than 1 million members.

Poole, who now works for Google, stepped down as administrator in 2015. He did not respond to an email requesting an interview.

His departure came shortly after two scandals. Someone posted stolen nude photographs of Hollywood actresses on 4chan, which Poole said cost him tens of thousands of dollars in legal fees. Around the same time, he booted posters who were harassing female critics of sexism in video games, a scandal known as GamerGate.

Many of those exiled were welcomed by 8chan, turning an obscure message board into a much bigger player.

Search engines could be doing more, experts say

Experts believe social media companies should be doing more to curb white nationalist violence.

Search engines can delist a site, relegating it to the nether reaches of the web, where traffic is limited.

"I think at this stage these types of sites shouldn't be discoverable in search. And they shouldn't be linkable," said Joan Donovan, a director at the Shorenstein Center on Media, Politics and Public Policy at Harvard University.

In 2015, Google removed 8chan's home page from its search index because it repeatedly linked to images of child sexual abuse. But Google continued to link to 8chan's individual boards, including the one where the El Paso suspect posted his manifesto.

Google told USA TODAY in a written statement that it doesn't want to limit what users can search for, unless it violates the law.

"Hateful ideas and calls for violence are abhorrent, and our systems are designed to not expose people to this type of content if they are not explicitly looking for it," the statement said. "We follow local laws when determining what web pages are blocked from Search, as we do not want to impose our own limits or point of view on what information people should be able to access."

In a statement about its Bing search engine, Microsoft said, "We want to avoid users encountering unexpected, potentially offensive content appearing in results. That's why we encourage people to provide feedback using the feedback button included on our pages."

Yahoo did not respond to USA TODAY's requests for comment.

Companies that host websites also can stop doing business with white nationalist sites.

Matthew Prince, CEO of Cloudflare, the internet infrastructure company whose services kept 8chan online, once described himself as "almost a free-speech absolutist." After the El Paso shooting last week, he said that 8chan "has repeatedly proven itself to be a cesspool of hate."

Prince already had ejected the Daily Stormer from his service. He decided to dump 8chan as well, saying the message board's administrators "have proven themselves to be lawless and that lawlessness has caused multiple tragic deaths."

So far, 8chan has not been able to find a new host. Its founder, who became so fed up with his own creation that he left in December, doubts it ever will.

"Good riddance," Brennan tweeted last week. "Literally everyone else online, including chan site users, is better for it."

©2019 USA Today
Distributed by Tribune Content Agency, LLC.

Citation: Sites like Facebook, Google and Twitter allowed white supremacists to flourish. Now what? (2019, August 21)
