
YouTube is cracking down on videos displaying "harmful conspiracy theories" as social platforms continue to grapple with the spread of misinformation and hate.

The Google-owned streaming company will now prohibit content that targets a person or group with "conspiracy theories that have been used to justify real-world violence," YouTube said Thursday.

That means videos claiming that someone is complicit in conspiracies that have been proven false will be taken down starting Oct. 15, YouTube said.

For an example of the types of content that will be banned, think back to the Pizzagate scandal of 2016, which included baseless allegations that Hillary Clinton was involved in running a child sex-trafficking ring out of a pizzeria in Washington. The false claims swiftly spread across the internet, and a vigilante gunman later opened fire at the restaurant.

"To address this kind of content effectively, it's critical that our teams continually review and update our policies and systems to reflect the frequent changes," YouTube said in a blog post. The news comes a day after the platform said COVID-19 vaccine misinformation will be removed.

YouTube's efforts come after companies like Twitter and Facebook have set out to crack down on the QAnon extremist conspiracy movement by suspending accounts and taking down posts.

In July, Twitter began removing and suspending accounts associated with QAnon, a conspiracy theory group that has supported President Trump. In October, Facebook pledged to remove pages, groups and Instagram accounts aligned with the conspiracy group.