February 9, 2021
Facebook steps up vaccine misinfo efforts. Will it work?
As inoculation efforts for the coronavirus ramp up around the world, Facebook says it's going all in to block the spread of bogus vaccine claims. In practice, that means the social network plans to ban a new batch of false claims, on top of the many false claims about vaccines and COVID-19 it has already banned.
Among Facebook's new targets: claims that vaccines aren't effective, or that they're toxic, dangerous or cause autism, all of which have been thoroughly debunked for both the coronavirus vaccines and other vaccines.
The platform had already prohibited users from spreading falsehoods such as: masks are ineffective; vaccines cause infertility; vaccines contain tracking microchips; and vaccines don't actually exist. It had also banned a whole host of other dangerous misinformation debunked by the World Health Organization or government agencies, under a policy that went into effect in December.
In the fall of 2020, the company banned advertisements that discourage vaccinations—with an exception carved out for advocacy ads about government vaccine policies—but at that time it didn't ban unpaid posts by users.
But even with Facebook's evolving policies, those ideas have lived on, spreading from private groups to the pages of Instagram influencers peddling health advice to new mothers. It's not clear whether Facebook's newly expanded policy will be more effective than its past attempts to clamp down on COVID- and vaccine-related misinformation.
"Millions of people are being fed dangerous lies which lead them to doubt government guidance on COVID and on vaccines, prolonging the pandemic," said Imran Ahmed, CEO of the watchdog group Center for Countering Digital Hate.
© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.