Facebook has taken a range of steps to protect the social platform from being used for manipulation and misinformation in the November 3 US elections

Facebook is leveraging its vast resources to help protect the 2020 election against the kind of massive manipulation and disinformation efforts that the platform failed to act on in 2016.

The moves come after Facebook faced intense criticism over the Cambridge Analytica scandal, in which the British consultancy hijacked personal data from tens of millions of Americans to help Donald Trump's campaign, and for enabling foreign influence campaigns, including from Russia, four years ago.

Here are some of the measures Facebook has taken to avoid a repeat of the 2016 fiasco and to deliver authoritative information:

Voter participation

After being accused of enabling "voter suppression" efforts in 2016, Facebook has set a goal of helping register at least four million voters this year and on Monday said it exceeded that with 4.4 million registrations.

The social network, used by more than 200 million Americans, is delivering reminders to people's feeds and offering links on how and where to vote.

It is also removing posts that seek to discourage people from going to the polls or that spread false information, such as claims that voting centers pose a risk of coronavirus infection.

Facebook also pledged to remove posts that seek to intimidate voters, and took down a video from Donald Trump Jr. calling for an "army" of volunteers to protect polling stations.

Disinformation

Facebook has heavily invested in rooting out false information that could have an impact on voting.

It has fact-checking partnerships with dozens of media organizations, including AFP, which can result in a post being labeled or its sharing limited.

Facebook has also decided to place labels on media outlets that are state controlled, both financially and editorially, and to block these organizations from running ads targeting Americans.

Facebook co-founder and CEO Mark Zuckerberg was called before lawmakers in 2018 to answer questions on users' personal information harvested by Cambridge Analytica, a British political consulting firm linked to the Trump campaign

Manipulation

Facebook has for months been battling what it calls "coordinated inauthentic behavior," its term for networks of fake or disguised accounts used to spread political misinformation.

Many of these influence campaigns have been from overseas, including from Russia and Iran. But a domestic-based group using fake accounts to praise Trump was recently banned as part of the platform's latest crackdown on orchestrated deception.

Facebook and other platforms are on the watch for potential "hack and leak" operations that could be used to spread false information about a candidate. To that end, it has barred new political ads in the week before the November 3 election.

And it has taken steps to improve the security of political candidates' accounts to guard against takeovers by bad actors.

Election day scenarios

Facebook has also developed contingency plans in case of a disputed election result, and the potential for unrest.

It has banned political ads after polls close on election day.

Any posts prematurely declaring a winner or contesting the count will be labeled with reliable information from news outlets and election officials.

"If a candidate or party declares premature victory before a race is called by major media outlets, we will add more specific information in the notifications that counting is still in progress and no winner has been determined," said vice president of integrity Guy Rosen.

Nick Clegg, a former deputy British prime minister who is Facebook's head of global affairs, said the top social media platform could take exceptional steps to "restrict the circulation of content" in case of turmoil.

"We have acted aggressively in other parts of the world where we think that there is real civic instability and we obviously have the tools to do that," Clegg said in an interview published Tuesday in the Financial Times, in comments confirmed by AFP.

The comments are in line with prior reports that Facebook could deploy a "kill switch" to thwart the spread of misinformation in the event of a dispute over US election results.