Facebook on Wednesday, October 7, said it will stop running political or social issue ads after the US polls close on November 3 to reduce chances of confusion or abuse.
The leading social network also said that any posts prematurely declaring a winner or contesting the count will be labeled with reliable information from news outlets and election officials.
“If a candidate or party declares premature victory before a race is called by major media outlets, we will add more specific information in the notifications that counting is still in progress and no winner has been determined,” said vice president of integrity Guy Rosen.
Facebook and other social networks have been tightening rules as they gear up for post-election scenarios, including efforts by President Donald Trump to wrongly claim victory or contend the outcome is not legitimate.
The California-based internet giant has been under pressure to avoid being used to spread misinformation and inflame social division, as was the case during the presidential election in 2016.
Policies against voter intimidation that Facebook instituted four years ago have been steadily expanded to account for new trends and tactics used to intimidate voters or prevent voting, according to vice president of content policy Monika Bickert.
“As we head into the last days of this election, we know we will see spikes in efforts to intimidate voters,” Bickert said at a press briefing.
Wednesday’s tightening of rules included barring posts that reference weapons or use militarized language in encouraging people to monitor polling places on election day, according to Bickert.
“We will remove statements of intent or advocacy to go to an election site with military language,” Bickert said.
“We will also remove calls to go to polls to monitor if it involves exerting control or showing power.”
Facebook has already banned posts directly urging people to go to polling places with weapons or to stop people from voting.
Tracking ‘viral’ posts
Facebook has more than 35,000 people across the company devoted to safety and security, and teams have worked on more than 200 elections around the world, according to Rosen.
More than 120,000 pieces of content on Facebook and Instagram in the US have been removed for violating policies against voter interference, Rosen said.
He added that warning labels were placed on more than 150 million pieces of content viewed on Facebook in the US that had been debunked by third-party fact-checkers.
Meanwhile, more than 39 million people have visited a voting information center Facebook set up this year, and an estimated 2.5 million people have been helped to register to vote, according to the social network.
An election operations center set up at Facebook’s headquarters in Silicon Valley went virtual at the onset of the Covid-19 pandemic and will continue to run through the election.
Facebook has also devised a system to flag posts that may “go viral,” no matter the topic, so it can check in real time whether they are related to voting or the election, Rosen said.
Facebook has contingency plans to block some content on its platform if civil unrest breaks out after the November 3 election.
Nick Clegg, a former deputy British prime minister who is Facebook’s head of global affairs, said recently that the social platform could take exceptional steps to “restrict the circulation of content” in case of turmoil.
The comments are in line with reports that Facebook could deploy a “kill switch” to thwart the spread of misinformation in the event of a dispute over US election results.
Clegg said: “There are some break-glass options available to us if there really is an extremely chaotic and, worse still, violent set of circumstances.”
Social networks have faced pressure to curb political misinformation, both from foreign actors and from groups within the United States.
Some activists have called on Facebook to take a more aggressive stand on false statements from Trump himself, even as the platform has said it would steer clear of blocking political speech. – Rappler.com