Facebook Moves to Stop Election Misinformation

The social network said it would block new political ads in late October, among other measures, to reduce misinformation and interference.

Credit: Photographs by Erin Schaff/The New York Times

By Mike Isaac

SAN FRANCISCO — Facebook on Thursday moved to clamp down on any confusion about the November election on its service, rolling out a sweeping set of changes to try to limit voter misinformation and prevent interference from President Trump and other politicians.

In an acknowledgment of how powerful its effect on public discourse can be, Facebook said it planned to bar any new political ads on its site in the week before Election Day. The social network said it would also strengthen measures against posts that try to dissuade people from voting. After the election, it said, it would quash any candidate’s attempt to claim a false victory by redirecting users to accurate information on the results.

Facebook is bracing for what is set to be a highly contentious presidential election. With two months to go, President Trump and Joseph R. Biden Jr. have ratcheted up their attacks against each other, clashing over issues including the coronavirus pandemic and racial unrest. Mr. Trump, who uses social media as a megaphone, has suggested that even when the results are in, he may not accept them, and he has questioned the legitimacy of mail-in voting.

Mark Zuckerberg, Facebook’s chief executive, wrote in a post on Thursday that the divisions in the United States and the prospect of taking days or weeks to finalize election results could lead “to an increased risk of civil unrest across the country.”

Facebook’s changes indicate how proactive the Silicon Valley company has become on election interference, especially after it was slow to react to Russians using the service in 2016 to sway the American electorate and promote Mr. Trump. Since then, Mr. Zuckerberg has worked to prevent the social network from being misused, aiming to turn the tide of negative perception about his company.

But Facebook’s moves may already be too little, too late, critics said. Some of the measures, such as the blocking of new political ads a week before Election Day, are temporary. Misinformation and other toxic content also flow freely on Facebook outside of ads, including in private Facebook groups and in ordinary user posts, which the company’s changes do not address.

Some of the actions may unintentionally make Facebook even more politicized before the election, critics said. When political ads are blocked on the site, right-wing publishers on Facebook, such as Breitbart and Fox News, could fill the vacuum, said Tara McGowan, the chief executive of the liberal nonprofit group Acronym.

“By banning new political ads in the final critical days of the 2020 election, Facebook has decided to tip the scales of the election to those with the greatest followings on Facebook — and that includes President Trump and the right-wing media that serves him,” she said in a statement.

The Trump campaign disputed that, saying people would instead be influenced on Facebook by ads from “biased” media. It added that the social network was censoring politicians.

“When millions of voters will be making their decisions, the president will be silenced by the Silicon Valley mafia, who will at the same time allow corporate media to run their biased ads to swing voters in key states,” said Samantha Zager, a Trump campaign spokeswoman.

Hours after rolling out its changes, Facebook applied its new rules to one of Mr. Trump’s posts on his Facebook page, in which he cast doubt on the vote-by-mail process. The company added a warning label that read, “Voting by mail has a long history of trustworthiness in the U.S. and the same is predicted this year.”

Mr. Biden’s campaign did not immediately respond to a request for comment.

Other social media companies, including YouTube and Twitter, have also moved to minimize political manipulation on their platforms. Twitter banned political advertising last year and has added labels to politicians’ tweets. On Thursday, Twitter also added a label to the tweet sent by Mr. Trump about voting that echoed his language on Facebook. YouTube has confirmed that it was holding conversations on postelection strategy, but has declined to elaborate.

Facebook, a key battleground for both presidential campaigns, has drawn the most attention because of its billions of users. It has faced increasing scrutiny in recent months as domestic misinformation about this year’s election has proliferated. Yet Mr. Zuckerberg has declined to remove much of that false information, saying that Facebook supports free speech and that politicians’ posts are newsworthy. Many of the company’s own employees have objected to that position.

On Tuesday, Facebook said the Kremlin-backed group that interfered in the 2016 presidential election, the Internet Research Agency, had tried to meddle on its service again using fake accounts and a website set up to look like a left-wing news site. Facebook, which was warned by the Federal Bureau of Investigation about the Russian effort, said it had removed the fake accounts and news site before they gained much traction.

In his post, Mr. Zuckerberg said Facebook had removed over 100 networks worldwide in the last four years that were trying to influence elections. But increasingly, the threats to undermine the legitimacy of the November election were coming “from within our own borders,” he said.

As a result, Facebook said, it will begin barring politicians from placing new ads on Facebook and Instagram, the photo-sharing service it owns, on Oct. 27. Existing political ads will not be affected. Political candidates will still be able to adjust both the groups of people their existing ads are targeting and the amount of money they spend on those ads. They can resume running new political ads after Election Day, the company said.

In another change, Facebook said it would place a voting information center — a hub for accurate, up-to-date information on how, when and where to register to vote — at the top of its News Feed through Election Day. The company had rolled out the voter information center in June and has continued promoting it, with a goal of registering four million people and encouraging them to vote.

To curb voting misinformation, Facebook said, it will remove posts that tell people they will catch Covid-19 if they vote. For posts that used the coronavirus to discourage people from voting in other, less obvious ways, the company said it would attach a label and link to its voter information center.

Facebook also widened its removals to cover posts that implicitly, not just explicitly, aim to disenfranchise voters; previously, the company took down only posts that actively discouraged people from voting. Now a post that sows confusion about who is eligible to vote or about some part of the voting process, such as a misstatement about what documentation is needed to receive a ballot, will also be removed.

The company added that it would limit the number of people to whom users could forward messages in its Messenger app to five, down from more than 150. The move mirrors a 2018 change at WhatsApp, the messaging app Facebook also owns, which cut message forwarding to 20 recipients from a previous maximum of 250; WhatsApp now limits forwarding to five.

Misinformation across private communication channels is a much more difficult problem to tackle than on public social networks because it is hidden. Limiting message forwarding could slow that spread.

To get accurate information on the election’s results, Facebook said, it plans to work with Reuters, the news organization, to provide verified results to the voting information center. If any candidate tried to declare victory falsely or preemptively, Facebook said, it would add a label to those posts directing users to the official outcome.

Facebook teams have worked for months to walk through different contingency plans for how to handle the election. The company has built an arsenal of tools and products to safeguard elections in the past four years. It has also invited people in government, think tanks and academia to participate.

In recent months, Mr. Zuckerberg and some of his lieutenants had started holding daily meetings about minimizing how the platform could be used to dispute the election, people with knowledge of the company have said. Last month, Facebook employees asked how the social network would act if Mr. Trump tried to cast doubt on the election results, and Mr. Zuckerberg, at a staff meeting, said he found the president’s comment on mail-in voting “quite troubling.”

The chief executive helped drive the new election-related changes, according to two people familiar with the company, who declined to be identified because the details are confidential. On Tuesday, Mr. Zuckerberg and his wife, Dr. Priscilla Chan, separately donated $300 million to support voting infrastructure and security efforts.

Mr. Zuckerberg added that Facebook would not make any further changes to its site between now and when there was an official election result.

“It’s going to take a concerted effort by all of us — political parties and candidates, election authorities, the media and social networks, and ultimately voters as well — to live up to our responsibilities,” he said.

Maggie Haberman and Nick Corasaniti contributed reporting.

Updated  Sept. 4, 2020

