Early Thursday morning, September 3rd, Facebook announced sweeping changes in an effort to protect the US Presidential election. In a blog post, Facebook’s CEO Mark Zuckerberg wrote: “This election is not going to be business as usual. We all have a responsibility to protect our democracy. That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”
The moves to reduce misinformation range from blocking new political ads a week out to continuing to remove misinformation about voting to limiting forwarding on Messenger. But, as Justin Hendrix writes, “there is cause for great concern that news of these efforts is less of a reason to be optimistic about what is going to unfold, and more of a sign of the chaos to come.”
Protego Press rounded up several quick reactions from leading experts on the announcement. Have your own thoughts? Email us at [info] at ProtegoPress.com!
Bartlett Cleland, executive director of Rainey Center’s Leadership Alliance for a More Perfect Union (LAMP) program: “That FB is being so proactive is laudable. Limiting new content immediately before the election, about the election, seems a reasonable step to provide time for ads to be discussed and responded to before the polls close. They will no doubt be criticized or their actions misconstrued by some but what cannot be denied here is FB’s willingness to try to do the right thing.”
Bridgett Barrett, at University of North Carolina-Chapel Hill’s School of Media and Journalism: “First of all, I am very happy that Mark and Facebook more broadly seem to be recognizing the major threats to this election and (frankly) our democracy and are taking some (late) proactive steps to address some of them. All of these steps, of course, leave me with outstanding questions.
I think stopping new ads the week before the election is a smart middle ground that allows people’s get-out-the-vote efforts while limiting new claims from new events. This is a game-able rule (for instance, getting thousands of messages approved in advance, then increasing spend on them if that scenario actually turns up or that message is seen as useful) but overall a good move.
I am curious if this rule will be extended for weeks after the election as well. I am much less worried about the results of this election than I am worried about the damage to our democracy that the aftermath may cause in the event that the results are extremely close and particularly if Biden is declared the winner.
With the voting information center and the labels, everything will come down to execution. From an advertising background, I am very aware that people do not read or click disclaimers and getting people away from their newsfeed with things like that is very hard. So, how much information will appear in the newsfeed itself, not somewhere that must be clicked away to? What will the label say—will it actually make clear that the claims are false in the label, or just the meaningless “get the info” type of label that Facebook has been using lately?”
Bishop Garrison, Director of National Security Outreach at Human Rights First: “If Facebook announced it would ban political ads either starting now or – if there is a contractual issue with timing – when it was legally feasible to do so, many analysts and experts would have an easier time believing in Zuckerberg’s sincerity to change. A week prior to Election Day is insufficient to protect the system given many states have already begun the early voting process.”
Alice Stollmeyer, Founder and Executive Director of Defend Democracy: “Facebook must do more than virtue signalling. If they really want to help defend democracy, they should stop their platform from hosting and amplifying both paid and non-paid disinformation and misinformation operations — in all countries and 365 days per year.”
Joel Carter, Postgraduate Research Fellow at Protego Press: “Seems like FB has taken a step in the right direction; however, the details of the announcement bring into question the efficacy of this attempt to curb election misinformation. FB misses the mark (no pun intended) on a couple of things: 1) Preventing ads one week before the election isn’t enough time, especially in an election where mail-in ballots are expected to be used at high rates; 2) Does this mean political ads currently on the platform which contribute to misinformation will remain?; 3) Curbing misinformation means more than providing a single link; FB should “flood” its platform with accurate voting information; 4) How long will post-general election efforts last? Midterm election campaigns begin shortly after Nov. 3rd; if FB wants its efforts to be more than a PR stunt, it will have to commit to fighting election misinformation in the long term.”
Sarah Hunt, Co-founder and CEO of the Rainey Center: “State-sponsored disinformation and lies from candidates are as old as elections themselves. Facebook’s new policy is a late-coming private sector attempt to adapt a social media platform to manage modern expressions of these election security risks. But it’s no substitute for robust national security efforts by the government to remove and prevent state-sponsored election disinformation from our media. And it raises the question of how Facebook plans to address these problems on a global basis. Every democracy in which Facebook operates has these election security challenges. Will Facebook’s new policy be unique to the US 2020 election, or does it carry global implications? Today’s announcement provides no clarity there.”
Justin Sherman, research fellow at the Tech, Law, & Security Program at American University Washington College of Law and senior fellow at Ethical Tech at Duke University: “It’s almost laughable at this point that we’re supposed to be impressed with these incredibly small announcements that Facebook makes about its political content and advertising policies, announcements which have a heavy PR spin. Yet it’s very much not laughable, because the decisions Facebook has made and continues to make on these issues are having seriously damaging effects on public discourse headed into the election. The company is still, for example, going to run political ads that are blatant lies. It still refuses to take strong action against election falsehoods posted by the President, with a recent ‘label’ on a post not even explicitly saying the content of the post itself was false. That Mark Zuckerberg seems utterly unwilling to actually deal with some of the platform’s fundamental problems does not bode well for what happens if someone like the President himself continues lying about the election through November.”