
Advertising After Facebook
Without federal action on political advertising rules, history is bound to repeat itself
As the 2020 election rapidly approaches, campaigns are ramping up their final round of online ad spending, reaching tens of millions of people for pennies on the dollar. Two years ago, online political ads generated tens of billions of views, and total online political ad spending is expected to be 200 percent higher in 2020 than in the last presidential election. Both campaigns, as well as countless outside groups, have already spent millions pushing their messages online.
However, unlike the political ads you hear on the radio or see on TV, campaign ads on social media don’t have to disclose their funding and sometimes even appear to be posts from friends. Activists have spent years lobbying platforms like Facebook to strengthen their advertiser disclosure requirements and ban ads containing misleading claims.
While some progress on digital political ad regulation has been made, the only way to permanently reduce disinformation and voter suppression in political advertising is through federal action. Only the government can set mandatory, platform-agnostic, and proactive standards for all social media companies.
Swift federal action is needed now more than ever in the midst of the 2020 election — online advertising has already been used in this and past election cycles to spread disinformation and enable voter suppression across the country. This past winter, the Bloomberg campaign paid "deputy digital organizers" to post on social media about the campaign without requiring them to disclose that they were paid. It was easy to mistake these paid advertisements for posts from regular people excited about the campaign. In the 2016 presidential election, Trump campaign officials explained in detail how they used microtargeted Facebook ads to reduce turnout among likely Clinton voters.
In 2020, unregulated political ads could be further used to catalyze confusion with mail-in voting, polling locations and even the results of the general election.
The Federal Election Commission could make great strides in reducing disinformation in political ads. Under current rules, online political statements have special status that allows their funders to avoid disclosing most paid content. The FEC — once it regains a quorum — could remove these self-imposed handcuffs and extend to online political ads the broad electioneering rules it already applies to TV and radio.
In addition, the FEC's authority allows its regulations to be proactive, not just reactive. The commission can impose regulations not only on existing platforms but on future ones as well. Engineers and product managers at companies like Facebook could enact changes knowing they have a longer shelf life than whatever seems best in the current political moment. Future social media companies would have to learn the rules of the road before building their platforms.
But most importantly, progress made lobbying a company like Facebook to change its platform would not be lost when a more popular platform dethrones it, because this progress would be incorporated into concrete federal regulation. We already see that teens and young adults prefer newer platforms like TikTok. If these future voters abandon Facebook for good, the effort and activism that went into reforming Facebook's political ad policies must not be lost. Currently, even if activists managed to persuade Facebook to concede to all of their demands, the "next" Facebook could ignore those precedents. Such a platform might value free speech above all, or have its own ideas about what speech should be restricted. If the federal government preserved this progress in federal regulation, activists would not have to start from scratch with each new up-and-comer.
Facebook is not the last battle in the fight against online disinformation in political advertising. We must plan not only for the next three weeks, but also for the next three decades. That means the federal government needs to step in. Only when federal election regulators take matters into their own hands and impose new rules will we make lasting, permanent progress in protecting our democracy online.
Elizabeth Allendorf is an artificial intelligence engineer and a fellow at the Aspen Tech Policy Hub.