
Warren vs. Facebook: What's Really Happening?
Last week, Senator Elizabeth Warren published a Medium post outlining her proposal to apply antitrust regulation to big tech. Following that post, Sen. Warren ran several anti-Facebook messages as ads on Facebook. Those ads were blocked, producing the rare moment in which Senator Ted Cruz and Senator Warren were aligned in bashing Facebook. But is the criticism deserved?
A few of Protego Press’ editors discussed the issues:
Karen Kornbluh: This episode points to a fundamental problem: companies like Facebook are so powerful that they function as the public square, yet they operate without clear rules or full transparency about how they use that power. That Facebook can silence a message from a prominent Senator and leading presidential candidate, even just temporarily, is a challenge for our democratic dialogue. That's especially true in instances when questions arise about whether their interests (financial or political) are involved.
Renee DiResta: I think there's plenty of basis to argue for regulating companies like Facebook, but I believe this episode is being blown wildly out of proportion. Facebook has clearly articulated policies about running ads that use its logo, or alterations of that logo, and that is what triggered the blocking of Senator Warren's ads here. Sometimes you run afoul of a policy. In this particular case, the policy against using Facebook's logo appears designed to prevent ads that create the impression Facebook is somehow involved, which could be used to scam people. The algorithm halts the ad, and the ads manager spells out which policy you violated. You can correct it, and usually once you make the change, the ad runs again. That seems to be what happened here, and that's not "censorship."
Dipayan Ghosh: But that doesn't undermine Karen's point: these companies have tremendous power over the market and over individual communications, and a great deal is at stake when speech gets caught up in their internal decision-making policies and their content-blocking AI. Companies like Facebook should have checks in place to make sure this doesn't happen. It is fair for Senator Warren to make a big deal of this mishap, especially if Facebook's policies are outdated. Who knows whether this has happened to others who didn't realize it, or who lacked the knowledge or resources to make the needed adjustment?
Joshua Geltzer: Ultimately, algorithms must drive decisions at this scale to some degree, and in some instances they'll trigger the sort of incongruous result Facebook's algorithm produced here. That may not be censorship: it's not intentional, it's not lasting, and it's not directed at particular, unwanted speech. But one can agree with Renee on that and still agree with Karen and Dipayan that this episode shows the incredible power each of these companies holds, while also raising questions about those who get "trapped" in an algorithm-driven outcome with less knowledge of, and less clout to fix, the situation than Senator Warren had here.
Justin Hendrix: Absolutely. But one thing that would be very helpful for Facebook to do in this case is give a forensic account of how this decision was made. It would be very informative. The Washington Post reported that we don't know "whether it was human reviewers or the company's artificial intelligence tools" that flagged these ads for removal. We need to know, so that the public has some insight into how these things happen. Otherwise, conspiracy theories will take root. Academic research has shown that folk theories about content moderation are fertile soil for conspiracists.