A Proposal To Redress Malicious Social Media Posts Targeting Public Figures

The FCC’s Personal Attack Rule, struck down in 2000, should be resurrected. By Justin Hendrix and Bryan Jones.

Another day, and yet another incident that illustrates the role of social media platforms in facilitating and accelerating the spread of toxicity in public discourse. “Distorted videos of House Speaker Nancy Pelosi (D-Calif.), altered to make her sound as if she’s drunkenly slurring her words, are spreading rapidly across social media, highlighting how political disinformation that clouds public understanding can now grow at the speed of the Web,” the Washington Post reported, with one video racking up nearly 3 million views in under 24 hours.

Later that same day, President Trump tweeted a different video, edited to show Pelosi tripping over her words. While the clip Trump shared was not manipulated to the same degree as the video that went viral, which relied on slowing down certain frames, a forensic expert nevertheless told NBC News it is “highly deceptive as it compiles in rapid succession relatively small verbal stumbles in an attempt to portray Speaker Pelosi as stumbling through her press conference.”

Yet again, the videos in question spread rapidly and were viewed by millions even though their veracity was quickly challenged. It seems exceedingly difficult to find a remedy for the spread of malicious, altered videos, audio, or photos on social media platforms. There are many reasons for this. One is that the virality of such content is precisely what the platforms were designed for, and how they make money. Another is that Section 230 of the Communications Decency Act gives the platforms broad immunity from liability for content posted by their users. A further argument, in our view a disingenuous one, comes from the platforms themselves: that it is not their role to moderate content.

But what if we were to institute a rule for such incidents, to ensure redress and to expose the truth to everyone who saw the content in question? Regulators could require that when a maliciously doctored video that deliberately maligns a political leader is discovered, any platform that distributed it must insert a corrective post into an equivalent number of feeds, targeted with the same attributes as the original distribution or, ideally, delivered to the precise users who saw the malicious content. Of course, such a rule would have to be used sparingly and governed by an expert review committee, and difficult judgment calls would no doubt have to be made over satire, art, and other fair uses. But, as the sketch below suggests, there is nothing technically impossible about the proposition.
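Platforms already record which accounts were shown a given post and which targeting attributes drove that distribution; matching a correction to the same audience is ordinary bookkeeping. As a purely illustrative sketch, assuming a hypothetical impression log rather than any real platform’s API, the logic might look something like this in Python:

```python
# Hypothetical sketch only: the data structures and function names here are
# invented for illustration and do not reflect any actual platform's systems.

from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Impression:
    user_id: str    # account whose feed showed the post
    post_id: str    # the post that was shown
    region: str     # example targeting attribute
    age_band: str   # example targeting attribute


def correction_audience(impressions: List[Impression], flagged_post_id: str) -> List[str]:
    """Return the exact users whose feeds showed the flagged post."""
    return sorted({i.user_id for i in impressions if i.post_id == flagged_post_id})


def fallback_attribute_mix(
    impressions: List[Impression], flagged_post_id: str
) -> Dict[Tuple[str, str], float]:
    """If user-level delivery is not possible, fall back to matching the
    attribute mix of the original distribution (share of impressions per
    region/age-band combination)."""
    relevant = [i for i in impressions if i.post_id == flagged_post_id]
    total = len(relevant) or 1
    mix: Dict[Tuple[str, str], int] = {}
    for i in relevant:
        key = (i.region, i.age_band)
        mix[key] = mix.get(key, 0) + 1
    return {key: count / total for key, count in mix.items()}


if __name__ == "__main__":
    log = [
        Impression("u1", "p-doctored", "US-CA", "35-44"),
        Impression("u2", "p-doctored", "US-TX", "55-64"),
        Impression("u2", "p-other", "US-TX", "55-64"),
        Impression("u3", "p-doctored", "US-CA", "35-44"),
    ]
    print(correction_audience(log, "p-doctored"))       # ['u1', 'u2', 'u3']
    print(fallback_attribute_mix(log, "p-doctored"))    # target mix for the corrective post
```

The point is not these particular data structures, which are invented here, but that matching a corrective post to an earlier distribution is trivial compared to the ad-targeting machinery the platforms operate every day.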

What we’re proposing isn’t necessarily new; in fact, we’ve had similar rules before. Specifically, we might look to the Fairness Doctrine, instituted by the Federal Communications Commission (FCC) in 1949, in the early days of broadcast television, and eliminated in 1987. It came to include a personal attack rule that allowed a person targeted to seek redress and the opportunity to address the audience before which he or she had been impugned. Pursuant to the FCC’s own regulations, the personal attack rule applied “when ‘an attack is made upon the honesty, character, integrity, or like personal qualities of an identified person or group’ during origination cablecasting concerning controversial issues of public importance.” Under this rule, media owners were required to notify the target and identify the offending broadcast; provide a script, tape, or accurate summary of the attack; and offer a reasonable opportunity to respond. The personal attack rule stayed on the books until 2000.

The history of the personal attack rule mirrors present-day events. In 1964, the journalist Fred Cook wrote a book called Goldwater: Extremist on the Right and was subsequently attacked by a conservative evangelist named Billy James Hargis on his Christian Crusade radio program, carried by WGCB, a station in Red Lion, Pennsylvania.

Cook petitioned the FCC for redress, and the commission ordered Red Lion to send him a transcript of the broadcast and provide him equitable reply time, a decision that ultimately led the FCC to codify a specific rule on personal attacks. Eventually, the Supreme Court upheld the Fairness Doctrine in a 1969 judgment in favor of the FCC.

Notably, the original personal attack rule did not require taking down the original content. This is an important aspect of our proposal as well: it does not wade into debates over censorship or infringe on free speech. Rather, the idea is to create more speech, not less.

For a variety of reasons, including the massive amounts of lobbying money the social media platforms spend on Capitol Hill to avoid culpability, we have collectively decided we cannot do anything about these incidents other than wring our hands. In past decades, we were not so confused. In an age in which technology makes it easy to manipulate media and instantly reach billions, perhaps we need to look for the wisdom in the old rules.