Reconsidering the EARN IT Act

As families—and especially children—have been forced to spend more time online due to COVID-19, predators and other cybercriminals have also been lurking. A recent Europol report highlights how child sexual abusers are sharing more exploitative material due to the pandemic and are eagerly anticipating more children being online to groom and solicit. Child protection tip lines are reporting a surge in online child abuse and exploitation.

The pandemic underscores how the spread of exploitative material depicting children, often termed child sexual abuse material or CSAM, is a rampant and growing problem. Research surveys and newspaper exposés suggest the number of CSAM crimes is increasing. A shocking report in the New York Times last fall revealed that tech companies had reported 45 million online photos and videos of children being sexually abused in 2018—double the number reported the previous year. There is widespread recognition that something more must be done, and there has been bipartisan support in Congress to encourage companies to do more to combat CSAM.

The Eliminating Abusive and Rampant Neglect of Interactive Technologies Act, or EARN IT Act, has emerged as one likely legislative vehicle for addressing CSAM. Introduced in early March, the EARN IT Act would create a 19-member National Commission on Online Child Sexual Exploitation Prevention at the U.S. Department of Justice, with representatives from the Department of Homeland Security, the Federal Trade Commission, and civil society, including privacy experts, law enforcement, and CSAM victims. This new commission would be charged with developing sets of “best practices,” tailored to different products and business models, for addressing the sexual exploitation of children online. No one disagrees with the general merits of this goal.

Instead, the anticipated contents of these best practices and the bill’s mechanisms for encouraging their adoption have come under fire. The premise of the bill is that some legal immunity under Section 230 of the Communications Decency Act should be earned—that is, conditioned on certifying adoption of whatever the commission eventually proposes. Much has been written about Section 230, one feature of which is that it gives technology platforms tremendous discretion to deal with irresponsible, harmful, and dangerous content however they choose, or not at all.

The EARN IT Act could do much to bring a degree of harmony to the many different ways tech companies currently try to combat child exploitation online. The merits of this goal, however, have been drowned out by a chorus of hypothetical horribles. Critics have leveled both policy and constitutional arguments against the EARN IT Act with no more than a cursory acknowledgement that the bill’s underlying goal of establishing best practices is noble and necessary. Further, most of these criticisms are tied not to the actual text of the bill but to critics’ fears about what guidance could eventually emerge from the commission the bill creates. Critics do not want to have that discussion and, for all their criticism, have failed to articulate what fixes could allow the bill to achieve its stated goal. Thus, while critics have been quick to argue that the EARN IT Act is constitutionally suspect, these criticisms ultimately boil down to the claim that if the best practices were to require certain measures, a court might someday find them to run afoul of constitutional protections under the First and Fourth Amendments.

There are a lot of “ifs” there, and we should acknowledge where this argument leads. If statutes should be rejected as unconstitutional simply because an administrative or enforcement agency could ultimately interpret or implement them in a manner that violates the Constitution, then many of the country’s most iconic laws—from the Federal Trade Commission Act to the antitrust laws and civil rights statutes—share the same supposedly fatal flaw as the EARN IT Act. We must focus on what the law actually says if we are to evaluate it fairly.

The EARN IT Act Does Not Force Companies to Violate Users’ Fourth Amendment Rights

The Fourth Amendment protects individuals from unreasonable searches and seizures, but this protection applies only to the government’s actions unless a private company acts “as an instrument or agent of the Government.” Tech platforms and services currently are free to voluntarily scan for, report, and turn over CSAM, but when companies are forced to do so under color of law, they may be acting as agents of the state. In that case, the Fourth Amendment’s warrant requirements presumptively would apply, and failure to follow legal process would run the risk of having any CSAM found as a result thrown out of court.

However, concerns about the interaction between this constitutional requirement and the scanning of online content should not be taken to mean the best practices will automatically be unconstitutional. That assumption discounts both the composition of the commission and the process by which the best practices would ultimately be adopted. First, members of the commission who have served as prosecutors have every reason to take seriously the Fourth Amendment’s exclusionary rule; it is their cases that will be undermined if unconstitutional best practices lead courts to exclude evidence secured from tech companies. Federal agencies also have an important role to play in scrutinizing anything the commission produces, and Congress has the ultimate ability to approve what is produced. In short, critics’ arguments against the EARN IT Act require us to believe that prosecutors, police, tech companies, and survivor groups will all agree on best practices that cannot hold up in a court of law. This defies common sense.

None of this is to discount that the underlying issue is serious. The National Center for Missing and Exploited Children, a group at the center of several Fourth Amendment challenges to CSAM monitoring, has been viewed by some courts as a government actor. It has a strong incentive to avoid undermining efforts to address child exploitation, yet it has strongly championed the EARN IT Act.

It is also worth acknowledging that following the best practices is not mandatory. Critics insist that Section 230 immunity is so valuable as to make coercive any practice required to maintain it, but again, any penalty for noncompliance with the best practices depends on the commission’s final product. Furthermore, the EARN IT Act is not a wholesale repeal of Section 230. Instead, it addresses only a portion of a company’s wide-ranging Section 230 immunity—specifically, lawsuits involving child sexual exploitation laws. It is simply not fair to say that additional exposure to legal liability is the equivalent of a legal mandate for Fourth Amendment purposes, no matter how effective Section 230 has been at saving companies money by shielding them from protracted legal battles.

Going After CSAM Does Not Automatically Turn Companies Into Government-Commanded Speech Police in Violation of the First Amendment

The First Amendment arguments against the EARN IT Act again turn on what the best practices may or may not recommend, and again it is difficult to argue that the legislation is constitutionally problematic simply because a commission might theoretically produce rules that run afoul of the First Amendment. This dynamic is no different from the legal mandates of the Federal Trade Commission and Federal Communications Commission, yet few would argue that the mere creation of these entities raises constitutional problems.

Instead, opposition to the EARN IT Act always focuses on the worst-case scenarios. Under the guise of criticizing a “vague and expansive” list of potential areas for the commission to explore, opponents of the bill insist this commission will produce one monolithic and problematic slate of best practices, even as the EARN IT Act calls for alternatives that consider different business models and different types of products. Many federal laws include general, flexible instructions to regulators and law enforcers. The FTC Act prohibits “unfair or deceptive acts or practices,” while the Sherman Antitrust Act’s substantive core consists of just two provisions. The EARN IT Act is a model of clarity by contrast—and, unlike the Sherman Act, none of its provisions has the force of law until specific best practices are developed.

Critics then insist that any best practices that inform how tech platforms police content amount to unconstitutional censorship, but the EARN IT Act itself does not require anyone to do anything. It simply conditions some elements of Section 230 immunity on following the best practices. That implicates statutorily granted liability protection—not constitutional speech protection.

There’s a calculation for companies to make here. Critics point to case law stating that the government “may not condition the granting of a governmental privilege on individuals or entities doing things that amount to a violation of their First Amendment rights,” but these cases generally involve direct financial benefits—not legal protections such as Section 230, which Congress remains free to amend or repeal. The EARN IT Act only tweaks Section 230 protections with respect to CSAM, to which no First Amendment protection applies. Critics seem insistent that any conceivable set of best practices would somehow lead to censorship of otherwise legitimate speech. And, again, these concerns make no sense unless and until the commission promulgates regulations that have some impact on content that is even arguably protected by the First Amendment.

The Mere Specter of Encryption, Tech Policy’s Third Rail, Looms Large

All of these arguments are arguably a smokescreen for the real worry: that the EARN IT Act is a sneak attack on end-to-end encryption. Encryption is a vitally important security measure for protecting the confidentiality of communications, yet law enforcement officials repeatedly argue that certain forms of encryption are a major reason law enforcement is “going dark” in its ability to find evidence and follow the trail of crimes.

This tension is why critics in the advocacy and tech communities were arguing against the proposal before its final text was even unveiled. The bill’s sponsors have, in turn, found it difficult to satisfy a concern that predated the bill itself.

The text of the legislation does not even include the word “encryption.” But opponents suggest that the best practices may require companies to have some form of access to all content traversing their services, even if it is encrypted. Because encryption can fundamentally challenge the ability to prevent and identify child sexual exploitation, critics seized upon the notion that the bill was, functionally, an attack on companies’ ability to offer end-to-end encrypted services such as messaging or video chat. One effort to resolve this concern was to add expertise in cryptology as a qualification for some members of the commission, but critics took this as confirmation that the best practices were always really about targeting encryption.

At the risk of oversimplifying the concerns of the bill’s opponents, fears about the role of the Department of Justice, and especially the current Attorney General, both vocal critics of encryption, animate much of this debate. Some argue the Attorney General would have an outsized, puppet-master role over the commission’s final output, ignoring the roles of the Department of Homeland Security and the Federal Trade Commission, agencies with different priorities with respect to data and cybersecurity. Further, the 19-member commission would still need to adopt any anti-encryption best practices with a bipartisan supermajority that includes two members with expertise in consumer protection or privacy, two members with expertise in computer science or software engineering, and four members representing small and large tech companies. Best practices cannot even be adopted without robust support reflecting divergent constituencies. Even then, they must be approved by Congress and the President, creating more checks on this process than might exist at an independent federal agency.

While any best practices that categorically prohibited companies from offering end-to-end encrypted services would be controversial, it is unlikely that the commission will prioritize encryption in its deliberations. If the goal is to take encryption off the table entirely, it remains unclear what additional provisions or procedural guardrails would alleviate this concern short of an explicit prohibition barring the commission from addressing the subject. Even that approach would have the downside of sidelining useful ongoing conversations about how to make CSAM scanning tools work alongside end-to-end encryption, and it serves neither tech companies, victims, nor advocates for the commission to be walled off from those conversations. At its core, however, this worry seems to come down to a lack of trust in any commission to produce anything other than “pernicious regulations” and censorship.
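To make the technical tension concrete, consider a minimal, purely illustrative sketch of how today’s hash-based scanning tools generally work on an unencrypted service. The hash list, the scan_upload helper, and the use of SHA-256 are all hypothetical stand-ins here—real systems rely on perceptual hashes such as PhotoDNA, which tolerate re-encoding and cropping—but the sketch shows why end-to-end encryption changes the picture: the comparison requires plaintext, which an end-to-end encrypted server never sees.

```python
# Illustrative sketch only: server-side hash matching against a list of
# known abuse imagery. Real deployments use perceptual hashes (e.g.,
# PhotoDNA) rather than SHA-256, and hash lists come from clearinghouses
# such as NCMEC. All names and values below are placeholders.
import hashlib

# Hypothetical set of hex digests of known material. (The single entry
# here is just the SHA-256 of an empty file, so the demo below matches.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_upload(file_bytes: bytes) -> bool:
    """Return True if an uploaded file matches a known hash.

    This check only works when the server can read the plaintext bytes.
    On an end-to-end encrypted service the server receives only
    ciphertext, so any such comparison would have to move onto the
    user's device ("client-side scanning")--the crux of the debate.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(scan_upload(b""))  # True: matches the placeholder entry above
```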

Lambasting a commission charged with addressing legitimate threats to children is premature. It is not good enough to simply acknowledge that child exploitation and abuse are happening online and then offer up platitudes and commitments to do more. While many critics have rallied around a useful proposal that would provide more funding to federal law enforcement, the alternative cannot be to abandon any semblance of corporate accountability. Companies should be expected to do more. The EARN IT Act would force companies to the table to improve how they respond to CSAM online. Constitutional objections should be weighed if and when the commission’s best practices actually raise them, but, as a start, the best practices themselves are urgently needed. The EARN IT Act provides a valuable way to develop them.
