A new report from the NYU Stern Center for Business and Human Rights, Who Moderates the Social Media Giants? A Call to End Outsourcing, examines the means and methods that technology giants use to confront illicit content. It concludes that the current approach at these firms is, on balance, grossly inadequate and in need of serious reform.
“It’s a mission-critical function, but you fulfill it with your most precarious employees, who technically aren’t even your employees,” notes Sarah Roberts, an information studies expert at the University of California, Los Angeles, as quoted in the report.
“When you run an operation of this size, will there be situations that come up that are not optimal? Yes,” notes Arun Chandra, Facebook’s vice president for scaled operations. The problem is that such “situations” are occurring at a scale difficult to comprehend. Facebook removed or added warnings to nearly 4 billion pieces of content in the first quarter of 2020 alone. These include everything from spam to child nudity, violence, hate speech, and the posts of false accounts. And as the report notes, there is also a geographic, cultural, and linguistic dimension to consider: “the question remains whether Facebook has expanded into more countries than it’s prepared to safeguard from potential misuse of its platform.”
The authors see three major risks to business as usual: “first, that at-risk countries receive insufficient attention from moderators; second, that moderators’ mental health is not adequately protected; and third, that the outsourced environment is not conducive to the sort of careful content assessment that’s vital for user safety.”
The authors note that Facebook turned down repeated requests to visit a content moderation facility.
Here are the report’s summary recommendations:
Note: On Thursday, June 18, 2020, at 12:00 PM ET, I will lead a discussion and audience Q&A on the report with its author, Paul Barrett, Deputy Director of the NYU Stern Center for Business and Human Rights. He will be joined by two experts: Dipayan Ghosh of the Digital Platforms and Democracy Project at the Harvard Kennedy School, author of the brand-new book Terms of Disservice: How Silicon Valley Is Destructive by Design; and Sarah T. Roberts, Assistant Professor of Information Studies at the University of California, Los Angeles, and author of the pioneering 2019 book Behind the Screen: Content Moderation in the Shadows of Social Media. Free registration is available here.