The Great Trust Crisis of 2020

Just as the Great Financial Crisis made us question whether our market economy is hopelessly distorted, the Great Trust Crisis should make us question whether the marketplace of ideas is irreparably corrupted. 

The pandemic tears through America, protests for social justice roll across the nation, and our country faces one of the most consequential elections in memory. 

Yet we can’t trust what we see on social media because the information marketplace is rigged. 

Every event is accompanied by disinformation and conspiracies: Covid-19 is a bio-weapon; social justice protests are led by antifa; mail-in votes will undercut the election.

It would now be a surprise for a significant news story not to be accompanied by profiteers, trolls or foreign governments generating false explanations and crackpot theories. 

Messages about wearing masks to prevent Covid-19 are boring, while conspiracies about masks as social control or causing health problems speak to people’s darkest fears and justify the selfish behavior of Karens and Kens across the U.S.

Conspiracists and trolls are the equivalent of high frequency traders in the market of ideas, while legitimate sources of information are manually inputting buys and sells at E-Trade.

The information marketplace will not correct itself organically, and there will be no V-shaped recovery for truth.  To fix the information ecosystem, major changes are needed.  

The World Health Organization, in the face of misinformation around the Covid-19 pandemic, convened a conference to educate the public about infodemiology. John Oliver enlisted celebrities to educate their fans about media literacy. 

But much more must be done.  We propose three actions to make the information market fairer.

Alternative pathways for information

We can unleash a bull market for trust if we empower community leaders to be as loud as the profiteers and trolls.  To do so, we must change the way social media highlights material and makes money.  

The first step is to develop alternative information pathways for local leaders who are already trusted by their communities — school superintendents, pastors, doctors, city authorities, community leaders and the like.

But even the most trusted local leader has a fraction of the social media following of a conspiracist who speaks to a national audience.  And social media algorithms are amoral: they amplify information that keeps users on the platform and profits coming in, not because it is useful or trustworthy. 

The Auxiliary, a crisis-communications firm started by two of us at the beginning of the pandemic, is piloting a project that may offer some solutions.  

To fight social media-sourced rumors that have devastated residents of the public housing authority in Albany, Georgia, we launched a campaign to get residents to sign up for a “First-to-Know” system, which allows the authority’s leader to send important information directly to residents through smartphone notifications.

The system is powered by UgoRound, a community-alerting platform that requires no data from users and doesn’t collect any. Instead of monetizing user data like most apps, the company charges a maintenance fee to the housing authority.  There is no financial benefit for more salacious or inflammatory content, and only credible and authoritative voices are able to send alerts.  

Regulating speech on the internet 

The government regulates the stock market.  It regulates communications on our public airwaves.  Libel laws constrain publishers of books and newspapers to meet standards of accuracy and objectivity.  The internet should be no different.

Platforms should be held liable for running the digital equivalent of boiler rooms that sell fraudulent stocks, destroying trust in the information marketplace. 

Treating tech companies like other publishers

When social media companies make money by disseminating news, they call it a business model.  When they are called on to be responsible for what is published, they decry being asked to be “arbiters of truth.”

They can no longer have it both ways.  Social media platforms should not provide the purveyors of lies access to their enormous audiences. Even on private companies’ platforms, not bound by the First Amendment, threats and hate speech should not be welcome. 

Some platforms recently took a stand against public health misinformation and unsubstantiated statements about voting — all tech companies must take similar stances. 

Like the bank runs that marked the Great Financial Crisis, disinformation is the symptom, not the cause, of our Great Trust Crisis.  The cause is that the institutions we traditionally rely on to make good faith efforts to traffic in substantiated facts have found it more profitable to discard their epistemic values.  

To clean up the market for information, we must return to those values.  We need to rebuild trust in the institutions that adhere to facts, de-platform voices committed to spreading unverified information that could cause harm, and support business models that balance profit with public good.
