The Lessons Facebook Ought To Learn About Inclusion

The Department of Housing and Urban Development (HUD) has filed charges against Facebook alleging that the company violated the Fair Housing Act in its sale of housing advertisements. HUD claims that Facebook allowed housing advertisers to pick and choose which race, gender, or age group could see their ads while excluding other protected groups. As Sarah Hunt, my fellow co-founder of the Rainey Center, remarked on the issue, “algorithms are only as moral as the people who write them.”

This recent crackdown highlights the concern that Facebook’s practices have a disparate impact on communities of color. HUD Secretary Ben Carson stated, “Facebook is discriminating against people based upon who they are and where they live.” HUD contends that Facebook went so far as to give ad buyers the ability to decide who could or could not see their ads based on geographic location and the languages prospective home buyers speak. This practice, known as redlining, has long haunted communities of color.

This is the second alleged instance of discrimination against protected communities on Facebook. In the first incident, the Internet Research Agency (IRA) was allowed to set up disinformation operations on Facebook that disproportionately targeted Latino and African American communities. Whether Facebook intended this discrimination or not is beside the point: the company needs to commit to a diverse workforce that will notice discriminatory product features and stop them during the design phase, before they are unleashed on the public. These examples, nefarious or incidental, offer two important points for consideration.

First, Facebook needs to take action to address the weaknesses in its systems. Algorithms don’t set their own parameters. Even if the system allowed these discriminatory practices to occur inadvertently, quality checks and approval steps could be put in place to catch such glaring lapses in equity. However, that process, much like the algorithms themselves, would only be as good as the knowledge of the people tasked with maintaining it. If designers and engineers aren’t considering how these tools can be used for illicit or outright illegal practices, then, as we’ve now witnessed in multiple instances with Facebook, they won’t prevent such immoral actions from taking place.
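To make the idea of a quality check concrete, here is a minimal sketch of what a pre-approval review step might look like. This is purely illustrative: the attribute list, function names, and data format are assumptions of mine and do not describe Facebook’s actual ad systems. The point is simply that targeting rules keyed on protected characteristics can be flagged automatically for human review before an ad ever runs.

```python
# Illustrative sketch only. A hypothetical pre-approval check that flags
# ad-targeting rules touching protected attributes before an ad goes live.
# The attribute list and targeting format are invented for this example.

PROTECTED_ATTRIBUTES = {
    "race", "ethnicity", "national_origin", "religion", "sex",
    "familial_status", "disability", "age", "zip_code", "language",
}

def review_targeting(targeting_spec: dict) -> list:
    """Return human-readable flags for any include/exclude rule that keys
    on a protected attribute. An empty list means nothing was flagged."""
    flags = []
    for rule_type in ("include", "exclude"):
        for attribute, values in targeting_spec.get(rule_type, {}).items():
            if attribute in PROTECTED_ATTRIBUTES:
                flags.append(
                    f"{rule_type} rule on protected attribute "
                    f"'{attribute}': {values}"
                )
    return flags

if __name__ == "__main__":
    # A housing ad that excludes viewers by ZIP code and language would be
    # held for human review instead of being approved automatically.
    spec = {
        "include": {"interests": ["real estate"]},
        "exclude": {"zip_code": ["60644"], "language": ["es"]},
    }
    for flag in review_targeting(spec) or ["no flags - eligible for approval"]:
        print(flag)
```

Of course, a check like this is only a backstop: someone has to decide which attributes belong on the protected list and what happens when an ad is flagged, which is exactly why the people maintaining the process matter as much as the process itself.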

Second, like many corporations, Facebook espouses the strength of diversity in the workplace and looks to highlight those efforts in recruitment so that its workforce better reflects the community it serves. While these ideals are admirable, Facebook needs to implement or improve its policies to ensure that the engineers who build these systems reflect that diversity.

It is generally difficult for tech companies to recruit from communities of color, but what those companies can do is work to change their culture to address the issues facing marginalized communities today. A place to start would be speaking with leaders from those groups. When I spoke to a member of Facebook’s global political team recently, they stressed the company’s effort to connect with grasstops leaders to effect change within its own practices. Further, the company recently announced that it would ban all white supremacist, white separatist, and white nationalist content and discussion from its site in an effort to aid the fight against hate and bigotry. While these are no small undertakings, they represent only a portion of the solution to Facebook’s discrimination problems.

To truly ensure that communities of color are not only represented but also protected and served by Facebook’s platform, it is crucial that these groups be represented at every stage of the business processes and services that social media companies provide. If a tool allows users to discriminate against a specific group, a member of that community who sees it in development is far more likely to flag it and report it to company leaders. If few or no members of that community are part of the process, however, a company is likely to find itself in the same legal hot seat Facebook now occupies.

If Facebook is serious about preventing discrimination on its platform, then it would do well to determine why these discriminatory incidents keep happening and to include more people of color in the system design process.