Forget bitcoin; human information has become the true cryptocurrency of our world. Online discrimination and privacy breaches are happening at an alarming rate and affecting those most vulnerable in our society: minority groups, mothers, and the elderly.
This month, over two billion users will log on to Facebook, one billion users will use one of seven different Google products, and about 350 million people will browse their Twitter feeds. In the sage words of Spider-Man’s Uncle Ben: with great power comes great responsibility. Today’s fundamental question is just how much responsibility online platforms have for maintaining and safeguarding civil rights as custodians of their users’ data. Below I outline two of the major cases that reveal this glaring problem.
According to Facebook’s SEC filings, in 2017 alone, the company earned a whopping $40.65 billion, and 98 percent of this revenue came from third-party advertisers. So how does this multibillion-dollar advertising system work, and how does it affect housing discrimination?
Facebook’s advertising portal is extremely attractive to outside companies because it allows advertisers to hand-select their audiences based on “behavior profiles” (what they like and interact with), specific traits, and demographics. Don’t be surprised if you’ve fallen into a rabbit hole of Tasty videos and found yourself bombarded with whisk advertisements the next day. On a whiskless and more threatening note, ProPublica, a nonprofit newsroom that produces investigative journalism, first exposed Facebook’s enablement of housing discrimination back in October 2016. In response to this exposure, Facebook pledged to put a stop to discriminatory advertising and promote diversity by updating its ad policies.
Yet, you’re reading this blog — pledge or not, the problem has persisted. In November 2017, when ProPublica conducted a second investigation, they found that nothing substantive had changed. To test Facebook’s “improved” policies, ProPublica bought rental housing ads on Facebook, but asked that the ads not be shown to African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Latin America, or Spanish speakers. All of these advertisements were approved, even though these groups are specifically protected from housing discrimination under the federal Fair Housing Act. The only change from the previous investigation was that users’ ethnic affiliations were moved from the category of “demographics” to “behaviors.”
These reports of discrimination prompted the National Fair Housing Alliance and three other advocacy groups to sue Facebook for violating the Fair Housing Act. In their complaint, the groups acknowledged that Facebook had finally changed its advertising policy to block the use of race as an exclusionary category; yet they also argued that advertisers could still discriminate by using proxies for race or ethnicity (for example, having “liked” Telemundo can signal to advertisers that a user may be Hispanic).
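To make the proxy problem concrete, here is a minimal, hypothetical sketch. It is not Facebook’s actual system or API; the field names and user profiles are invented for illustration. It shows why relabeling an attribute from “demographics” to “behaviors” changes nothing in practice: a behavioral proxy can reproduce the exact same exclusion.

```python
# Hypothetical illustration only: all field names and profiles are
# invented; this is not Facebook's ad-targeting API.
users = [
    {"id": 1, "ethnic_affinity": "Hispanic", "likes": {"Telemundo"}},
    {"id": 2, "ethnic_affinity": "None", "likes": {"Tasty"}},
    {"id": 3, "ethnic_affinity": "Hispanic", "likes": {"Telemundo", "Tasty"}},
]

def audience(users, exclude):
    """Return the ids of users NOT filtered out by the exclusion rule."""
    return [u["id"] for u in users if not exclude(u)]

# Explicit "demographics" exclusion (the category Facebook said it blocked):
by_demographic = audience(users, lambda u: u["ethnic_affinity"] == "Hispanic")

# "Behaviors" exclusion using a proxy (what advertisers could still do):
by_proxy = audience(users, lambda u: "Telemundo" in u["likes"])

# On this toy data the two rules produce the identical audience,
# so blocking the explicit field alone does not stop the exclusion.
print(by_demographic, by_proxy)  # both print [2]
```

The design point is that the exclusion is a property of the *resulting audience*, not of the field name used to build it, which is why the advocacy groups argue a category relabel is not a fix.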
Is Housing Segregation Still a Problem?
Yes. The goal of the Fair Housing Act is to promote a unitary housing market where people’s backgrounds do not unfairly restrict their access to housing. The effects of redlining by banks and the federal government, along with other racially discriminatory policies, can still be seen in the segregation that exists in all of our major cities; just take a look at the racial segregation in the city closest to your home. The United States has a long history of excluding minority groups from neighborhoods with good housing, school systems, and public services.
Although policies like the Fair Housing Act promote progress, housing discrimination and racial segregation are still major injustices our country must continue to tackle. According to a recent study by the U.S. Department of Housing and Urban Development, minority home seekers are told about and shown fewer homes and apartments than whites. Today, online platforms should be acting as great democratizers: almost everyone uses Facebook and Google, from your coworkers to your neighbors to your Great Aunt Susie. Yet these platforms are providing easy tools to make sure that ads never reach minority populations that insidious third-party actors do not want to house. The time and cost for minorities to search for housing goes up, while their number of choices goes down.
In an added plot twist, Facebook has been actively pushing to expand its own reach in the real estate and rental market. The social media network has entered into a partnership with the popular rental sites Apartment List and Zumper. According to Facebook, since 2018, the search volume of Facebook’s Marketplace tab, where individuals can list their homes and other items for sale and for rent, has grown an astounding 300 percent globally. If Facebook is seeking to become a dominant player in the housing market, it certainly should not be evading fair housing laws.
Here We Go Again: The Case of Employment Discrimination
This past December, another investigation conducted by The New York Times and ProPublica revealed that companies used Facebook to keep older workers from viewing job advertisements. Verizon, Amazon, Goldman Sachs, Target, and even Facebook were all found to have posted recruitment ads with age restrictions that kept older users from ever seeing them. In response, the Communications Workers of America filed a class-action lawsuit in federal court in California. Using arguments similar to those in the housing discrimination cases, the CWA alleges that these companies are using data to discriminate based on age, in violation of the Age Discrimination in Employment Act of 1967.
A 2017 study conducted by economists at Tulane University and the University of California, Irvine shows that as job applicants get closer to retirement age, they are much less likely to hear back from recruiters than their younger counterparts. It is clear that online networks should not be making age discrimination any easier to accomplish.
Playing Three-Dimensional Chess
Maintaining civil rights online has become an oh-so-unironic game of three-dimensional chess. Like the housing case, the employment case points to new barriers that online platforms have enabled in the fight against discrimination and for civil liberties. Other examples of major online platforms enabling discrimination include Airbnb hosts refusing to rent to black users and ride-hailing apps producing longer wait times and higher cancellation rates for people of color. On the other hand, it is paramount to remember that the internet and platforms like Facebook and Twitter have also opened up the possibility for vulnerable groups to share and compare stories, find attorneys, and organize. The internet has forever changed how our personal information is used and how we interact with one another; now we must adapt to protect rights that were penned long before there was a keyboard to do the same.
Aside from Facebook currently acting as one big magnet for lawsuits, what do these court cases signal about the future responsibility platforms will have for protecting and upholding civil rights and the different ways society will try to enforce that responsibility?
Statutory law is where these cases get even more complicated in our digital age: Freedom of expression comes head to head with the equal pursuit of life, liberty, and property. Under Section 230 of the Communications Decency Act, Facebook, as an online platform, is not considered a publisher (as a newspaper would be) of content its users post. This immunizes it from many forms of legal liability.
Section 230 was created with good intentions: Previous to its passage, some online services were found to be “publishers” if they moderated content in any way, but not if they simply ran unmoderated platforms. Publishers are responsible for materials they publish just as if they were the speakers themselves—for instance, a newspaper is legally responsible for letters to the editor that it chooses to publish. Section 230 was designed, in part, to ensure that platforms could moderate content without this editorial function transforming them into “publishers.”
It’s important to note that Section 230 gives platforms more freedom to moderate content than they had before, while also removing the incentive to over-moderate. Without Section 230’s protection, websites might be incentivized to censor much more in order to avoid potential lawsuits. This threatens the sometimes life-saving tool the internet has given people, especially vulnerable communities, to organize and share their stories without fear of censorship.
This is not to say that Section 230 is perfect. Despite being intended to give platforms a greater ability to moderate content, some platforms may choose not to do so. Even one of Section 230’s authors has suggested it might need to be revised. Public Knowledge will return to this topic in the future, but for now, there’s an apparent conflict between Section 230 and justice, with respect to housing discrimination.
The obvious solution may be to amend Section 230 to ensure that it doesn’t shield platforms from liability in this circumstance. But that may not be necessary, since contrary to Facebook’s argument, it’s not at all clear that Section 230 even applies to this set of facts. In a previous Fair Housing Act suit, a federal appellate court found that if a platform helped “develop unlawful content,” then Section 230 simply doesn’t apply. This is because a platform that provides tools making it easy for users to break the law becomes the co-author of the user’s material. Thus, Facebook would be liable not for the conduct of its users, but for its own conduct.
Plaintiffs in lawsuits that involve Section 230 often try to claim that some action by a platform makes it directly liable for posted content. That argument usually fails, and even if it succeeds here, it would only make Facebook liable for discrimination facilitated by its targeting tools. Nevertheless, it is at least clear that Section 230 does not hand Facebook a virtual get-out-of-jail-free card, and the policy seems simple: Facebook should be held accountable to the rules of the Fair Housing Act and the Age Discrimination in Employment Act, especially to the extent that it makes discrimination as easy as choosing an option from a drop-down menu.
A New Way of Thinking
The ever-present danger is that discrimination can occur when we unknowingly reveal who we are; platforms monitor our every click and build behavior profiles about us, even when we do not expressly tell these sites who we are. The owners and designers of technology platforms need to consider their entire user base and the possibility of discrimination in their original and updated designs. The prevention of discrimination needs to become a priority, not an afterthought.
The public’s knowledge of these issues must be maintained. People from all communities should be informed about how their data can be used to profit from and perpetuate discrimination. Platforms are responsible for affording users their civil rights and blocking any discrimination that the platform’s design might enable. Our personal data drives the profits of America’s most powerful companies, and we should constantly be reminding them of that.