by Ariana Tobin and Ava Kofman
This story was originally published by ProPublica, a Pulitzer Prize-winning investigative newsroom.
In a settlement announced by the Department of Justice on Tuesday, Meta Platforms — formerly known as Facebook — has agreed to eliminate features in its advertising business that allow landlords, employers and credit agencies to discriminate against groups of people protected by federal civil rights laws.
The deal comes nearly six years after ProPublica first revealed that Facebook let housing marketers exclude African Americans and others from seeing some of their advertisements. Federal law prohibits housing, employment and credit discrimination based on race, religion, gender, family status and disability.
For years, ProPublica and other researchers showed that problems persisted in the delivery of advertisements related to housing, employment and credit, even as Facebook pledged to fix the loopholes that we identified.
This week’s settlement was the result of a housing discrimination charge brought three years ago by the Trump administration, alleging that Meta’s ad targeting system violated the Fair Housing Act. The DOJ also argued that Facebook used a machine learning algorithm to restrict and create ad audiences, which had the effect of skewing delivery toward or against legally protected groups. This was the first time the federal government challenged algorithmic bias under the Fair Housing Act.
As part of the settlement, Meta has agreed to deploy new advertising methods that will be vetted by a third-party reviewer and overseen by the court.
The company said in a statement that it will implement a “novel use of machine learning technology that will work to ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.”
The statement, by Roy L. Austin Jr., Meta’s vice president of civil rights and deputy general counsel, noted that although the settlement requires Facebook to use its new tool only for advertisements related to housing, the company will also apply it to ads about employment and credit. (Facebook declined a request for additional comment.)
Civil rights attorney Peter Romer-Friedman, who has brought several cases against the company, said that previous negotiations had tried and failed to hold Facebook accountable for algorithmic bias. “Ultimately what this shows is that it’s never been a question of feasibility to eliminate algorithmic bias,” he told ProPublica. “It’s a question of will.”
After we reported on the potential for advertising discrimination in 2016, Facebook quickly promised to set up a system to catch and review ads that discriminate illegally. A year later, ProPublica found that it was still possible to exclude groups such as African Americans, mothers of high school kids, people interested in wheelchair ramps and Muslims from seeing advertisements. It was also possible to target ads to people with an interest in anti-Semitism, including options such as “How to burn Jews” and “Hitler did nothing wrong.”
We later found that companies were posting employment ads that women and older workers could not see. In March 2019, Facebook settled a lawsuit brought by civil rights groups by creating a “special ads portal” specifically for employment, housing and credit ads. The company said the portal would curb advertisers’ targeting options and also limit its algorithm from considering gender and race when deciding who should see ads.
But when ProPublica worked with researchers at Northeastern University and Upturn to test Facebook’s new system, we found more examples of biased ad delivery. Though Facebook’s modified algorithm barred advertisers from discriminating explicitly, delivery could still skew through tools like “Special Ad Audiences” and “Lookalike Audiences,” which built audiences from proxy characteristics that correlated with race or gender.
The research also found that Facebook skewed the audience depending on the content of the ad itself. How many women might see a job listing for an open janitorial position, for instance, depended not just on what the advertiser told Facebook, but also on how Facebook interpreted the advertisement’s image and text.
ProPublica also continued to find employment advertisements that favored men or excluded older applicants, potentially violating civil rights law. Some advertisers we interviewed were surprised to learn that they were unable to reach a diverse audience, even when they tried.
In a press release, the DOJ said Tuesday’s settlement requires Meta to stop using the “Special Ad Audience” tool by the end of the year. It also requires Meta to change its algorithm “to address disparities for race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads.” The company must share details with the DOJ and an independent reviewer before implementing changes.
As part of the settlement, Meta also agreed to pay a civil penalty of $115,054, the maximum allowed under the Fair Housing Act.
“Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination,” U.S. Attorney Damian Williams for the Southern District of New York said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
Photo Credit: Brett Jordan