Justice Department Announces Settlement Agreement with Meta Platforms to Resolve Allegations of Discriminatory Advertising

The Department of Justice announced on June 21 that it obtained a settlement agreement resolving allegations that Meta Platforms Inc., formerly known as Facebook Inc., engaged in discriminatory advertising in violation of the Fair Housing Act. The proposed agreement resolves a lawsuit filed June 21 in the U.S. District Court for the Southern District of New York alleging that Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status and national origin. The settlement won’t take effect until it’s approved by the court.

Among other things, the complaint alleged that the algorithms Meta uses to direct housing ads at Facebook users rely, in part, on characteristics protected under the FHA. This is the DOJ’s first case challenging algorithmic bias under the FHA.

Under the settlement, Meta will stop using an advertising tool for housing ads known as the “Special Ad Audience” tool. According to the department’s complaint, the tool relies on a discriminatory algorithm. Meta also will develop a new system to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. That system will be subject to DOJ approval and court oversight. 

This settlement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.

“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” Assistant Attorney General Kristen Clarke of the DOJ’s civil rights division said in a statement. “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”  

“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” U.S. Attorney Damian Williams for the Southern District of New York said in a statement. “Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”  

“It is not just housing providers who have a duty to abide by fair housing laws,” Demetria McCain, the principal deputy assistant secretary for fair housing and equal opportunity at the Department of Housing and Urban Development, said in a statement. “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all. HUD appreciates its continued partnership with the Department of Justice as they seek to uphold our country’s civil rights laws.”

U.S. Lawsuit

According to the June 21 announcement, the complaint challenged three key aspects of Meta’s ad targeting and delivery system. 

Specifically, the DOJ alleged that:

  • Meta enabled and encouraged advertisers to target their housing ads by relying on race, color, religion, sex, disability, familial status and national origin to decide which Facebook users would be eligible, and which would be ineligible, to receive housing ads.
  • Meta created an ad targeting tool known as “Lookalike Audience” or “Special Ad Audience.” The tool uses a machine-learning algorithm to find Facebook users who share similarities with a group of individuals selected by an advertiser using several options provided by Facebook. Facebook has allowed its algorithm to consider FHA-protected characteristics — including race, religion and sex — in finding Facebook users who “look like” the advertiser’s source audience and are therefore eligible to receive housing ads (a mechanism illustrated in the sketch after this list).
  • Meta’s ad delivery system uses machine-learning algorithms that rely in part on FHA-protected characteristics — such as race, national origin and sex — to help determine which subset of an advertiser’s targeted audience will actually receive a housing ad.
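
The complaint describes what the lookalike tool does, not how it is built. Purely as an illustration, not Meta’s actual implementation, a lookalike expansion can be modeled as a nearest-centroid similarity search over user feature vectors; every name and figure in the sketch below is an assumption for demonstration.

    # Hypothetical lookalike expansion; an illustration, not Meta's code.
    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_features = 10_000, 16
    users = rng.normal(size=(n_users, n_features))  # stand-in user feature vectors

    # The advertiser supplies a "source audience"; the platform averages it.
    seed = rng.choice(n_users, size=100, replace=False)
    centroid = users[seed].mean(axis=0)

    # Rank everyone by cosine similarity to the source-audience centroid and
    # take the closest 1,000 users as the expanded "lookalike" audience.
    sims = users @ centroid / (np.linalg.norm(users, axis=1) * np.linalg.norm(centroid))
    lookalike = np.argsort(sims)[::-1][:1_000]

    # The alleged problem: if any feature encodes or correlates with an
    # FHA-protected characteristic, the expanded audience reproduces the
    # source audience's demographic skew.

If the source audience skews toward one demographic group and the feature vectors carry signals correlated with protected traits, the expanded audience inherits that skew, which is the mechanism the complaint describes.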

The complaint alleged that Meta has used these three aspects of its advertising system to target and deliver housing-related ads to some Facebook users while excluding other users based on FHA-protected characteristics.

The DOJ’s lawsuit alleged both disparate treatment and disparate impact discrimination. The complaint alleged that Meta is liable for disparate treatment because it intentionally classifies users on the basis of FHA-protected characteristics and designs algorithms that rely on users’ FHA-protected characteristics. The DOJ also alleged that Meta is liable for disparate impact discrimination because the operation of its algorithms affects Facebook users differently on the basis of their membership in protected classes.
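
In practice, disparate impact is assessed statistically by comparing outcome rates across groups. The toy calculation below uses invented delivery figures and borrows the “four-fifths” rule of thumb from employment-discrimination practice; the complaint itself does not commit to any particular threshold.

    # Invented figures; a sketch of a disparity check, not evidence from the case.
    eligible = {"group_a": 50_000, "group_b": 50_000}   # users in the targeted audience
    delivered = {"group_a": 6_000, "group_b": 3_000}    # users actually shown the ad

    rates = {g: delivered[g] / eligible[g] for g in eligible}
    ratio = min(rates.values()) / max(rates.values())

    print(rates)            # {'group_a': 0.12, 'group_b': 0.06}
    print(round(ratio, 2))  # 0.5, well below the 4/5 (0.8) rule of thumb

On this illustrative reading, equal eligibility paired with unequal delivery rates is exactly the pattern the disparate impact theory targets: the disparity arises from the algorithm’s operation rather than from the advertiser’s stated targeting choices.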

Settlement Agreement

By Dec. 31, Meta must stop using an advertising tool for housing ads known as “Special Ad Audience” (previously called “Lookalike Audience”), which relies on an algorithm that, according to the U.S., discriminates on the basis of FHA-protected characteristics.

According to the announcement, Meta has until December 2022 to develop a new system for housing ads that addresses disparities in race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads. If the U.S. concludes the new system sufficiently addresses the discriminatory disparities that Meta’s algorithms introduce, Meta will fully implement it by Dec. 31.
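
The agreement leaves the design of that system to Meta, subject to U.S. approval and court oversight, and the announcement does not describe its mechanics. One plausible shape, sketched here strictly as an assumption, is a feedback loop that compares the demographic mix of users actually receiving an ad against the advertiser’s targeted audience and reweights delivery to narrow the gap. The function and numbers below are hypothetical.

    # Hypothetical delivery-reweighting step; one possible design, not Meta's.
    def reweight(target_share: dict, delivered_share: dict, lr: float = 0.5) -> dict:
        """Boost groups under-delivered relative to the targeted audience and
        damp groups over-delivered; lr controls how aggressively to correct."""
        return {
            g: max(0.1, 1 + lr * (target_share[g] - delivered_share[g]) / target_share[g])
            for g in target_share
        }

    target = {"group_a": 0.5, "group_b": 0.5}      # mix of the targeted audience
    observed = {"group_a": 0.67, "group_b": 0.33}  # mix actually receiving the ad

    print(reweight(target, observed))  # group_a damped (~0.83), group_b boosted (~1.17)

Whatever design Meta actually proposes, the agreement’s test is outcome-based: the measured disparities between targeting and delivery must shrink to levels that satisfy the U.S. and the independent reviewer.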

If the U.S. concludes Meta’s changes to its ad delivery system don’t adequately address the discriminatory disparities, the settlement agreement will terminate and the U.S. will litigate its case against Meta in federal court.

The parties will select an independent, third-party reviewer to investigate and verify on an ongoing basis whether the new system is meeting the compliance standards agreed to by the parties. Under the agreement, Meta must provide the reviewer with any information necessary to verify compliance with those standards. The court will have ultimate authority to resolve disputes over the information Meta needs to disclose.

Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics. Under the agreement, Meta must notify the U.S. if Meta intends to add any targeting options. The court will have the authority to resolve any disputes between the parties about proposed new targeting options.

Meta must also pay a civil penalty of $115,054 to the U.S. The penalty is the maximum available under the FHA.

The DOJ’s lawsuit is based in part on an investigation and charge of discrimination by HUD, which found that all three aspects of Meta’s ad delivery system violated the FHA. When Facebook elected to have the HUD charge heard in federal court, HUD referred the matter to the DOJ for litigation.

This case is being handled jointly by the DOJ’s civil rights division and the U.S. Attorney’s Office for the Southern District of New York.

Individuals who believe they have been victims of housing discrimination may submit a report online or contact HUD.
