The US Department of Justice (DOJ) Tuesday announced a “groundbreaking” settlement agreement with Meta Platforms Inc., formerly known as Facebook Inc., to resolve claims of discriminatory advertising.
The complaint alleged Facebook’s housing advertising system provided advertisers with trait-based targeting and “lookalike” targeting tools, and that Facebook itself made discriminatory ad delivery determinations.
Further, this marks the DOJ’s first case alleging “algorithmic bias under the Fair Housing Act” (FHA). The complaint alleged that “Meta use[d] algorithms in determining which Facebook users receive housing ads, and that those algorithms rely, in part, on characteristics protected under the FHA.” These protected characteristics include race, religion, sex, and national origin, among others.
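To make the alleged mechanism concrete, the toy sketch below shows how an ad-delivery scoring model that weighs protected characteristics, even slightly, would skew who actually sees a housing ad. Every feature name, weight, and user here is hypothetical; nothing in this sketch reflects Meta’s actual systems.

```python
# Hypothetical illustration only: a toy ad-delivery scorer.
# Feature names, weights, and users are invented and do not
# reflect Meta's actual code or data.

from dataclasses import dataclass


@dataclass
class User:
    engagement: float        # neutral relevance signal
    inferred_ethnicity: str  # FHA-protected characteristic
    inferred_sex: str        # FHA-protected characteristic


def delivery_score(user: User) -> float:
    """Score a user for receiving a housing ad.

    The pattern the DOJ complaint describes: the score depends not
    only on neutral signals but, in part, on protected traits.
    """
    score = user.engagement
    # Any nonzero weight on a protected trait (or a close proxy)
    # changes who is shown the housing ad.
    if user.inferred_ethnicity == "group_a":
        score += 0.3
    if user.inferred_sex == "male":
        score += 0.2
    return score


users = [
    User(engagement=0.5, inferred_ethnicity="group_a", inferred_sex="male"),
    User(engagement=0.5, inferred_ethnicity="group_b", inferred_sex="female"),
]
# Identical engagement, different delivery priority -> disparate delivery.
print([delivery_score(u) for u in users])  # [1.0, 0.5]
```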
The settlement agreement states that Meta will, by December 31, 2022, cease using advertising tools that rely on discriminatory algorithms, such as the “Special Ad Audience” tool. Additionally, the DOJ gave Meta until December 2022 to develop a new housing advertising system “to address disparities for race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads.” To prevent future discriminatory advertising, the new system must be evaluated by “an independent, third party reviewer to investigate and verify” that compliance standards are met.
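The agreement does not publish the new system’s internals, but the kind of check an independent reviewer might run can be sketched: compare the demographic makeup of the advertiser’s targeted audience against the makeup of users who actually received the ad, and flag gaps above a tolerance. The metric, threshold, and data below are illustrative assumptions, not terms of the settlement.

```python
# Illustrative disparity check, not the settlement's actual methodology.
# Compares demographic group shares in the targeted audience against
# the audience that actually received a housing ad.

from collections import Counter


def shares(audience: list[str]) -> dict[str, float]:
    """Fraction of the audience belonging to each demographic group."""
    counts = Counter(audience)
    total = len(audience)
    return {group: n / total for group, n in counts.items()}


def max_disparity(targeted: list[str], delivered: list[str]) -> float:
    """Largest absolute gap in group share between targeting and delivery."""
    t, d = shares(targeted), shares(delivered)
    groups = set(t) | set(d)
    return max(abs(t.get(g, 0.0) - d.get(g, 0.0)) for g in groups)


# Hypothetical data: the advertiser targeted a 50/50 audience, but the
# delivery algorithm showed the ad to a 70/30 audience.
targeted = ["group_a"] * 50 + ["group_b"] * 50
delivered = ["group_a"] * 70 + ["group_b"] * 30

TOLERANCE = 0.10  # assumed threshold, for illustration only
gap = max_disparity(targeted, delivered)
print(f"max share gap: {gap:.2f} -> {'FLAG' if gap > TOLERANCE else 'ok'}")
```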
Meta must also pay the United States a civil penalty of $115,054, the maximum penalty available for a violation of the Fair Housing Act.
Meta has recently been named in various other lawsuits, including claims of preferential hiring of visa-dependent workers in the US, poor working conditions in Kenya, and violations of data protection laws in the European Union.