Meta Settles Lawsuit With DOJ Over Ad Delivery Algorithms
The US Department of Justice today announced that it has entered into an agreement with Meta, the parent company of Facebook, to resolve a lawsuit alleging that Meta engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed settlement is subject to review and approval by the district judge for the Southern District of New York, where the lawsuit was originally filed. If it moves forward, Meta has agreed to develop a new housing advertising system and pay a fine of approximately $115,000, the maximum penalty under the FHA.


“When a company develops and implements technology that completely or partially deprives users of housing opportunities based on protected characteristics, it violates the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” U.S. Attorney Damian Williams said in a statement. “Because of this groundbreaking lawsuit, Meta will change its ad delivery system for the first time to eliminate algorithmic discrimination. But if Meta cannot demonstrate that it has changed its delivery system enough to guard against algorithmic bias, this office will move forward with the lawsuit.”


The lawsuit was the Department of Justice’s first major algorithmic bias case under the FHA. It alleged that the algorithms Meta uses to determine which Facebook users receive housing ads rely in part on characteristics such as race, color, religion, gender, disability, marital status, and national origin, all of which are protected under the FHA. Academic research has provided evidence supporting the Justice Department’s claims, including a 2020 paper from Carnegie Mellon showing that bias in Facebook’s advertising platform exacerbates socioeconomic inequalities.

Meta said that under the agreement with the Department of Justice, it will stop using Special Ad Audience, an advertising tool that allegedly relied on a discriminatory algorithm to find users who “look like” other users based on FHA-protected characteristics. Meta will also develop a new system over the next six months to “eliminate racial and other differences caused by the use of personalization algorithms in its ad delivery system for ad placement,” according to a press release, and will implement the system by December 31, 2022.


An independent third-party reviewer will continuously investigate and verify that Meta’s new system meets the standards agreed to by the company and the Department of Justice. Meta must also notify the Department of Justice if it intends to add any targeting options in the future.

If the Justice Department concludes that the new system does not sufficiently address discrimination, the settlement will be terminated.


Credit: techcrunch.com
