NEW YORK — Facebook will change its algorithms to prevent discriminatory housing ads, and its parent company will submit to court oversight to settle a lawsuit filed Tuesday by the US Department of Justice.
In a release, US government officials said Meta Platforms Inc., formerly known as Facebook Inc., had reached an agreement to settle the lawsuit, which was filed the same day in Manhattan federal court.
According to the press release, it was the Justice Department’s first case challenging algorithmic discrimination under the Fair Housing Act. Facebook will now be subject to Justice Department approval and court oversight for its ad targeting and delivery system.
US Attorney Damian Williams called the lawsuit “groundbreaking.” Assistant Attorney General Kristen Clarke called it “historic.”
Ashley Settle, a Facebook spokesperson, said in an email that the company was building “a new machine learning method within our ad system that will change the way housing ads are shown to people living in the US across different demographics.”
She said the company would extend the new method to employment and credit ads in the US.
“We are excited to be pioneering this effort,” Settle added.
Williams said Facebook’s technology has in the past violated the Fair Housing Act online “just like when companies engage in discriminatory advertising using more traditional advertising methods.”
Clarke said that “companies like Meta have a responsibility to ensure that their algorithmic tools are not used in a discriminatory manner.”
Under the settlement terms, Facebook will stop using a housing ad tool that the government said relied on a discriminatory algorithm to find users “similar” to other users based on characteristics protected by the Fair Housing Act. By December 31, the Justice Department said, Facebook must stop using the tool, once called “Lookalike Audiences,” whose algorithm the US says discriminates based on race, gender, and other characteristics.
The department said Facebook will also develop a new system over the next six months to address racial and other disparities caused by personalization algorithms in its housing ad delivery system.
If the new system fails, the settlement agreement could be terminated, the Justice Department said. Under the settlement, Meta must also pay a fine of just over $115,000.
The announcement comes after Facebook agreed in March 2019 to overhaul its ad targeting systems to prevent discrimination in housing, credit, and employment ads as part of a legal settlement with a group including the American Civil Liberties Union, the National Fair Housing Alliance, and others.
The changes announced at the time were designed so that advertisers who wanted to run ads for housing, employment, or credit would no longer be allowed to target people by age, gender, or zip code.
The Justice Department said Tuesday that the 2019 settlement reduced the potentially discriminatory targeting options available to advertisers but did not resolve other issues, including Facebook’s discriminatory delivery of housing ads through machine learning algorithms.