Facebook stops using an advertising tool in a settlement with the US government

Facebook will change its algorithms to prevent discriminatory housing advertising, and its parent company will submit to court oversight, under a settlement of a US Department of Justice lawsuit announced Tuesday.

In a press release, US government officials said that Meta, the company formerly known as Facebook, had reached an agreement to settle the lawsuit, which was filed in federal court in Manhattan the same day.

Under the terms of the settlement, Facebook will stop using a housing advertising tool that, according to the government, relies on a discriminatory algorithm to find users who “look like” other users based on characteristics protected by the Fair Housing Act, the Justice Department said. By December 31, Facebook must stop using the tool, once called “Lookalike Audience,” which depends on an algorithm that the US says discriminates on the basis of race, gender and other characteristics.

Facebook will also develop a new system over the next six months to address racial and other disparities caused by the use of personalization algorithms in its housing ad delivery system, the department said.

According to the press release, it was the Justice Department’s first case challenging algorithmic discrimination under the Fair Housing Act. Facebook is now subject to Department of Justice approval and court oversight for its ad targeting and delivery system.

US Attorney Damian Williams called the lawsuit “groundbreaking”. Assistant Attorney General Kristen Clarke called it “historic”.

Ashley Settle, a Facebook spokeswoman, said in an email that the company is “developing a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the US across different demographic groups.”

She said the company will also extend the new method to employment- and credit-related ads in the United States. “We’re excited to advance this effort,” Settle added.


Williams said Facebook’s technology had violated the Fair Housing Act online, “just as companies engage in discriminatory advertising using more traditional advertising methods.”

Clarke said, “Companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner.”

The announcement comes after Facebook agreed in March 2019 to overhaul its ad-targeting systems to prevent discrimination in housing, credit and job ads, as part of a legal settlement with a group that included the American Civil Liberties Union, the National Fair Housing Alliance and others.

Under the changes announced at the time, advertisers running housing, job, or credit ads were no longer allowed to target people by age, gender, or zip code.

The Justice Department said Tuesday the 2019 settlement reduced potentially discriminatory targeting options for advertisers but didn’t resolve other issues, including Facebook’s discriminatory delivery of housing ads through machine learning algorithms.

