Facebook settles discrimination lawsuit over letting advertisers pick which races, genders saw housing ads

FILE - Facebook's Meta logo sign is seen at the company headquarters in Menlo Park, Calif., on Oct. 28, 2021. (AP Photo/Tony Avelar, File)

Meta, the social media giant that owns Facebook and Instagram, has settled a landmark lawsuit with the U.S. Justice Department, which accused the company of allowing landlords to use its targeted advertising system to market housing ads in a discriminatory manner.

According to the Washington Post, the Trump administration filed the lawsuit in 2019, citing the Fair Housing Act. This is the second such ad settlement for the company; the first compelled it to withhold demographic data such as gender, age, and ZIP codes, the last of which is often used as a proxy for race, when marketing housing, credit, and job opportunities.

However, researchers determined that discrimination was still taking place: the software could detect which races or genders were more likely to click on a particular ad, then deliver similar ads to “look-alike audiences.” The result was that, for example, a housing ad would be shown mostly to men even though the advertiser had never asked for only men to be targeted.
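To make that dynamic concrete, here is a minimal, purely illustrative Python sketch, not Meta's actual code or data. It assumes a made-up seed audience of past ad clickers that happens to skew male, expands it by feature similarity the way a generic "lookalike" tool might, and shows that the expanded audience inherits the same gender skew even though gender is never an input to the scoring.

    import random

    random.seed(0)

    def make_user(gender):
        # Hypothetical two-dimensional interest profile; gender is recorded
        # only so we can audit the outcome, never fed to the scoring below.
        base = {"m": [0.8, 0.2], "f": [0.3, 0.7]}[gender]
        return {"gender": gender,
                "features": [x + random.gauss(0, 0.1) for x in base]}

    # Seed audience: people who clicked a past housing ad, skewed 80/20 toward men.
    seed = [make_user("m") for _ in range(80)] + [make_user("f") for _ in range(20)]
    # Candidate pool: an evenly split population of 1,000 users.
    candidates = [make_user("m") for _ in range(500)] + [make_user("f") for _ in range(500)]

    # "Lookalike" step: average the seed's features and rank candidates by closeness.
    centroid = [sum(u["features"][i] for u in seed) / len(seed) for i in range(2)]

    def distance(user):
        return sum((user["features"][i] - centroid[i]) ** 2 for i in range(2))

    lookalikes = sorted(candidates, key=distance)[:200]  # keep the 20% most similar

    share_men = sum(u["gender"] == "m" for u in lookalikes) / len(lookalikes)
    print(f"Men in expanded audience: {share_men:.0%}")  # far above the 50% in the pool

In this toy setup the printed share of men lands well above the 50 percent present in the candidate pool, mirroring the kind of skew researchers described: the bias sits in the historical click data, so similarity-based expansion reproduces it without any explicit demographic targeting.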


This week’s settlement also requires the company to overhaul a tool called Lookalike Audiences. The company will now build a new automated advertising system aimed at ensuring that housing ads are delivered to a more equitable mix of people, and under the terms of the settlement a third party will review the new system. Additionally, Meta will pay a $115,054 penalty, the maximum allowed under the Fair Housing Act.

“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

Advertisers will still be able to target by location, but not based on ZIP codes alone.

“Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others,” Roy Austin, Facebook’s vice president of civil rights, said in a statement. “This type of work is unprecedented in the advertising industry and represents a significant technological advancement for how machine learning is used to deliver personalized ads.”

