SAN FRANCISCO — Meta agreed Tuesday to change its ad technology and pay a $115,054 fine in a settlement with the Justice Department over claims that the company’s ad systems discriminated against Facebook users by restricting who could see housing ads on the platform based on their race, gender and zip code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and adopt a new computer-aided method that aims to regularly verify that the audiences who are targeted and eligible to see housing ads are actually seeing those ads. The new method, called a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.
“We’re going to take a snapshot of the audience of marketers every now and then, look at who they’re targeting, and remove as much variance from that audience as possible,” Roy L. Austin, Meta’s vice president of civil rights and deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”
Facebook, which became a business behemoth by collecting its users’ data and letting advertisers target ads based on audience characteristics, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems allowed marketers to choose who saw their ads from thousands of different attributes, an option that also let advertisers exclude people who fall under a number of protected categories.
While Tuesday’s settlement concerns housing advertisements, Meta said it also planned to apply its new system to check the targeting of ads related to employment and credit. The company has previously faced criticism for allowing bias against women in job listings and for excluding certain groups of people from seeing credit card ads.
“Because of this groundbreaking lawsuit, Meta will — for the first time — change its ad serving system to address algorithmic discrimination,” Damian Williams, a U.S. attorney, said in a statement. “But if Meta fails to demonstrate that it has modified its delivery system enough to guard against algorithmic bias, this office will proceed with the lawsuit.”
Meta also said it would no longer use a feature called “Special Ad Audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The Justice Department said the tool also engaged in discriminatory practices. The company said the tool had been an early effort to fight bias, and that its new methods would be more effective.
The issue of biased ad targeting has been debated most prominently around housing ads. In 2018, Ben Carson, then the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. Facebook’s potential for ad discrimination was also revealed in a 2016 investigation by ProPublica, which found that the company’s technology made it easy for marketers to exclude specific ethnic groups for advertising purposes.
In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even when an advertiser wanted the ad to be seen broadly.
“Facebook discriminates against people based on who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit one’s housing choices can be as discriminatory as slamming a door in one’s face.”
The HUD case came amid broader pressure from civil rights groups, who argued that the vast and complicated advertising systems underpinning some of the largest internet platforms have inherent biases, and that tech companies like Meta, Google and others should do more to beat back those biases.
The field of study, known as “algorithmic fairness,” has been a significant topic of interest among computer scientists working in artificial intelligence. Leading researchers, including former Google scientists such as Timnit Gebru and Margaret Mitchell, have sounded the alarm on such biases for years.
In the years that followed, Facebook curtailed the range of categories that marketers could choose from when buying housing ads, reducing the number to hundreds and eliminating options to target based on race, age and zip code.
Meta’s new system, which is still under development, will periodically check who is being shown housing, employment and credit ads, and ensure that those audiences match the people marketers want to target. If the ads being served start to skew heavily toward, say, white men in their twenties, the new system will in theory recognize this and shift delivery so that the ads are shown more equitably to a broader and more diverse audience.
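Meta has not published how its variance reduction system is implemented, so the following is only a toy sketch of the idea described above: periodically compare the demographic mix of people who actually saw an ad against the mix of the eligible audience, and compute corrective delivery weights for underserved groups. The function name, the tolerance threshold and the cap are all hypothetical illustrations, not Meta’s actual parameters.

```python
# Illustrative sketch only -- not Meta's actual algorithm. It shows the
# basic "measure the skew, then rebalance" loop the article describes.

def rebalance_weights(eligible_share, delivered_share, tolerance=0.05, cap=10.0):
    """Return per-group delivery multipliers.

    eligible_share / delivered_share: dicts mapping a demographic group to
    its fraction of the eligible audience / of actual ad viewers (each
    should sum to 1). Groups within `tolerance` of their eligible share
    are left alone; underserved groups get a multiplier above 1, and
    overserved groups get one below 1, capped at `cap`.
    """
    weights = {}
    for group, target in eligible_share.items():
        actual = max(delivered_share.get(group, 0.0), 1e-9)  # avoid divide-by-zero
        if abs(target - actual) <= tolerance:
            weights[group] = 1.0  # delivery already tracks the eligible audience
        else:
            weights[group] = round(min(target / actual, cap), 2)
    return weights
```

For example, if two groups each make up half of the eligible audience but one is receiving 80 percent of impressions, the sketch would down-weight the overserved group (0.5 / 0.8 ≈ 0.62) and boost the underserved one (0.5 / 0.2 = 2.5).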
Meta said it would work with HUD in the coming months to incorporate the technology into Meta’s ad targeting systems, and agreed to an external audit of the effectiveness of the new system.
The fine that Meta will pay in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.