Meta develops AI targeting system to show housing ads to wider range of users

Part of a settlement with US Department of Justice after ads discrimination case

Meta is rolling out a more open-minded AI-powered system that promises to reduce discrimination, after the US Department of Justice sued it for preventing Facebook users from seeing housing ads based on personal characteristics such as their ethnicity, sex, and marital status.

Federal prosecutors accused Meta of violating the Fair Housing Act (FHA), which prohibits discrimination in the sale, rental, or advertising of housing, and in evictions, on the basis of race, religion, sex, familial status, or disability. The social media giant allegedly targeted housing adverts at people from specific demographics, and unfairly hid them from users who didn't fit the same criteria.

Meta agreed to pay a $115,054 fine and settle the case in June last year. The deal also required the company to overhaul its ad targeting system and make it fairer. Now, Meta has scrapped its previous "Lookalike Audience" or "Special Ad Audience" tool, which used machine learning to group users with common characteristics so advertisers could automatically target similar audiences.

It has been replaced with the Variance Reduction System (VRS), a reinforcement learning-based system that serves adverts to a wider range of users by tweaking the ad bidding process.

"The VRS is an offline reinforcement learning framework with the explicit goal of minimizing the variance in number of ad views between people who have seen the ad and the broader eligible audience of people who could have seen the ad," explained Miranda Bogen, policy manager, Responsible AI at Meta.

The system starts by measuring the distribution of ages, genders, and estimated ethnicities among the people an advertiser wants to reach with its ad. It then looks at the ad impressions from people who have actually been served the advert, and compares that with the initial distribution to see which types of users should be seeing the ad but haven't yet. VRS then adjusts factors that control the value of the advert in the bidding process, nudging delivery toward the groups that have seen it less often than their share of the eligible audience would suggest.
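To illustrate that measurement step, here is a minimal Python sketch made under some assumptions: demographic groups are treated as plain string labels, and the distribution and shortfall helpers are hypothetical names invented for this example, not anything in Meta's system.

```python
from collections import Counter

def distribution(groups):
    """Fraction of users or impressions falling into each demographic group."""
    counts = Counter(groups)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

def shortfall(eligible, served):
    """Per-group gap between the eligible audience mix and the served mix.

    Positive values mean a group has seen the ad less often than its share
    of the eligible audience would suggest.
    """
    eligible_dist = distribution(eligible)
    served_dist = distribution(served)
    return {g: eligible_dist[g] - served_dist.get(g, 0.0) for g in eligible_dist}

# Toy example: the eligible audience is evenly split across two age brackets,
# but the impressions delivered so far skew heavily towards the younger one.
eligible = ["18-24"] * 50 + ["25-54"] * 50
served = ["18-24"] * 30 + ["25-54"] * 10
print(shortfall(eligible, served))  # {'18-24': -0.25, '25-54': 0.25}
```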

"The VRS remeasures the audience's demographic distribution and updates the pacing of ads throughout the campaign, working to reduce variance between the audiences," said Bogen.

"When there's a new chance to show someone an ad, the system uses the latest demographic measurements, along with limited information about that person, to determine how to best adjust the pacing of the bid in order to encourage the ad to be distributed to an audience that more closely reflects the ad's eligible targeted audience."

That way, it's hoped, online adverts are more likely to be shown to a broader range of users, reducing the likelihood of discrimination. Meta is implementing VRS for housing advertisements to start with, and will slowly expand it to cover employment and credit ads in the US next year.

"This development marks a pivotal step in the Justice Department's efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms," assistant attorney general Kristen Clarke of the Justice Department's Civil Rights Division, said in a statement.

"The Justice Department will continue to hold Meta accountable by ensuring the Variance Reduction System addresses and eliminates discriminatory delivery of advertisements on its platforms. Federal monitoring of Meta should send a strong signal to other tech companies that they too will be held accountable for failing to address algorithmic discrimination that runs afoul of our civil rights laws," she concluded. ®
