Monday, December 23, 2024

As part of our settlement with the Department of Justice (DOJ), representing the US Department of Housing and Urban Development (HUD), we announced our plan to create the Variance Reduction System (VRS) to help advance the equitable distribution of ads on Meta technologies. After more than a year of collaboration with the DOJ, we have now launched the VRS in the United States for housing ads. Over the coming year, we will extend its use to US employment and credit ads. We have also discontinued the use of Special Ad Audiences, another commitment in the settlement.

The Variance Reduction System in Action

The VRS uses new machine learning technology in ad delivery so that the actual audience that sees an ad more closely reflects the eligible target audience for that ad. After the ad has been shown to a large enough group of people, the VRS measures the aggregate demographic distribution of those who have seen the ad to understand how that audience compares with the demographic distribution of the eligible target audience selected by the advertiser. To implement this technology in a way that respects people's privacy, the VRS relies on a widely used measurement method called Bayesian Improved Surname Geocoding (BISG), informed by publicly available US Census statistics, to estimate race and ethnicity. This method is built with added privacy enhancements, including differential privacy, a technique that helps protect against re-identification of individuals within aggregated datasets.
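For intuition, here is a minimal, hypothetical sketch of the two techniques named above: a BISG-style posterior built from public surname and geography statistics, and a Laplace-noise step of the kind differential privacy applies to aggregated counts. All tables and numbers are made up for illustration; this is not Meta's implementation.

```python
# Minimal sketch of BISG plus a differential-privacy step, using
# hypothetical Census-style tables; not Meta's implementation.
import numpy as np

RACES = ["white", "black", "hispanic", "asian", "other"]  # illustrative buckets

# P(race | surname), as in a Census-style surname table (made-up numbers)
P_RACE_GIVEN_SURNAME = {"garcia": np.array([0.05, 0.01, 0.92, 0.01, 0.01])}

# P(race | geography), e.g. a ZIP code's composition (made-up numbers)
P_RACE_GIVEN_GEO = {"11368": np.array([0.10, 0.06, 0.75, 0.07, 0.02])}

# National marginal P(race) (made-up numbers)
P_RACE = np.array([0.60, 0.13, 0.18, 0.06, 0.03])

def bisg_posterior(surname: str, geo: str) -> np.ndarray:
    """P(race | surname, geo) is proportional to
    P(race | surname) * P(race | geo) / P(race)."""
    joint = P_RACE_GIVEN_SURNAME[surname] * P_RACE_GIVEN_GEO[geo] / P_RACE
    return joint / joint.sum()

def dp_counts(counts: np.ndarray, epsilon: float,
              rng=np.random.default_rng()) -> np.ndarray:
    """Laplace mechanism on a disjoint histogram (sensitivity 1 per person),
    so released aggregate estimates resist re-identification."""
    return counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
```

Note that BISG produces probabilistic, aggregate-level estimates rather than labels for individuals, which is why it pairs naturally with a noise step like the one above.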

Throughout the course of an ad campaign, the VRS will keep measuring the audience's demographic distribution and continue working to reduce the difference between the actual and eligible audiences.
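As a rough illustration of that measurement step, the sketch below compares the demographic shares of the delivered audience with those of the eligible audience; the per-group gap is the quantity the VRS works to shrink. The group counts are hypothetical.

```python
# Hypothetical sketch of the ongoing measurement described above:
# compare delivered vs. eligible demographic shares as the campaign runs.
import numpy as np

def shares(counts: np.ndarray) -> np.ndarray:
    """Turn raw group counts into a demographic distribution."""
    return counts / counts.sum()

def audience_gap(delivered: np.ndarray, eligible: np.ndarray) -> np.ndarray:
    """Per-group difference between who has seen the ad and who was eligible.
    Positive entries mean a group is over-served so far; negative, under-served."""
    return shares(delivered) - shares(eligible)

# Made-up counts for three demographic groups
eligible = np.array([5000, 3000, 2000])    # eligible target audience
delivered = np.array([900, 400, 200])      # impressions so far
print(audience_gap(delivered, eligible))   # approx. [ 0.10 -0.03 -0.07]
```

As the post describes, delivery is then adjusted over the remainder of the campaign to drive these gaps toward zero.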

Learn more about this new technology in our technical paper and on our AI blog.

Our Work to Further Algorithmic Fairness

Meta embeds civil rights and responsible AI principles into our product development process to help advance our algorithmic fairness efforts while protecting privacy. 

The VRS builds on our longstanding efforts to help protect against discrimination, including restricting certain targeting options for housing, employment and credit ads. For example, we don't allow advertisers that are based in, or trying to reach people in, the US, Canada and certain European countries to target their housing, employment or credit ads based on age, gender or ZIP code.
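To make that restriction concrete, here is a hypothetical policy-check sketch. The category names, targeting keys, and region codes are illustrative assumptions, not Meta's actual ads API.

```python
# Hypothetical sketch of the targeting restriction described above;
# all names and fields are illustrative, not Meta's ads API.
RESTRICTED_CATEGORIES = {"housing", "employment", "credit"}
RESTRICTED_OPTIONS = {"age", "gender", "zip_code"}
COVERED_REGIONS = {"US", "CA"}  # plus certain European countries, per the post

def rejected_targeting(ad_category: str, targeting: set[str],
                       advertiser_region: str,
                       audience_regions: set[str]) -> set[str]:
    """Return the targeting options such a policy would reject: age, gender
    and ZIP targeting for housing, employment or credit ads in covered regions."""
    in_scope = (advertiser_region in COVERED_REGIONS
                or bool(audience_regions & COVERED_REGIONS))
    if ad_category in RESTRICTED_CATEGORIES and in_scope:
        return targeting & RESTRICTED_OPTIONS
    return set()

# Example: a US housing ad attempting ZIP-code and interest targeting
print(rejected_targeting("housing", {"zip_code", "interests"}, "US", {"US"}))
# -> {'zip_code'}
```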

Across the industry, approaches to algorithmic fairness are still evolving, particularly in digital advertising. But we know we cannot wait for consensus to make progress in addressing important concerns about the potential for discrimination, especially in housing, employment and credit ads, where the enduring effects of historically unequal treatment still shape economic opportunities. We will continue to make this work a priority as we collaborate with stakeholders to support important industry-wide discussions about how to make digital advertising more fair and equitable.
