An Update on Our Ads Fairness Efforts




By Roy L. Austin Jr., Vice President of Civil Rights and Deputy General Counsel



Updated on July 19, 2023 at 2:57PM PT:


Earlier this year, we launched the Variance Reduction System (VRS) — an update to our ad delivery system that helps advance more equitable delivery of housing ads in the US. As part of our settlement with the United States Department of Justice (DOJ) and the Department of Housing and Urban Development (HUD) last year, we agreed to have an independent, third-party reviewer regularly verify the VRS’s compliance with the agreed-upon metrics in the June settlement. As the DOJ shared today, the reviewer verified that the VRS has met the compliance standards that Meta agreed upon with the DOJ. This important report is another step to help advance fairness and equity in our ads system, and later this year, we plan to extend the VRS to US employment and credit ads.


Originally published on January 9, 2023 at 12:06PM PT:


As part of our settlement with the Department of Justice (DOJ), representing the US Department of Housing and Urban Development (HUD), we announced our plan to create the Variance Reduction System (VRS) to help advance the equitable distribution of ads on Meta technologies. After more than a year of collaboration with the DOJ, we have now launched the VRS in the United States for housing ads. Over the coming year, we will extend its use to US employment and credit ads. We have also discontinued the use of Special Ad Audiences, another commitment in the settlement.


The Variance Reduction System in Action



The VRS uses new machine learning technology in ad delivery so that the actual audience that sees an ad more closely reflects the eligible target audience for that ad. After the ad has been shown to a large enough group of people, the VRS measures the aggregate demographic distribution of those who have seen the ad to understand how that audience compares with the demographic distribution of the eligible target audience selected by the advertiser. To implement this technology in a way that respects people’s privacy, the VRS relies on a widely used measurement method called Bayesian Improved Surname Geocoding (BISG) – informed by publicly available US Census statistics – to estimate race and ethnicity. This method is built with added privacy enhancements, including differential privacy, a technique that can help protect against re-identification of individuals within aggregated datasets.
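To give a rough sense of how BISG-style estimation and differentially private aggregation can fit together, here is a minimal Python sketch. It is not Meta's implementation: the probability tables, the demographic categories, and the noise scale are all hypothetical placeholders, since the details of the VRS are not described in this post.

```python
import numpy as np

# Illustrative priors only -- real BISG draws P(group | surname) and
# P(group | geography) from published US Census tables.
GROUPS = ["group_a", "group_b", "group_c"]  # hypothetical categories

P_GROUP_GIVEN_SURNAME = {
    "garcia": np.array([0.10, 0.80, 0.10]),
    "smith":  np.array([0.70, 0.10, 0.20]),
}
P_GROUP_GIVEN_GEO = {
    "90210": np.array([0.60, 0.25, 0.15]),
    "10001": np.array([0.40, 0.35, 0.25]),
}
P_GROUP = np.array([0.55, 0.25, 0.20])  # hypothetical base rates

def bisg_posterior(surname: str, geo: str) -> np.ndarray:
    """Combine surname and geography evidence with Bayes' rule:
    P(group | surname, geo) ∝ P(group | surname) * P(group | geo) / P(group)."""
    posterior = (P_GROUP_GIVEN_SURNAME[surname]
                 * P_GROUP_GIVEN_GEO[geo]
                 / P_GROUP)
    return posterior / posterior.sum()  # normalize to a distribution

def dp_noisy_counts(expected_counts: np.ndarray, epsilon: float = 1.0) -> np.ndarray:
    """Add Laplace noise to aggregated counts -- a standard differential
    privacy mechanism that helps protect against re-identification."""
    noise = np.random.laplace(scale=1.0 / epsilon, size=expected_counts.shape)
    return expected_counts + noise

# Aggregate probabilistic estimates over everyone who saw the ad,
# then release only the noisy group totals.
viewers = [("garcia", "90210"), ("smith", "10001"), ("garcia", "10001")]
expected = sum(bisg_posterior(s, g) for s, g in viewers)
print(dp_noisy_counts(np.asarray(expected)))
```

The key design point the sketch illustrates is that individual-level labels are never produced or stored: each viewer contributes only a probability vector, and only noisy aggregate totals leave the measurement step.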


Throughout the course of an ad campaign, the VRS will keep measuring the audience’s demographic distribution and continue working to reduce the difference between the actual and eligible audiences.
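As a sketch of what measuring that difference could look like, the snippet below compares the delivered audience's estimated demographic shares against the eligible audience's. The counts, the group names, and the tolerance threshold are hypothetical, and the adjustment step is only a placeholder, since this post does not describe the actual control mechanism.

```python
import numpy as np

def demographic_gap(delivered: np.ndarray, eligible: np.ndarray) -> np.ndarray:
    """Per-group difference between each group's share of the delivered
    audience and its share of the eligible target audience."""
    delivered = delivered / delivered.sum()
    eligible = eligible / eligible.sum()
    return delivered - eligible

# Hypothetical aggregated (noisy) counts from the measurement step.
delivered_counts = np.array([620.0, 210.0, 170.0])
eligible_counts = np.array([550.0, 250.0, 200.0])

gap = demographic_gap(delivered_counts, eligible_counts)
for group, g in zip(["group_a", "group_b", "group_c"], gap):
    if abs(g) > 0.02:  # hypothetical tolerance
        # In a real system, delivery would be nudged toward groups that
        # are being under-served; the exact mechanism is not public.
        print(f"{group}: share off by {g:+.1%}, adjust delivery")
```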


Learn more about this new technology in our technical paper and on our AI blog.


Our Work to Further Algorithmic Fairness


Meta embeds civil rights and responsible AI principles into our product development process to help advance our algorithmic fairness efforts while protecting privacy. 


The VRS builds on our longstanding efforts to help protect against discrimination. This includes restricting certain targeting options for housing, employment and credit ads. For example, we don’t allow advertisers that are either based in or trying to reach people in the US, Canada or certain European countries to target their housing, employment or credit ads based on age, gender or ZIP code.


Across the industry, approaches to algorithmic fairness are still evolving, particularly in digital advertising. But we know we cannot wait for consensus to make progress in addressing important concerns about the potential for discrimination – especially when it comes to housing, employment and credit ads, where the enduring effects of historically unequal treatment still shape economic opportunities. We will continue to make this work a priority as we collaborate with stakeholders to support important industry-wide discussions about how to make progress toward more fair and equitable digital advertising.



