Auditing the Audits
Lessons for Algorithmic Accountability from Local Law 144's Bias Audits
Marissa Kumar Gerchick, Ro Encarnación, Cole Tanigawa-Lau, Lena Armstrong, Ana Gutiérrez, Danaé Metaxa
FAccT '25: Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency
In this work, we “audit the audits,” analyzing the documents produced pursuant to one of the United States’ first enacted laws regulating the use of artificial intelligence in employment: New York City’s Local Law 144. This law requires employers and employment agencies using certain types of automated tools to publish “bias audits” with statistics about how different sex and racial groups fare in the hiring process when the tools are used. We collect and conduct a comprehensive analysis of all Local Law 144 bias audits (N=116) made publicly available, to our knowledge, from when the law took effect in July 2023 until early November 2024, and describe the extensive challenges we faced in identifying, archiving, extracting information from, and ultimately analyzing these bias audits. We identify several ways that bias audits produced in accordance with Local Law 144 are incomplete evaluations of algorithmic bias, despite news coverage and characterizations by employers and vendors suggesting otherwise. We show that Local Law 144 bias audits are significantly hampered by several issues, including missing demographic data, opaque data aggregation, problematic uses of “test data,” and reliance on metrics that do not represent how automated hiring tools are used in practice. We analyze the reported results in Local Law 144 bias audits alongside the four-fifths rule often used as a measure for assessing adverse impact in employment contexts. Most audits report results that would not suggest violations of the four-fifths rule. Crucially, however, we show that these tools could often be in violation of the four-fifths rule when considering potential impacts of missing demographic data. We offer ten practical recommendations to strengthen future legislative efforts that mandate algorithm auditing in hiring and other areas, and contribute an open dataset and codebase for extracting and combining bias audit results to support future auditing efforts.
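The four-fifths rule referenced in the abstract compares each group’s selection rate to that of the highest-rate group; a ratio below 0.8 is commonly treated as evidence of adverse impact. The sketch below illustrates the arithmetic with hypothetical counts (the group names and numbers are invented for illustration, not drawn from the paper’s data).

```python
def impact_ratios(selected, total):
    """Compute each group's selection rate relative to the
    highest-rate group (the "impact ratio").

    Under the four-fifths rule, a ratio below 0.8 for any group
    is commonly treated as evidence of adverse impact.
    """
    # Selection rate = number selected / number of applicants, per group.
    rates = {g: selected[g] / total[g] for g in selected}
    top = max(rates.values())
    # Divide each group's rate by the highest group's rate.
    return {g: r / top for g, r in rates.items()}

# Hypothetical applicant counts, for illustration only.
selected = {"group_a": 48, "group_b": 30}
total = {"group_a": 100, "group_b": 100}

ratios = impact_ratios(selected, total)
flagged = {g for g, r in ratios.items() if r < 0.8}
# group_b's rate (0.30) is 0.625 of group_a's (0.48),
# falling below the 0.8 threshold.
```

As the paper notes, ratios computed this way can look compliant while masking problems: if demographic data are missing for many applicants, the true selection rates, and hence the impact ratios, could differ substantially from the reported ones.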