July 13, 2023
The Road to Hell and Trying to Regulate AI in Hiring
New York City's new law on AI in hiring is a cluster. It requires notices and audits that won't tell you much and won't make any difference in preventing or discouraging discrimination.
Notices
The law requires NYC employers to give at least 10 days' notice to all NYC applicants that AI is used in the selection process. It also requires that the employer allow applicants to request an alternate selection process that does not use the tool.
The ten-day requirement is nuts. It means employers cannot use the very tools designed to save them time and effort until they give this notice and wait ten days. It bogs down the process for applicants and employers alike. Nobody wants this.
And it probably won't make any difference. If you are applying for a job, are you going to ask for a different process and come out looking like a troublemaker before they've even sorted the resumes?
What would the alternate process be? Well, it's likely to be that a human being looks at your resume and decides whether or not you make the cut. This does not mean there will be less bias in the hiring process; it could increase bias.
In addition to the individual applicant notices, the employer must also give notice of "the type of data collected for the automated employment decision tool, the source of such data and the employer or employment agency’s data retention policy," as well as the results of an annual bias audit of the employer's hiring.
First, most employers don't know what data is used in the tools that sort resumes, rank potential candidates, or predict their success in the role.
The data definitely includes applicant resumes. But after that, it can include information from the employer about its current workforce and their performance as well as the vendor's anonymized data about other customers' employees. AI hiring tools can also rely on all sorts of other information available about candidates such as social media profiles and public records, including court filings.
Employers won't be able to make these disclosures without information from the vendors who designed the tools. And vendors are not going to want to disclose the recipe for their secret data sauce. Some of the information may also be protected by trade secret or other intellectual property laws.
Audits
The other big requirement of the New York City law is the employer must have its hiring data audited at least once a year by an "independent auditor" for disparate impact. Again, preventing discrimination is a great idea. Again, the plan will defeat the purpose.
First, the audit won't tell you whether the tool is having a disparate impact unless the employer is also tracking whether the tool's recommendations were actually followed. If the people making hiring decisions don't follow the suggestions and end up hiring fewer people in protected classes, it may be the people, not the tool.
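To see why the tracking matters, here is a minimal sketch with invented numbers: compute the impact ratio separately for the tool's recommendations and for the final human decisions, and you can see at which stage the disparity actually enters. Every count below is hypothetical.

```python
# Hypothetical counts showing why you must track whether the tool's
# recommendations were actually followed. All numbers are invented.

def impact_ratio(stage):
    """Selection-rate ratio of Group B relative to Group A at a given stage."""
    rate_a = stage["Group A"]["selected"] / stage["Group A"]["considered"]
    rate_b = stage["Group B"]["selected"] / stage["Group B"]["considered"]
    return rate_b / rate_a

# Stage 1: who the tool recommended out of the applicant pool.
tool_recommendations = {
    "Group A": {"considered": 100, "selected": 50},
    "Group B": {"considered": 100, "selected": 48},  # nearly even rates
}

# Stage 2: who the humans actually hired from the recommended pool.
final_hires = {
    "Group A": {"considered": 50, "selected": 25},
    "Group B": {"considered": 48, "selected": 12},   # disparity appears here
}

print(f"tool stage impact ratio:  {impact_ratio(tool_recommendations):.2f}")  # 0.96
print(f"human stage impact ratio: {impact_ratio(final_hires):.2f}")           # 0.50
```

In this made-up scenario, an audit of hiring outcomes alone would flag the tool, even though the tool's recommendations were nearly even and the disparity came from the human decisions afterward.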
Second, employees come and go for all sorts of reasons and the demographics of any employee population, especially new hires, are always dynamic. An annual audit is about as useful as an annual engagement survey.
Third, the required audit is the same analysis used under Title VII: it only tells you whether a protected class's selection rate is less than 80% of the rate for the most-selected group. This is the four-fifths rule, which deals with evidentiary presumptions and burdens of proof. It doesn't tell you whether or not there is discrimination.
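For the arithmetic, here is a minimal sketch of the four-fifths calculation, again with hypothetical applicant and selection counts:

```python
# Minimal sketch of the four-fifths (80%) rule. Applicant and selection
# counts are hypothetical, for illustration only.

groups = {
    # group: (applicants, selected)
    "Group A": (100, 60),
    "Group B": (80, 30),
}

rates = {g: selected / applicants for g, (applicants, selected) in groups.items()}
top_rate = max(rates.values())  # rate of the most-selected group

for group, rate in rates.items():
    ratio = rate / top_rate
    status = "flagged (below 4/5)" if ratio < 0.8 else "not flagged"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f}, {status}")
```

Here Group B's 37.5% selection rate is 0.625 of Group A's 60%, so it falls below the 0.8 threshold. But as noted above, being "flagged" only shifts evidentiary burdens; the number by itself doesn't establish discrimination.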
Last, there is absolutely no requirement that the employer do anything if a disparate impact is revealed. Sure, discrimination is illegal, and the employer has to disclose the results of the audit either on its website or on request. But this law just makes it a lot harder to hire residents of New York City and does almost nothing to prevent discrimination.
The new guidance does give more information on who the law applies to and what is required. Here's a great summary of the guidance.
On June 29, 2023, the New York City Department of Consumer and Worker Protection (DCWP) issued new guidance on the enforcement of the city’s law regulating the use of automated employment decision tools (AEDTs) ahead of the July 5, 2023, effective date for final rules implementing the law.
Quick Hits
The DCWP released a set of frequently asked questions (FAQs) to flesh out the regulations implementing the city’s AEDT law, which regulates the use of automated decision-making tools and artificial intelligence (AI) to make employment decisions. The law, which technically took effect on January 1, 2023, requires that employers and employment agencies ensure AEDTs have been subjected to bias audits prior to being used. The law also requires specific notifications and disclosures to job candidates about the use of such tools.
On April 6, 2023, New York City published final rules that explained the scope of “machine learning, statistical modeling, data analytics, or artificial intelligence,” standardized the bias audit standards required under the law, and clarified notice requirements. Those final rules are set to take effect on July 5, 2023.
The DCWP’s new FAQs provide further clarification of which employers and employment agencies may want to be aware. Here are some key points.
AEDTs Used ‘In The City’
The FAQs address what it means for an employer or employment agency to use an AEDT “in the city,” which determines whether the law applies.
The DCWP stated that if the law applies, then a bias audit must be completed before use of an AEDT, and job candidates who are residents of New York City must be notified that the employer or employment agency uses an AEDT.
Potential Disparate Impact
The DCWP explained that while the AEDT law requires employers and employment agencies to conduct bias audits, the law does not “require any specific actions based on the results of a bias audit.” However, the FAQs note that employers and employment agencies that use AEDTs could be subject to potential liability under federal, state, and New York City laws prohibiting employment discrimination.
This explanation comes as federal enforcement agencies, including the U.S. Equal Employment Opportunity Commission (EEOC), have been scrutinizing whether such technology can lead to discriminatory results. Notably, the EEOC in May 2023 issued new guidance that specifically highlighted that the use of such tools to make job applicant or employee selection decisions could have a disparate impact on protected groups.
Use of Historical Data in Bias Audits
According to the final rules implementing the AEDT law, multiple employers or employment agencies may rely on the same bias audit conducted using historical data of other employers or employment agencies as long as they had “provided their own historical data from its own use of the AEDT to the independent auditor” or had never used the AEDT.
The FAQs explain that there is “no specific requirement about the historical data used for a bias audit” but that the “summary of the results of a bias audit must include the source and explanation of the data used to conduct the bias audit.” The FAQs state that the audit should explain “if the historical data was limited in any way, including to a specific region or time period.”
Additionally, further clarifying the intended meaning of historical data, the FAQs also specify that “[i]mputed or inferred data” cannot be used to conduct a bias audit.
Use of Test Data in Bias Audits
The FAQs specify that “DCWP has not set specific standards for statistical significance,” and explain that “[i]f an independent auditor determines the historical data is insufficient, test data may be used” and the “summary results of the bias audit must explain why test data was used and include the source and description of the data.”
The FAQs explain that to “allow for flexibility and development of best practices” the DCWP “has not set requirements for test data.” Again, the FAQs state that summary results should explain the source of the data and if test data is used, “should explain how the data was sourced or developed.”
Vendor-Provided Bias Audits
The FAQs explain that a vendor may “have an independent auditor do a bias audit of its tool” and that the law “does not prohibit a vendor from having a bias audit done or coordinating the collection of data to use to conduct a bias audit.” While the FAQs are not clear whether an auditor is independent if they work for the vendor that develops and distributes the AEDT, the FAQs note that employers and employment agencies have ultimate responsibility for ensuring that the audit is done before using an AEDT.
Notice Requirements
The FAQs state that the notice requirement under the AEDT law technically only applies to employees and applicants “who are residents of New York City.” Further, the FAQs provide that “notice on a website does not have to be position-specific.”
Next Steps
Employers are increasingly relying on automated decision-making tools and AI systems to make employment-related decisions, including hiring, screening job candidates, and improving workplace efficiency. New York City is one of several jurisdictions to place guardrails on the use of this emerging technology as federal regulators, such as the EEOC, have raised concerns that the tools could result in discrimination against individuals with disabilities or other protected groups.
Employers and employment agencies in New York may want to consider reviewing the extent to which they are already using such tools and whether such tools used or planned to be used have been subjected to bias audits.