In November 2021, the New York City Council passed a bill requiring that artificial intelligence ("AI") tools used by employers to make or assist in hiring decisions or internal promotions undergo bias audits to screen for discriminatory effects. Under the City Charter, the bill became law a month later after then-mayor Bill de Blasio returned it unsigned to the Council. While the law was originally set to go into effect on January 1, 2023, the New York City Department of Consumer and Worker Protection ("DCWP") twice delayed enforcement. Following DCWP's adoption of a final rule (the "Final Rule"), New York City-based employers and employment agencies (defined below) must begin complying with the law on July 5, 2023, and should be aware of their obligations if they use AI to assist in making hiring or promotion decisions.
Overview of the Law
The law provides that when AI tools "substantially assist" or replace human decision making in the screening or hiring process, they are considered "Automated Employment Decision Tools" ("AEDT") and may not be used by New York City-based employers (or employment agencies1) unless both of the following are met:
1. The AEDT "has been the subject of a bias audit" within a year of its use; and
2. The results of the bias audit are made publicly available, and candidates receive notices regarding the use of AEDT.
The law applies to the use of AEDT at each stage of the hiring process and not just to final hiring or promotion decisions. So, AEDT used to "screen out" certain résumés or narrow an applicant pool down to a manageable number falls under the law's coverage. However, the law does not apply to tools used to identify potential candidates who have not yet applied for a position.
When Does AI Become an "Automated Employment Decision Tool"?
The law broadly defines AEDT to capture "any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or [a hiring/screening] recommendation." However, the law only considers such technology to be AEDT where it "is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons." In the Final Rule, DCWP has defined "to substantially assist or replace discretionary decision making" to mean any of the following:
- to rely solely on a simplified output (score, tag, classification, ranking, etc.), with no other factors considered;
- to use a simplified output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set; or
- to use a simplified output to overrule conclusions derived from other factors including human decision-making.
The law would, therefore, cover a computer program that eliminates certain applicants from consideration before they even reach an HR department, as well as technology that an employer allows to overrule a hiring manager's recommendation that a particular candidate be interviewed. But it would appear not to cover an algorithm that makes initial suggestions about whom to interview, as long as HR personnel review each application and the employer does not give more deference to the algorithm than to recommendations from individuals in the HR department. The law also categorically excludes from the definition of AEDT purely organizational or security-related technology such as junk email filters, antivirus software, and certain compilations of data.
What is a "Bias Audit"?
The law requires that at least once a year, each AEDT must be subject to a bias audit, defined as "an impartial evaluation by an independent auditor," to assess whether the AEDT could have a disparate impact on certain protected classes of applicants. The regulations direct independent auditors, who must be free from enumerated conflicts of interest, to calculate "impact ratios" for various classes of workers in order to check for disparate impact caused by the AEDT. The ratios must be calculated not only for individual categories, such as male, female, Hispanic or Latino, and Black or African American, but also for a variety of "intersectional categories," such as Hispanic women and non-Hispanic white men.
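To make the mechanics concrete, the sketch below computes selection rates and impact ratios for hypothetical categories, including intersectional ones. All category names and counts are invented for illustration, and the sketch assumes each impact ratio is expressed relative to the most-selected category; the Final Rule itself prescribes the categories and methodology an independent auditor must actually follow.

```python
# Illustrative impact-ratio calculation. Category names and counts are
# hypothetical; the Final Rule governs the actual audit methodology.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a category whom the AEDT selected."""
    return selected / applicants

# Hypothetical historical data: category -> (selected, total applicants),
# including intersectional categories.
data = {
    "Hispanic or Latino men": (48, 120),
    "Hispanic or Latino women": (39, 130),
    "Non-Hispanic White men": (90, 150),
    "Black or African American women": (45, 100),
}

rates = {category: selection_rate(s, n) for category, (s, n) in data.items()}
top_rate = max(rates.values())  # rate of the most-selected category

# Impact ratio: each category's selection rate relative to the top rate.
for category, rate in rates.items():
    print(f"{category}: rate {rate:.2f}, impact ratio {rate / top_rate:.2f}")
```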
Under the Final Rule, data from multiple employers (or agencies) who use the same AEDT can be aggregated for purposes of the bias audit. However, to benefit from such data aggregation, an employer must submit its own data for an audit (unless it has never used the AEDT). In the event that insufficient historical data is available for a bias audit, auditors may rely on test data--sample data created to test the algorithm--for audits.
Using this data, the independent auditor and the employer can assess the degree to which AEDT might positively or negatively impact certain classes, including intersectional categories, and AI developers and employers can consider--perhaps in consultation with legal counsel--what adjustments to the AI might be appropriate to address any disparate impact.
What are the publication requirements?
New York City-based employers (and employment agencies) who use AEDT must post on their websites a summary of the results of the most recent audit of each AEDT they use, as well as the date they began using such tool(s). The summary must include:
- the source of the data used to conduct the bias audit, and how the data was used;
- the number of individuals whose demographic data is unknown;
- the number of applicants or candidates evaluated by the AEDT; and
- the impact ratios for all categories, including underlying "selection or scoring rates," as applicable.
Further, at least ten (10) business days before using an AEDT, New York City-based employers must provide applicants who are residents of New York City with:
- notice that an AEDT will be used in the assessment of their candidacy;
- information about the type and source of the data considered by the AEDT, and about the employer's applicable data retention policy; and
- procedures for requesting "an alternative selection process or a reasonable accommodation under other laws."
These notice requirements may be satisfied, in all cases at least ten business days prior to the use of the AEDT, by: (1) a conspicuous posting on the employment section of the employer's website; (2) inclusion of such information in the job posting; or (3) individual notice to the candidate by mail or email. With respect to the alternative selection process notice requirement, the law requires only that applicants be informed of their right to request a non-AEDT selection process, not that one actually be provided. However, employers will still need to consider whether a disability (or religious belief or other covered ground) could disadvantage an applicant in the AEDT screening, potentially requiring an accommodation to AEDT use, such as manual screening, under anti-discrimination statutes.
What are the penalties for non-compliance?
There is no express private right of action under the City law, meaning that applicants are unlikely to be able to sue in their own right for violations of the law. However, in City enforcement actions, employers can face fines of up to $500 for a first violation and up to $1,500 for each subsequent violation. Importantly, each day on which an AEDT is improperly used constitutes a separate violation, as does each failure to provide an applicant with a required notice, so fines can accumulate quickly, as the illustration below shows.
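As a rough sense of scale, the sketch below computes maximum hypothetical exposure from the per-violation figures stated above. The day and notice counts are invented for the example, and actual penalties would depend on how DCWP counts and assesses violations in a given case.

```python
# Hypothetical fine exposure under the law's penalty structure:
# $500 for the first violation and up to $1,500 for each subsequent one.
# Each day of improper AEDT use and each missed applicant notice
# counts as a separate violation.

FIRST_FINE, SUBSEQUENT_FINE = 500, 1_500

def max_exposure(violations: int) -> int:
    """Maximum cumulative fines for a given number of violations."""
    if violations <= 0:
        return 0
    return FIRST_FINE + (violations - 1) * SUBSEQUENT_FINE

# Example: 30 days of non-compliant use plus 20 missed notices.
print(max_exposure(30 + 20))  # -> 74000, i.e., $74,000
```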
Are there future rules and regulations on the horizon?
The topic of AI in recruiting has also gained attention at the federal level. The Biden Administration has published a Blueprint for an AI Bill of Rights, which asserts that "algorithms used in hiring . . . decisions have been found to reflect and reproduce existing unwanted inequities or embed new harmful bias and discrimination." In January 2023, as part of its AI and Algorithmic Fairness Initiative launched in 2021, the Equal Employment Opportunity Commission ("EEOC") held a public hearing on AI in the employment context titled "Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier." At the hearing, the EEOC heard testimony from multiple stakeholders, including computer scientists, civil rights advocates, legal experts, and employer representatives.
The EEOC has also published guidance on the potential for AI to cause disparate impact under Title VII, as well as separate guidance about potential Americans with Disabilities Act ("ADA") implications. The Title VII guidance reminds employers that the use of AI in hiring could give rise to disparate impact liability, particularly where the selection rate of a class covered by Title VII, such as race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), or national origin, is "substantially different" than the selection rate for another class. Under long-standing EEOC guidelines on employer tests and selection tools, a selection rate is "substantially different" if the ratio between one group and another is less than four-fifths, or 80%. For example, if an AI tool's selection rate were 30% for Black applicants and 60% for White applicants, the ratio would be 50%, which is less than 80%, indicating probable disparate impact. So, in connection with bias audits to comply with City law, employers should also ensure that AEDT selection ratios remain within EEOC-recommended thresholds.
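The four-fifths comparison in the example above is straightforward to verify mechanically. A minimal sketch, using the hypothetical 30% and 60% rates from that example:

```python
# Four-fifths (80%) rule check from the EEOC's long-standing guidelines
# on employer tests and selection tools, using the hypothetical rates
# from the example above.

def four_fifths_flag(rate_a: float, rate_b: float, threshold: float = 0.8) -> bool:
    """True if the ratio of the lower selection rate to the higher one
    falls below the threshold, suggesting a "substantially different"
    selection rate (possible disparate impact)."""
    low, high = sorted((rate_a, rate_b))
    return (low / high) < threshold

print(four_fifths_flag(0.30, 0.60))  # 0.30 / 0.60 = 0.50 < 0.80 -> True
```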
In the ADA guidance, the EEOC highlights that employers are required to consider candidates with a disability for employment if they can perform the essential functions of the position (with or without a reasonable accommodation). AI used in hiring could run afoul of the ADA where, for example, it screens out an applicant because of a gap in their résumé, and that gap was caused by a disability. Accordingly, employers should, as the EEOC recommends, confer with AI vendors to ensure that the applicable software or algorithms were developed with due consideration for individuals with disabilities.
We will also monitor any New York State legislative action on AI in employment, given that the State often follows suit on City employment laws, as it has done in the recent past with sick leave and salary range posting requirements.
What can employers do to prepare for the July 5, 2023 City implementation deadline?
As a first step, employers will need to think carefully about how they might already be using AEDT in the recruiting and hiring process. If AI is used at any step to "filter" résumés or recommend "top candidates," covered AEDT might be in use, and employers will be responsible for ensuring that they comply with the City's law. We expect that, as a practical matter, employers will look to AEDT vendors to coordinate bias audits. Employers should reach out to any provider of AEDT as soon as possible to discuss next steps, and be ready to provide any historical data requested by the independent auditor. To protect applicant data, employers might consider requiring auditors and vendors to sign confidentiality agreements before sharing any such data.
Employers should also be aware that the City law does not obligate employers to collect demographic data from job applicants.2 As a general matter, employers should think very carefully before inquiring about an applicant's protected characteristics in a non-anonymized manner, as doing so could give rise to potential discrimination claims.