While still complying with the ADA.

The Equal Employment Opportunity Commission issued some helpful guidance on Friday related to employers' use of artificial intelligence and algorithms and the Americans with Disabilities Act. The guidance purports to relate to all aspects of employment, but it's really focused on hiring.

The guidance is short and easy to read, even if you're tech-challenged, as I am. Rather than restate it here, I'll provide a boiled-down version.

The ADA requires reasonable accommodation, not only for current employees, but also for job applicants and individuals who have offers of employment but haven't started work. Therefore, if you're an employer using software, algorithms, or AI to screen job applicants or offerees, you need to offer reasonable accommodations to those who need them in connection with the application or screening process. The EEOC recommends clearly stating up front that the company makes reasonable accommodations for individuals with disabilities, providing a contact, and responding to any such requests promptly.

If the individual claims a disability but the disability is not obvious, the employer can legally ask for appropriate medical documentation before deciding whether to make reasonable accommodations. This does not violate the ADA.

Hiding your head in the sand won't help. Employers can be liable for violations of the ADA in connection with the use of AI in hiring, even if the screening is performed by the vendor rather than the employer. In fact, employers can be liable if the applicant asks the vendor for a reasonable accommodation and the vendor either says no or ignores the request, and the employer doesn't even know it happened.

This is because the vendor would normally be considered an "agent" of the employer, which would make the employer liable for what the agent does within the scope of the agency. Therefore, employers, as I see it, you have three options: (1) Designate yourself as the entity to handle all reasonable accommodation requests from applicants, (2) require your vendor to immediately refer all requests for reasonable accommodation to you (make sure that's in the contract!), or (3) ensure that your vendor understands the ADA and can competently and legally field all requests for reasonable accommodation. My preference would be Option 1 because if you want something done right, do it yourself. Option 3 would be my last choice, but if you have a really good vendor, it might work.

Using AI that is "bias-free" or "validated" doesn't necessarily mean you're in the clear. According to the EEOC, some vendors say that their products are "bias-free," but this usually means only that the tool has been tweaked to avoid disparate impacts based on race, sex, national origin, or color -- in other words, the categories protected under Title VII. As we know, the ADA is different because it often requires an individualized assessment and, in appropriate cases, reasonable accommodations ("special treatment") for particular individuals. Thus, a "bias-free" tool may not help much with ADA compliance. "Validated" means only that the tool has been found to assess for characteristics that are needed for performance of the job. Again, that may not do any good with an individual who is capable of performing the job but requires reasonable accommodation.

And, whatever you do, don't let your AI ask for medical information from applicants! Pre-employment medical inquiries are strictly prohibited by the ADA until after a conditional offer of employment has been made. That rule applies to applicants with disabilities as well as applicants without disabilities. The only exception is in response to an applicant's request for reasonable accommodation. At that point, a human rather than an algorithm should be doing the follow up.

The applicant may be entitled to reasonable accommodation even after he or she has failed the screen. Don't assume you can say "no" to an accommodation request just because the applicant has already failed the screening process. It's possible that the individual didn't know reasonable accommodation was an option, or didn't realize how much the unaccommodated disability would affect his or her ability to successfully make it through the screening process. The obligation to consider reasonable accommodations exists whether the individual asks before the screening, during the screening, or after the screening. (Again, you don't have to take the applicant's word for it -- you can require appropriate medical documentation of the disability.)

"'Promising Practices'? We got 'Promising Practices'!" The "Promising Practices" terminology is a little hokey, but the EEOC does have (mostly) some good tips for employers seeking to hire the best candidates for the job without violating the ADA.

  • Train the designated employees "to recognize and process requests for reasonable accommodation as quickly as possible" in connection with the applicant screening process.
  • Train the designated employees to come up with other ways of assessing job candidates if the standard tools screen them out because of disabilities.
  • If requests for reasonable accommodation go to the vendor rather than to you, make sure the vendor refers the requests to you immediately.
  • Use tools "that have been designed to be accessible to individuals with as many different kinds of disabilities as possible." I'm not sure this is realistic, but it's a worthy aspiration.
  • Let all applicants know that reasonable accommodations are available for those who have a legitimate need (and give them the necessary contact information to allow them to make requests).
  • "[Describe], in plain language and in accessible formats, the traits that the algorithm is designed to assess, the method by which those traits are assessed, and the variables or factors that may affect the rating." This is the only piece of advice from the EEOC that I disagree with -- assuming I'm understanding it correctly -- because providing this level of information in advance seems to make it too easy for applicants to "game" their answers. Remember the old non-virtual interview question, "What's your greatest shortcoming?" And the correct answer was something like, "Well, I'm a perfectionist, and I'm too devoted to my work, and I work such long hours that it sometimes gets in the way of my personal life." So, with all respect to the EEOC, I'd keep this description pretty general. For example, "We administer a personality test by computer that has multiple choice questions. We use the answers to determine whether the applicant will be compatible with our corporate philosophy and with co-workers and customers. If you need a reasonable accommodation in connection with the format of the test or the assessment of your test results, please click here."
  • "Ensur[e] that the algorithmic decision-making tools only measure abilities or qualifications that are truly necessary for the job."
  • "Ensur[e] that necessary abilities or qualifications are measured directly, rather than by way of characteristics or scores that are correlated with those abilities or qualifications." (Emphasis is the EEOC's.)
  • Make sure the tool doesn't make improper pre-offer medical inquiries.

Interested in more? Back in February, Thy Bui, Matt Gurnick, and I had the honor of hosting a video webinar with EEOC Commissioner Keith Sonderling, who has taken special interest in the topic of AI and employment discrimination. The webinar is still available for your viewing pleasure here.