The use of big data in employment decisions—a practice often referred to as “people analytics”—has exploded in recent years. Lately, however, the practice is drawing attention not only for its promise of faster, more efficient hiring, but also for the significant risks it can pose. One key risk is the potential for a disparate impact claim, particularly on a class-wide basis. So while proponents of using software tools and algorithms to identify and select job candidates claim people analytics is more efficient and effective than traditional recruiting and selection procedures, employers should take care when choosing tools and vendors, and should proactively monitor their implementation to avoid big liability.

Examples of people analytics include software programs that source and match candidates to employers’ job postings based on certain words used in the candidates’ applications, resumes, and/or social media profiles; automated online reference checking tools that assess the “fit” of applicants for an employer’s culture; computer game tests that estimate applicants’ cognitive abilities; and other online personality assessments. Undoubtedly, these types of programs and tools have the appeal of faster and more efficient hiring. Proponents even claim the algorithms can be programmed to mitigate biases and help create a more diverse workplace. One big problem, however, is that many of these programs have not been appropriately validated to comply with federal and state anti-discrimination laws, and make selection “decisions” based on factors that are not shown to be job-related and consistent with business necessity. Critics also claim that many programs simply codify and perpetuate existing biases, causing these programs to screen out protected groups of individuals (for example, women or minorities) in a statistically significant way. The combination of these factors can mean big risk, particularly at a time when disparate impact class actions are on the rise.

Plaintiffs’ attorneys and the EEOC are increasingly pursuing class-based claims that focus on the use of uniform policies, tests, or other employee selection procedures that allegedly have a statistically significant adverse impact on a protected group and are not sufficiently job-related and consistent with business necessity. This may be because, after the Supreme Court’s decision in Wal-Mart Stores, Inc. v. Dukes, disparate treatment class actions are harder to certify absent an easily identifiable uniform policy to form the “glue” holding the claims together. A software program or algorithm used to make hiring decisions arguably constitutes such a policy.

Moreover, employers who use selection devices such as software programs or algorithms must ensure they comply with the Uniform Guidelines on Employee Selection Procedures, or “UGESP,” which contain detailed and technical requirements for demonstrating their validity. While a software vendor’s documentation supporting the validity of the test or program may be helpful, the responsibility for ensuring compliance with UGESP ultimately resides with the employer. Thus, if the vendor’s validity documentation is defective, or the vendor failed to undertake the proper validity analysis, the employer will be on the hook if the program or test is shown to have adverse impact.

Despite these risks, the appeal of these new techniques indicates they are more than a short-lived trend. Below are some tips to help minimize the risk of a disparate impact class claim triggered by the use of “big data”:

  • Ensure meaningful understanding of the tests and other selection procedures being administered. Do not simply turn things over to a vendor to make employment selection decisions without careful oversight.
  • Ensure that employment tests and other selection methods are properly validated for the positions and purposes for which they are used.
  • Obtain the necessary documentation from vendors regarding adverse impact and validation. Consult with counsel, and potentially an I/O psychology expert, to ensure compliance with UGESP.
  • Ensure vendor contracts contain appropriate indemnification provisions.
  • Regularly consult counsel, and potentially an I/O psychology expert, to conduct job analyses and to assess the adverse impact of overall employment practices.
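As a concrete illustration of what an adverse impact screen looks like in practice, UGESP’s well-known “four-fifths” (80%) rule of thumb treats a selection rate for any group that is less than four-fifths of the rate for the highest-rate group as generally evidencing adverse impact. The sketch below is a minimal, hypothetical Python example of that arithmetic using made-up applicant numbers; it is not a substitute for the statistical significance testing and validation analysis UGESP contemplates, which should be handled with counsel and an I/O psychology expert.

```python
# Hypothetical sketch of the UGESP "four-fifths" (80%) rule of thumb.
# A group's selection rate below 80% of the highest group's rate is
# generally regarded as evidence of adverse impact. The group labels
# and applicant counts below are invented for illustration only.

def four_fifths_check(group_counts):
    """For each group, compute its selection rate, its ratio to the
    highest group's rate, and whether it falls below the 0.8 threshold.

    group_counts: dict of group label -> (number selected, number of applicants)
    Returns: dict of group label -> (rate, ratio_to_highest, flagged)
    """
    rates = {g: sel / apps for g, (sel, apps) in group_counts.items()}
    highest = max(rates.values())
    return {
        g: (rate, rate / highest, rate / highest < 0.8)
        for g, rate in rates.items()
    }

# Made-up example: 48 of 80 applicants hired in one group (rate 0.60),
# 12 of 40 in another (rate 0.30). The ratio 0.30 / 0.60 = 0.50 is
# below 0.8, so the second group is flagged.
results = four_fifths_check({"group_a": (48, 80), "group_b": (12, 40)})
```

Even a simple monitoring script like this can surface a problem early, but a flagged ratio is only a starting point for the job-relatedness and business-necessity analysis, not a legal conclusion.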

Be mindful that even computer programs can be tainted by human bias, which can infect the programs themselves and produce statistically adverse outcomes for protected groups.