The FTC unveiled a lengthy report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, warning companies about commercial uses of big data and the discriminatory impact it may have on low-income and underserved populations. “Big data” refers to the ubiquitous collection of massive amounts of consumer information by companies, which may be analyzed to reveal certain consumer patterns, trends, and associations.

While the term may conjure up an ominous feeling for some, big data has brought numerous advantages to society by efficiently and effectively matching products and services to consumers across all demographics. However, the FTC’s report warns that potential inaccuracies and biases may harm low-income and underserved populations through the misuse of personal information, the reinforcement of existing biases and disparities against certain populations, the perpetuation of fraud against vulnerable consumers, and the weakening of the overall effectiveness of consumer choice. While companies can design efficient big data algorithms that learn from human patterns and behavior, those algorithms may also “learn” to generate biased results.

During its 2014 workshop on big data, the FTC identified concerns that companies could use big data to exclude low-income and underserved communities from credit and employment opportunities. As a result, the FTC published the recent report to identify the benefits and risks of using big data, discuss its current research on the issue, and suggest legal compliance questions businesses should start asking themselves. The report also reinforces the FTC’s jurisdiction and explains how companies should use data in order to comply with various consumer protection laws, including the Fair Credit Reporting Act (“FCRA”), the Federal Trade Commission Act, and the Equal Credit Opportunity Act, which prohibits creditors from discriminating against applicants based on factors such as race, national origin, religion, sex, marital status, and age.

To push companies to begin reviewing their big data collection and usage practices, the FTC sets forth a list of compliance questions, such as:

  • Does your data model account for biases at both the collection and analytics stages? For example, if an employer uses big data analytics to synthesize information gathered on successful existing employees to define a “good candidate,” then the employer could risk incorporating previous discrimination in employment decisions into new employment decisions.
  • If you use big data products for eligibility decisions, is your FCRA house in order? Have you certified that you have a “permissible purpose” to obtain the information, and that you won’t use it to violate equal opportunity laws? Do you give people the pre-adverse action and adverse action notices required by the FCRA?
  • If you use big data analytics in a way that might adversely affect people’s ability to get credit, housing, or employment, are you careful not to treat them differently on a prohibited basis, such as by race or national origin?
  • Do you explain to consumers how you use their information? Are you honoring promises you make about your data practices?
  • Do you maintain reasonable security to protect sensitive information of consumers?

Continuing its foray into the world of data privacy, the FTC will host “PrivacyCon” on January 14, 2016 – a workshop aimed at discussing the latest research and trends in consumer privacy and data security. Stay tuned for our coverage of the workshop.

To read the full FTC report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, click here.