On January 6, 2016, the Federal Trade Commission (“FTC”) issued the report Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues (“Report”), based on prior workshops and subsequent public comments on Big Data usage. The Report concentrates on data usage, not collection, and on the application of current law to such usage rather than on perceived gaps in regulation. Industry can take comfort in the Report’s non-expansive application of current law and its very limited application of FTC unfairness authority to Big Data usage.
Data brokers, platforms, analytics companies, e-tailers, ad networks, advertisers, creditors, employers, and others that are looking to Big Data to help make decisions and to develop, offer, and market products and services should look to the Report for guidance. Key take-aways include:
- Beware Application of the Fair Credit Reporting Act (“FCRA”): FCRA applies to companies that prepare and sell reports on consumers to be used for credit, employment, insurance, housing, and similar eligibility determinations, and to companies that obtain and use such reports. The FTC warns that use of Big Data (e.g., predictive analytics) in such a manner is covered under FCRA. This results in obligations to ensure maximum possible accuracy and to provide consumers with notice and an opportunity to correct errors. The FTC has brought recent enforcement actions regarding “risk-based” pricing of credit or insurance (e.g., related to cell phone services), and the FTC makes clear that using Big Data analytics to establish such pricing triggers FCRA.
- Consider Equal Employment Opportunity Laws and the Impact Big Data Analytics-Based Decision Making May Have on Protected Classes Such as Race, Gender, Disability, and Ethnic Origin: The federal nondiscrimination laws prohibit discrimination based on certain protected characteristics, including in many instances both “disparate treatment” and “disparate impact” (an unintended discriminatory effect not explained by neutral factors). The FTC warns, as examples, that use of zip codes to establish preferential treatment could implicate race and that certain data characteristics could implicate gender. The FTC also counsels that using such data to make marketing decisions about which offers are made to which consumers (e.g., prime versus subprime mortgages) could violate equal opportunity laws. The key is whether opportunities are offered, and eligibility determined, based on a particular consumer’s qualifications, not class-based generalizations.
- Authority Under the FTC Act Is Narrow, but the Scope of the FTC’s Deception and Unfairness Authority Needs to Be Considered: The Report reminds companies that making inaccurate statements regarding their Big Data practices can constitute a deceptive practice actionable under Section 5 of the FTC Act. It further warns that failure to disclose material information that consumers ought to know could lead to a deception claim. The Report also explains that although unfairness authority under Section 5 is limited, where Big Data practices create a likelihood of substantial harm to consumers that is not outweighed by benefits to consumers or competition, the FTC can prosecute for unfairness. As examples, based on recent enforcement actions, the Report describes situations where data brokers knowingly provided consumer data to fraudsters and identity thieves, or where reasonable data security was not maintained. Section 5 violations can be prevented by accurately describing all material data practices (transparency and accurate disclosure), taking reasonable steps to protect data (data security), and preventing its illegal use (diligence and contractual use restrictions).
The Report goes on to suggest best practices for companies to prevent violations of current federal law, including a list of questions designed to identify potentially problematic uses of Big Data.
Companies using Big Data should also consider the application of state laws. For a detailed analysis of potential state law issues regarding use of Big Data for dynamic pricing, see our prior blog post here.