The Council of the District of Columbia is currently considering landmark legislation that would impose significant obligations on many entities that use algorithms. In sum, the Stop Discrimination by Algorithms Act precludes entities from using algorithms that make decisions based on protected traits. Notably, the bill applies to a broad range of industries, imposes various affirmative requirements and provides enforcement authority to the Office of the Attorney General for the District of Columbia (“OAG-DC”). The bill also includes a private cause of action, penalties up to $10,000 per violation and punitive damages. Note that this bill was introduced at the request of OAG-DC. Given the frequent collaboration between state attorneys general, companies should expect to see this type of legislation in other jurisdictions.
THE BILL APPLIES TO A BROAD RANGE OF INDUSTRIES
This bill applies to any organization that uses algorithms to make decisions related to an offer of credit, insurance, education, employment, housing or a place of public accommodation1, and that meets one of the following criteria:
- Possesses or controls personal information (which includes purchasing or consuming history) on more than 25,000 residents of the District of Columbia;
- Has greater than $15 million in average annualized gross receipts for the three years preceding the most recent fiscal year;
- Is a data broker or entity that derives at least 50% of its annual revenue by collecting, assembling, selling, distributing, providing access to or maintaining personal information, and some proportion of the personal information concerns a District resident who is not a customer or an employee of that entity; or
- Performs algorithmic eligibility determinations or algorithmic information availability determinations on behalf of another entity.
THE BILL RENDERS IT UNLAWFUL FOR AN ALGORITHM TO MAKE DECISIONS BASED ON PROTECTED TRAITS WITH NARROW EXCEPTIONS
The bill makes it unlawful for a covered entity to make a decision stemming from an algorithm where such decision is based on an individual’s actual or perceived race, color, religion, national origin, sex, gender identity or expression, sexual orientation, familial status, source of income or disability in a manner that makes important life opportunities unavailable to an individual or class of individuals. Any practice that has the effect or consequence of violating the preceding sentence is deemed an unlawful discriminatory practice. The bill explicitly applies to marketing practices that are based on algorithms.
The bill additionally provides for a narrow “business necessity” exception. Under that exception, the entity bears the burden of proving that the business cannot be conducted without the challenged practice; mere business efficiency does not qualify as a business necessity. There are also narrow exceptions for religious organizations.
THE BILL IMPOSES VARIOUS AFFIRMATIVE REQUIREMENTS
- Written agreements. Covered entities that rely on a service provider to conduct an algorithmic determination must execute a written agreement with that service provider to implement and maintain measures reasonably designed to ensure compliance.
- Notice. Covered entities must create a one-page notice in English and any other language spoken by more than 500 DC residents2. This notice must be “continuously and conspicuously available in various formats,” including on the entity’s website. Significantly, covered entities must send this notice to individuals prior to using an algorithm to make a decision about the individual.
- Adverse Action Disclosure. If a covered entity takes adverse action with respect to any individual based on an algorithm, the covered entity must provide to that individual a specific disclosure explaining, among other things, the factors on which the determination depended. Covered entities must provide individuals an opportunity to submit corrections to the information used in the algorithm.
- Audits and Impact Assessments. Entities must audit their algorithmic practices and conduct annual impact assessments to determine whether their algorithms discriminate based on the protected traits described in Section II of this article. Entities must also create and retain an audit trail for at least five years for each algorithmic decision. Notably, this audit trail must include, among other things, the algorithm used to make each determination and the data used to train the algorithm.
- Annual Report. A covered entity must provide an annual report to OAG-DC containing specified information including performance metrics the entity uses to gauge the accuracy of the assessments, a description and rationale of each decision, and whether the entity has received complaints from individuals regarding the algorithmic determinations it has made.
OAG-DC AND PRIVATE INDIVIDUALS WOULD HAVE THE AUTHORITY TO INVESTIGATE AND ENFORCE THIS ACT
Any covered entity or service provider that violates any provision of this act is exposed to a civil penalty of not more than $10,000 per violation if OAG-DC brings an enforcement action.
If a private plaintiff brings a civil action, any covered entity or service provider is exposed to damages of not less than $100 and not greater than $10,000 per violation or actual damages, whichever is greater.
If OAG-DC or a private plaintiff brings an action, a covered entity is also exposed to punitive damages, attorney’s fees and costs, and any other relief a court deems appropriate.
The text of the bill can be found here.