On September 30, 2022, the Colorado Attorney General’s office (AG) published proposed rules (Rules) for the Colorado Privacy Act (CPA). These draft Rules follow this summer’s California Privacy Rights Act (CPRA) draft regulations. This post highlights several notable differences between the draft CPA and CPRA regulations, including technical details about global privacy controls, new consent requirements, and new sensitive data limitations.

Timing

The Rules have been made available for public comment, and the AG will hold a series of stakeholder meetings on November 10, 15, and 17, 2022, followed by a public hearing on February 1, 2023. The CPA goes into effect on July 1, 2023.

Definitions

The Rules clarify the definition of “Publicly Available Information” by specifically including in the definition “[i]nformation that a controller has a reasonable basis to believe the consumer has made lawfully available to the general public.” This is defined to include (1) information known to be available to the general public; (2) information that a consumer has intentionally made available to the general public; and (3) information that a consumer has made available under federal or state law.

The Rules also specifically exclude the following from the definition of “Publicly Available Information”:

  • Any personal data obtained or processed that constitutes posting a private image for harassment or criminal invasion of privacy
  • Inferences made exclusively from multiple independent sources of publicly available information
  • Biometric Data
  • Genetic Information
  • Publicly Available Information that has been combined with non-publicly available Personal Data
  • Nonconsensual Intimate Images known to the controller

The Rules also distinguish three categories of automated processing. "Human Involved Automated Processing" involves meaningful human consideration of the available data, together with the authority to change or influence the outcome of the processing. "Human Reviewed Automated Processing" involves only a human review that "does not rise to the level required for Human Involved Automated Processing." The third category, "Solely Automated Processing," has no human review, oversight, involvement, or intervention. The right to opt out is directed at Solely Automated Processing and Human Reviewed Automated Processing; controllers may, but are not required to, act on requests to opt out of profiling based on Human Involved Automated Processing (provided the controller makes other required disclosures).

Universal Opt-Out Mechanism

The Rules provide further details on the validity, process, and technical specifications of global opt-out signals through the Universal Opt-Out Mechanism (UOOM). To be a valid UOOM, the tool must allow consumers to automatically communicate their opt-out choice via a simple and easy-to-use method that covers multiple controllers without requiring individualized requests. This can occur either by (a) sending an opt-out signal (e.g., an HTTP header field or JavaScript object), or (b) alternative means such as a "do not sell" list with an automated query tool for controllers.
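The header-based signal in (a) can be illustrated with the Global Privacy Control (GPC) proposal, which transmits the consumer's choice as the HTTP request header `Sec-GPC: 1` (and exposes it to scripts as `navigator.globalPrivacyControl`). As a minimal sketch only — the Rules do not mandate GPC or any particular signal — a controller's server might detect the header like this:

```python
# Sketch: detecting a header-based opt-out signal on the server side.
# Assumes the Global Privacy Control (GPC) proposal, which sends the
# request header "Sec-GPC: 1"; the Rules themselves name no specific signal.

def consumer_has_opted_out(headers: dict[str, str]) -> bool:
    """Return True if the request carries a recognized opt-out signal."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"

# Example: a request from a GPC-enabled browser
print(consumer_has_opted_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))  # prints: True
```

A controller honoring the signal would treat a `True` result as an opt-out of sale and targeted advertising for that consumer, subject to the Rules' authentication and scope provisions.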

The UOOM must also allow consumers to exercise and automatically communicate their opt-out rights either for all purposes (sale, targeted ads, or profiling), specific purposes, or both, and it may also include allowing a consumer to opt out of multiple controllers or domains. However, the UOOM must not unfairly disadvantage any controller (e.g., unequal controller treatment or engaging in self-dealing).

The Rules also clarify that a UOOM cannot be the default setting for pre-installed tools (e.g., a browser or operating system). A UOOM, and a consumer's choice to use it, will be effective only if the tool does not come pre-installed and is "marketed prominently as a privacy-protective tool" or as a specific user opt-out tool.

Additionally, the Colorado Department of Law (Law Department) will maintain a public list of recognized UOOMs, and the initial list will be released no later than April 1, 2024, just in time for the July 1, 2024 compliance deadline. In order to be recognized, a UOOM must (a) comply with the Rules’ technical specifications; (b) be an open, free, and adoptable system or standard; and (c) be free from confusion. The Law Department may consider additional factors, such as commercial adoption of the UOOM; ease of use; and whether the UOOM has been approved by a widely recognized, legitimate standards body.

Consent requirements

Controllers will have to obtain valid consumer consent prior to (a) processing sensitive data; (b) processing children's data; (c) processing data for targeted advertising or profiling after the consumer has opted out of such purposes; (d) processing for a secondary purpose; and (e) selling personal data. To be valid, the consent must (1) be obtained through the consumer's clear, affirmative action; (2) be freely given; (3) be specific; (4) be informed; and (5) reflect the consumer's unambiguous agreement. In no scenario can consent be obtained using dark patterns. Notably, consent, as set forth in the Rules, requires controllers to give consumers the ability to consent to some purposes while refusing consent for others. The Rules also state that consent must be distinct from other terms and conditions and should link directly to the applicable section of the controller's privacy policy, if technically feasible. Thus, blanket consent is not permitted; instead, consumers must be able to select the data uses they wish to permit.

The Rules provide that after a consumer has opted out and the controller wishes to proactively obtain consent to process personal data for the opted-out purpose, the controller must provide a link or similar mechanism on its website or application that enables the consumer to provide consent. The consent mechanism must have “a similar look, feel, and size relative to other links on the same web page or application, and not be presented through pop-up windows, pop-up banners, or other web interface displays that degrade or obstruct the consumer’s experience on the controller’s web page or application.” However, the controller may request consent if an opted-out consumer subsequently initiates a transaction or attempts to use a product or service inconsistent with the request to opt out (e.g., signing up for a Bona Fide Loyalty Program that also involves the sale of personal data).

The Rules also require that refusing or revoking consent be as easy as providing it. Moreover, controllers must refresh validly obtained consent "at regular intervals," with the interval based on the context and scope of the original consent, the sensitivity of the personal data collected, and the reasonable expectations of the consumer. However, controllers must obtain new consent if a purpose materially evolves into a secondary use, and they must refresh consent at least annually for sensitive data.

Privacy notice requirements

The notice requirements for controllers are mostly consistent with the CPRA and its proposed regulations. However, one notable new requirement under the Rules is the 15-day notice requirement: controllers must notify consumers at least 15 days before making any substantive or material changes to a privacy notice. Examples of substantive or material changes include changes to (1) the categories of personal data processed; (2) processing purposes; (3) a controller's identity; or (4) the methods by which consumers can exercise their data rights. Notably, a controller must obtain consent from a consumer for any secondary use of data even if the new purpose is disclosed in the updated privacy notice.

Secondary use

As indicated above, the Rules have more obligations associated with the secondary use of personal data. Consistent with other state privacy laws, the Rules clarify that controllers must obtain consumer consent before processing personal data for a secondary use, even if the new purpose is disclosed in the privacy notice. A secondary use is any purpose that is “not reasonably necessary to or compatible with the original specified purpose(s).” The Rules provide some considerations for determining this, including:

  • The reasonable expectation of an average consumer
  • The link between the original specified purpose(s) and the purpose(s) of further processing
  • The consumer-controller relationship
  • The type, nature, and amount of the personal data subject to the new purpose
  • The possible consequence or impact on the consumer of the new purpose
  • The identity of the entity conducting the new purpose(s) (e.g., the same or a different controller, an affiliate, a processor, or a third party)
  • The existence of additional safeguards, such as encryption or pseudonymization

Loyalty programs

The Rules also impose disclosure requirements on controllers offering Bona Fide Loyalty Program Benefits that are similar to the requirements under the CPRA's proposed regulations for Financial Incentive programs, but the two differ in what must be valued. The CPRA's proposed regulations focus on the value of the consumer's data that forms the basis of the Financial Incentive program. In comparison, the Rules focus on the value of the Bona Fide Loyalty Program Benefits (e.g., a superior price, rate, level, quality, or selection of goods/services offered via the Loyalty Program). Businesses should take note of this distinction as they update their privacy policies to comply with either or both sets of regulations.

Right to correction

The Rules have more specific compliance requirements than the CPRA for correcting a consumer’s personal data. The CPRA only requires businesses to use “commercially reasonable efforts” when receiving correction requests. Under the Rules, controllers must correct the consumer’s personal data “across all data flows and repositories” and “implement measures to ensure that the personal data remains corrected.” Alternatively, the Rules permit controllers to instruct the consumer to make the requested corrections through their account settings if it is possible and not unduly burdensome.

Data minimization

While data minimization has long been a principle in the United States, the Rules now operationalize it by requiring controllers to set specific time limits for the erasure of personal data or to conduct periodic reviews to confirm that the data is still needed. The Rules also state that Biometric Identifiers must be reviewed at least once a year to determine whether storage remains necessary.
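As an illustration only — the function and field names below are assumptions for the sketch, not terms from the Rules — the annual review requirement for Biometric Identifiers reduces to simple date arithmetic in a retention-review workflow:

```python
from datetime import date, timedelta

# "At least once a year" review cadence for stored Biometric Identifiers;
# reading the Rules' annual requirement as a 365-day interval is our assumption.
BIOMETRIC_REVIEW_INTERVAL = timedelta(days=365)

def biometric_review_due(last_review: date, today: date) -> bool:
    """Return True when a stored Biometric Identifier is due for its annual
    review of whether continued storage is still necessary."""
    return today - last_review >= BIOMETRIC_REVIEW_INTERVAL
```

A controller's retention tooling could run such a check on a schedule and flag records whose review is overdue for deletion or re-justification.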

Requirements specific to sensitive data

The Rules differ from the CPRA when it comes to processing sensitive data. Under the CPRA, businesses that use or disclose sensitive personal information must offer consumers a right to limit the business's use of such sensitive personal information (subject to certain exceptions) and specific methods for consumers to exercise that right. In contrast, the Rules require controllers to obtain consent to process sensitive data, which includes Sensitive Data Inferences ("inferences made by a controller ... which indicate an individual's racial or ethnic origin; religious beliefs; mental or physical health condition or diagnosis; sex life or sexual orientation; or citizenship or citizenship status"). However, controllers may process Sensitive Data Inferences from consumers over the age of 13 without consent in narrow circumstances, including where the inferences are deleted within 12 hours of collection. Lastly, as noted above, controllers must refresh consent for sensitive data at least annually.

Data Protection Impact Assessments

Lastly, the Rules provide more details about the CPA’s Data Protection Impact Assessment (DPIA) requirement. A DPIA is required when the controller’s processing presents a “heightened risk of harm to a consumer,” which includes (a) processing for targeted advertising or potentially harmful profiling; (b) selling personal data; and (c) processing sensitive data. The Rules provide that a DPIA must, “at a minimum,” describe 18 different enumerated topics. However, the Rules allow the use of a DPIA that complies with another similar law and is “reasonably similar in scope and effect” to one that would have been conducted under the Rules.