This blog post is a bonus supplement to our Artificial Intelligence and Biometric Privacy Quarterly Review Newsletter. Be on the lookout for our Q3 Newsletter!

We are quickly approaching the Jan. 1, 2023 operative date of most of the provisions of the California Privacy Rights Act (“CPRA”), which, as most of us know by now, substantially amends the CCPA. Under the CPRA, the California Privacy Protection Agency (“CPPA” or “Agency”) has a mandate to issue regulations on a number of specific topics. With just under three months to go until January 1, regulations are not even close to being finalized. The Agency released the first draft of proposed regulations on May 24, and the first public comment period ended on August 23. In a meeting held by the CPPA on Friday, September 23, the Agency gave no concrete sense of timing and offered no comments on topics, such as those discussed in this post, for which draft regulations have not yet been issued. This has left many businesses in the lurch, uncertain of what to do.

This feeling of uncertainty and lack of direction is particularly acute with respect to automated decision-making and profiling, topics on which we do not yet have draft regulations (the same is true of cybersecurity). Given that the Agency’s mandate as to automated decision-making technology and profiling is akin to a blank check, the regulations that the Agency eventually promulgates on these topics will no doubt have broad and sweeping consequences and require significant additional compliance and operational efforts for most businesses. The mandate, which we discuss in further detail below, is as follows:

“Issuing regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in such decisionmaking [sic] processes, as well as a description of the likely outcome of the process with respect to the consumer.”

Regardless of where the Agency ends up on the topic, whether in alignment with the EU General Data Protection Regulation’s (“GDPR”) strict regime or the more lax frameworks in, for example, Virginia, your compliance program will have to, at a minimum, address the following (a minimal data-capture sketch follows the list):

  1. Is profiling involved?
  2. Is automated decision-making involved?
    1. Solely automated? Or
    2. With human involvement?
  3. Did it subject the consumer to “legal or similarly significant effects”?
  4. What is the logic (e.g., the algorithm) involved in the decision-making process?
  5. How can the logic be described in simple terms to consumers?
  6. What is the outcome of the decision-making process with respect to consumers?
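For teams that track processing activities in a structured way, these six questions translate naturally into an inventory record. Below is a minimal sketch in Python, offered purely as an illustration: every class and field name is our own invention, not a term drawn from the CPRA, the draft regulations, or any other law.

```python
# Illustrative only: a minimal intake record for an ADM/profiling inventory,
# mirroring the six questions above. All names here are our own invention,
# not terms drawn from the CPRA, the draft regulations, or any other law.
from dataclasses import dataclass

@dataclass
class ADMInventoryRecord:
    activity_name: str                  # e.g., "resume screening tool"
    involves_profiling: bool            # Q1: is profiling involved?
    involves_adm: bool                  # Q2: is automated decision-making involved?
    solely_automated: bool              # Q2a/2b: solely automated vs. human involvement
    legal_or_significant_effects: bool  # Q3: "legal or similarly significant effects"?
    logic_description: str = ""         # Q4: the logic (e.g., the algorithm) involved
    plain_language_logic: str = ""      # Q5: consumer-friendly description of that logic
    likely_outcome: str = ""            # Q6: likely outcome of the process for consumers

# Example: a hypothetical, solely automated resume-screening activity.
screening = ADMInventoryRecord(
    activity_name="resume screening tool",
    involves_profiling=True,
    involves_adm=True,
    solely_automated=True,
    legal_or_significant_effects=True,  # hiring decisions are a classic example
    logic_description="model scores applications against posted job criteria",
    plain_language_logic="software compares your application to the job requirements",
    likely_outcome="application advanced to a recruiter or automatically declined",
)
```

However you record it, the aim is to have answers to all six questions on hand for each processing activity so that you can move quickly once the regulations issue.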

Fortunately, the Agency has gathered public input on these issues in informal rulemaking and given us some clues as to its considerations with respect to these topics, which will certainly inform where it lands in the regulations. In addition, these concepts show up in the GDPR, as well as in some of the forthcoming 2023 state privacy laws in Virginia, Colorado, Connecticut, and Utah. Based on these, below we set out a roadmap for what to collect, at a minimum, in order to identify your business’s ADM and profiling processes, and to be prepared for wherever the Agency lands in the regulations on ADM and profiling to address consumer rights, data protection impact assessments, and any potential restrictions (e.g., in the event of a GDPR-esque approach).

Level-Setting: Profiling v. ADM

As an initial matter, it is important to understand how profiling and ADM differ from one another.

In general, “profiling” is defined similarly across the laws to involve:

  • Automated processing
  • Of personal data/personal information
  • Evaluating/analyzing/predicting personal aspects
    • Performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.
  • Does NOT necessarily involve taking action

In other words, profiling effectively means gathering information about an individual (or group of individuals) and evaluating their characteristics or behavior patterns in order to place them into a certain category or group, in particular to analyze and/or make predictions about, for example, their ability to perform a task, interests, or likely behavior.

“Automated Decision-Making,” on the other hand, involves taking action. As we’ll see a bit later, it is important to understand whether automated decision-making is solely automated or conducted with human involvement, as certain laws impose heightened compliance obligations if the decision-making is solely automated.

CPPA Rulemaking Thus Far

Under the statutory mandate stated above, the CPPA must issue regulations regarding:

  1. A definition of “automated decision-making technology”
  2. Opt-out rights for “automated decision-making technology, including profiling”
  3. Access rights for ADM and profiling, including
    1. Provision of information regarding the logic involved in such “decision-making processes” in response to access requests
    2. Description of the likely outcome “of the process” for the consumer in response to access requests

Of all the concepts and terms in the statutory mandate, the CPRA defines only “profiling” (and even that definition cross-references the rulemaking mandate, giving the Agency leeway to alter it). Under the CCPA, as amended by the CPRA:

“Profiling” means any form of automated processing of personal information, as further defined by regulations pursuant to paragraph (16) of subdivision (a) of Section 1798.185, to evaluate certain personal aspects relating to a natural person, and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.

In light of the dearth of statutory text and the broad grant of authority to the Agency, the CPPA has dedicated a great deal of initial rulemaking attention to these concepts, with industry and consumer rights groups providing extensive feedback on the various issues involved. This focus began in the fall of 2021, when profiling and automated decision-making were among the nine topics on which the Agency sought public comment. In late March, the CPPA hosted informational sessions, devoting the better part of a full day to automated decision-making, including cross-jurisdictional approaches to automated decision-making and profiling under the GDPR. Based on the CPPA’s preliminary rulemaking activities to date, the Agency appears to be focusing a significant amount of attention on how ADM and profiling are treated under the GDPR, while also likely considering input from a number of commenters that consistency with the other U.S. state law frameworks is the right approach.

To issue effective regulations satisfying its entire regulatory mandate, the CPPA will likely have to address a number of issues and definitions in its regulations, including the following:

  1. “Automated Decision-Making Technology”

The CPPA is tasked with defining “automated decision-making technology,” as that term is not defined in the statute. While none of the other regulatory schemes specifically defines “automated decision-making technology,” an examination of how they regulate automated decisions or automated decision-making, along with profiling, is helpful to understand the comparative landscape, and perhaps will inform how the CPPA defines the term.

  2. Automated Decision-Making and Profiling Opt-Outs

Immediately below is a table that includes the regulatory and statutory language of the GDPR and the state laws, particularly as to restrictions and opt-out rights for ADM and profiling, and then breaks down the restrictions (in the case of the GDPR) and opt-out rights (in the case of all) that apply to automated decision-making and profiling. Notably, we refer to “human involvement” in the table; if you search for “human involvement” in the GDPR or any of these other laws, you will not find it, absent a passing reference in one of the GDPR’s recitals. The GDPR does, however, have the concept of solely automated decision-making, and drawing a distinction between that concept and ADM with human involvement will be helpful once we know where the CPRA regs land on these issues. If the CPRA regulations do end up introducing the concept of solely automated decision-making and corresponding rights and obligations, you will have to apply those accordingly.

Table 1. Profiling and Automated Decision-Making (“ADM”): Restrictions and Opt-Out Rights under the General Data Protection Regulation (“GDPR”), California Privacy Rights Act (“CPRA”), Virginia Consumer Data Protection Act (“VCDPA”), Colorado Privacy Act (“CPA”), and Connecticut Act Concerning Personal Data Privacy and Online Monitoring (referred to as the “CTPA” herein). Note: neither the California Consumer Privacy Act (“CCPA”) (pre-CPRA amendments) nor the Utah Consumer Privacy Act (“UCPA”) includes profiling or automated decision-making concepts.

| | GDPR | CPRA | VCDPA | CPA | CTPA |
| --- | --- | --- | --- | --- | --- |
| Statutory/regulatory language | Restriction (subj. to numerous exceptions): The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. Art. 22(1). Separately: The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on [consent or the legitimate interests of the controller] including profiling based on those provisions. Art. 21(1). | “Opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling” [to be further defined in regulations]. | Opt-out right: “To opt out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” | Opt-out right: “To opt out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” | Opt-out right: “A consumer shall have the right to . . . opt out of the processing of the personal data for purposes of . . . profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer.” |
| Profiling generally | Opt-out right (if based on legitimate interest or consent) | TBD by Regs. | No opt-out right absent decisions that produce legal or similarly significant effects | No opt-out right absent decisions that produce legal or similarly significant effects | No opt-out right absent decisions that produce legal or similarly significant effects |
| ADM with human involvement (including profiling) | Opt-out right for profiling under the legal bases of public interest or legitimate interest, and profiling for direct marketing purposes | TBD by Regs. | Opt-out right for profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer | Opt-out right for profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer | No. See below. |
| Solely ADM (including profiling) | Prohibited if it results in legal or similarly significant effects (subj. to exceptions, including opt-in consent). | TBD by Regs. | Opt-out right for profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer | Opt-out right for profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer | No ADM opt-out as such, but the CTPA provides the right to opt out of “profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer.” Note: this is distinct from the VCDPA and CPA. |
| Solely ADM (no profiling) | Prohibited if it results in legal or similarly significant effects (subj. to exceptions, including opt-in consent) | TBD by Regs. | No opt-out right if profiling not involved | No opt-out right if profiling not involved | No. See above. |


Based on the above table, the key issues are as follows:

  1. Is profiling implicated? A GDPR-like approach would include an opt-out for profiling alone, without regard to legal or similarly significant effects, at least under certain circumstances; a Virginia- or Colorado-like approach would require this question to be considered as well.
  2. Is automated decision-making implicated? A GDPR-like approach would necessitate further analysis as to whether the decision-making is solely automated or includes human involvement. The approach outlined in the Colorado Privacy Act Regulations similarly requires companies to permit individuals to opt out of profiling if it is based solely on automated processing.
    1. Is it solely automated?
    2. Or does it include human involvement?
  3. Finally, did the decisions produce legal or similarly significant effects? This is a key aspect of both a GDPR-inspired construct and the Virginia/Colorado approach, where the presence of legal or similarly significant effects will have a bearing on whether the processing can occur at all (as in the case of the GDPR), whether an opt-out right is implicated (as in the case of Virginia and Colorado), and whether heightened compliance obligations apply (as in the case of Colorado).

Notably, Connecticut is similar to Virginia and Colorado, but its opt-out is limited to solely automated decision-making that results in legal or similarly significant effects. The sketch below illustrates how these distinctions play out.
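The following rough sketch (Python, illustrative only) encodes the state-law rows of Table 1 as a simple lookup. It deliberately simplifies: it omits the GDPR (whose Article 22 restriction is a prohibition rather than an opt-out), ignores every statutory exception and exemption, and treats the CPRA as undetermined pending regulations.

```python
# Illustrative simplification of Table 1: which state frameworks offer an
# opt-out given the three key questions. Omits the GDPR and all exceptions.
def state_opt_out_rights(profiling: bool,
                         solely_automated: bool,
                         significant_effects: bool) -> list[str]:
    rights = []
    # VCDPA and CPA: opt out of profiling in furtherance of decisions that
    # produce legal or similarly significant effects (whether or not a human
    # is involved in the decision).
    if profiling and significant_effects:
        rights += ["VCDPA", "CPA"]
    # CTPA: narrower -- the decision must also be solely automated.
    if profiling and significant_effects and solely_automated:
        rights.append("CTPA")
    # CPRA: scope to be determined by the forthcoming regulations.
    rights.append("CPRA (TBD by Regs)")
    return rights

# Example: solely automated profiling with significant effects implicates all three.
print(state_opt_out_rights(profiling=True, solely_automated=True,
                           significant_effects=True))
# -> ['VCDPA', 'CPA', 'CTPA', 'CPRA (TBD by Regs)']
```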


  3. Legal or Similarly Significant Effects

With regard to what constitutes “legal or similarly significant effects,” GDPR guidance and the definitions provided under the Virginia and Colorado laws provide some insight. The European Data Protection Board (“EDPB”) states that in order for the outcome of an automated decision to amount to a “legal effect,” the decision must “affect[] someone’s legal rights, such as the freedom to associate with others, vote in an election, or take legal action. A legal effect may also be something that affects a person’s legal status or their rights under a contract.” In the same vein, an automated decision would amount to “similarly significant effects” if it is “sufficiently great or important to be worthy of attention. In other words, the decision must have the potential to: significantly affect the circumstances, behavior or choices of the individuals concerned; have a prolonged or permanent impact on the data subject; or at its most extreme, lead to the exclusion or discrimination of individuals.”

The EDPB provides a number of examples in its guidance. In addition, in May 2022, the Future of Privacy Forum (“FPF”) released a comprehensive report on automated decision-making cases from EU courts and data protection authorities. The report compiles and summarizes decisions on key ADM issues, such as what constitutes a legal or similarly significant effect and what qualifies as automated decision-making. The report’s timing is fortunate in light of the CPPA’s rulemaking activities, and the FPF may well have intended it to inform the CPPA’s deliberations; the FPF is a reputable and renowned organization, and it would make sense for the Agency to consult such thought leadership as it considers regulations.

Under the EDPB guidance, several relatively routine business activities are considered to produce “legal or similarly significant effects,” particularly processing activities involving employees and applicants, including automatic refusal of job applications or automated decisions about workers in relation to performance reviews. Certain online behavioral advertising use cases may also have legal or similarly significant effects. In particular, the EDPB advises that factors such as the intrusiveness of the profiling, the expectations of the individual, the manner of delivery of the advertisement, and the use of knowledge of the individual’s vulnerabilities may result in the profiling activity having legal or similarly significant effects on an individual (for example, targeting a person known to be experiencing financial difficulties with ads for high-interest loans). This is of note because, depending on the outcome of the CPPA’s rulemaking activities, consumers may have additional rights related to digital advertising on top of the rights to opt out of sale and sharing.

As to Virginia and Colorado, the opt-out right is limited to “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” As defined by these laws, such profiling involves decisions that result in the provision or denial of financial/lending services, housing, insurance, education or educational opportunities, criminal justice, employment, health-care services, or access to essential goods or services. However, given that government agencies and GLBA-regulated entities such as financial institutions and insurance companies are not subject to these laws, and that employee and applicant data are excluded, these profiling opt-outs are seemingly quite limited.

Table 2. Profiling and ADM: Legal and Similarly Significant Effects.

| GDPR | CPRA | VCDPA | CPA |
| --- | --- | --- | --- |
| Examples: automatically refusing job applications; algorithmically ranking workers; online behavioral advertising (under certain circumstances); cancellation of a contract (e.g., Terms of Service); certain applications of facial recognition technologies; credit scoring practices resulting in rejection of financial services; housing and benefits decisions. | TBD by Regs. | “Decisions that produce legal or similarly significant effects concerning a consumer” means a decision made by the controller that results in the provision or denial of financial and lending services, housing, insurance, education enrollment, criminal justice, employment opportunities, health care services, or access to basic necessities, such as food and water. | “Decisions that produce legal or similarly significant effects concerning a consumer” means a decision made by the controller that results in the provision or denial by the controller of financial and lending services, housing, insurance, education enrollment, criminal justice, employment opportunities, health care services, or access to essential goods or services. |


  4. Access Rights

In addition to the three key issues above (profiling; ADM, whether solely automated or with human involvement; and legal or similarly significant effects), the other areas where we know we are definitely getting regs relate to access rights. In particular, the statute specifically mandates the Agency to issue regulations “requiring businesses’ response to access requests to include meaningful information about the logic involved in such decision-making processes, as well as a description of the likely outcome of the process with respect to the consumer.”[1]

The first draft of the CPPA regulations includes detailed requirements with respect to other CCPA / CPRA rights (like the rights to know, access, correct, delete, and opt out of sales or sharing). These requirements will apply to ADM and profiling, in addition to any restrictions specific to those technologies. Assuming the CPPA regulations regarding ADM are similarly detailed, businesses should expect the Agency to define specific expectations with respect to the mechanics by which a business must respond to requests involving ADM and/or profiling. Such requirements should be built into the business’s process for handling consumer rights. As the draft regulations make clear, “[a] business that has failed to put in place adequate processes and procedures to comply with consumer requests in accordance with the CCPA and these regulations cannot claim that responding to a consumer’s request requires disproportionate effort.”[2]
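For planning purposes, a business could stub out the ADM-specific portion of an access-request response now, before the regulations fix the exact format. The sketch below is entirely our guess at what “meaningful information about the logic” and “likely outcome” disclosures might look like in practice; none of the field names come from the statute or draft regulations.

```python
# Hypothetical ADM portion of an access-request response. Field names and
# structure are our own; the statute requires "meaningful information about
# the logic involved" and "a description of the likely outcome of the
# process," but the regulations will dictate the actual format.
adm_access_response = {
    "automated_decision_making_used": True,
    "meaningful_logic": (
        "We score applications using a statistical model trained on past "
        "hiring outcomes; the score reflects how closely an application "
        "matches the posted job requirements."
    ),
    "likely_outcome": (
        "Applications scoring above a threshold are routed to a recruiter; "
        "those below the threshold receive an automated decline notice."
    ),
}
```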

Table 3. Profiling and ADM: Notice/Transparency, Access Rights.

| | GDPR | CPRA | VCDPA | CPA |
| --- | --- | --- | --- | --- |
| Access to meaningful logic | Disclose in privacy policy and in responses to access requests. | Disclose in responses to access requests (subject to requirements set forth by Regs). | No. | Disclose in privacy policy and if denying a request to opt out of profiling which does not produce legal or similarly significant effects. |
| Description of the likely outcome of the process with respect to the consumer | Disclose in privacy policy and in responses to access requests. | Disclose in responses to access requests (subject to requirements set forth by Regs). | No. | Disclose in privacy policy and if denying a request to opt out of profiling which does not produce legal or similarly significant effects. |


  5. Notice at Collection

Under GDPR Articles 13 and 14, data subjects are entitled to information regarding “the existence of” qualifying ADM, including profiling, “and, at least in those cases, meaningful information about the logic involved, as well as the significance and envisaged consequences of such processing for the data subject.” Note that these disclosure requirements only apply to qualifying ADM (decisions without human involvement producing legal or similarly significant effects on an individual). If the CPRA regulations take a similar approach, only qualifying ADM producing consequential impacts on an individual would require enhanced disclosures. Consumers are entitled to similar disclosures under the Colorado Privacy Act Regulations, as well as: (1) what decision was subject to profiling; (2) the categories of personal data used in the profiling; (3) why profiling is relevant to the decision made; (4) whether the profiling was used to serve ads regarding housing, employment, or financial or lending services; (5) whether the profiling systems have been evaluated for accuracy, fairness, or bias, including the impact of the use of sensitive data; and (6) information regarding the right to opt out.[3] A hypothetical checklist follows.
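One way to operationalize the six Colorado items is as a simple disclosure checklist per qualifying profiling activity. The sketch below is hypothetical; the keys are our own labels and the values are invented examples, not language from the CPA Rules or the CPRA draft regulations.

```python
# Hypothetical checklist keyed to the six Colorado-style disclosure items
# above (CPA Rule 9.03(A)). Labels and example values are our own.
colorado_style_profiling_notice = {
    "decision_subject_to_profiling": "whether to extend a loan offer",         # item (1)
    "personal_data_categories_used": ["payment history", "income range"],      # item (2)
    "relevance_of_profiling_to_decision": "scores predict repayment ability",  # item (3)
    "used_for_housing_employment_or_lending_ads": False,                       # item (4)
    "evaluated_for_accuracy_fairness_bias": True,                              # item (5)
    "opt_out_rights_information": "see the 'Your Privacy Choices' page",       # item (6)
}

# A notice is incomplete if any disclosure is missing or left blank.
missing = [k for k, v in colorado_style_profiling_notice.items() if v in ("", None)]
assert not missing, f"Notice is missing required disclosures: {missing}"
```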

While we do not yet have any regs on ADM and profiling, the CPRA draft regulations broadly state that “[t]he purpose of the notice at collection is to provide consumers with timely notice…so that consumers can exercise meaningful control over the business’s use of their personal information….For example, upon receiving the notice at collection, the consumer have [sic] all the information necessary to choose whether or not to engage with the business.” As a result, it is conceivable that the CPPA could issue specific regulations touching on profiling or ADM, or could expect that ADM and/or profiling activities be meaningfully disclosed in a business’s notice at collection.


  6. Data Protection Impact Assessments (“DPIAs”)

The CPRA requires the Agency to “[i]ssu[e] regulations requiring businesses whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security, to” perform cybersecurity audits and submit risk assessments to the Agency.[4] Similarly, GDPR Article 35 requires companies to conduct DPIAs for “high risk” processing activities and identifies a number of activities that are presumptively high risk. Enumerated in the list of presumptively high-risk activities is “a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person[.]” Additionally, many supervisory authorities maintain lists of activities presumed to be “high risk.”[5]

The Colorado Privacy Act Regulations also require data protection assessments for automated profiling resulting in legal or similarly significant effects. In addition to the data protection assessment contents outlined in the statute, the regulations require consideration of factors such as the personal data processed, the decisions that will be made regarding consumers, an explanation of the training data and logic used to create the profiling system, a plain-language description of the outputs of the profiling process, and safeguards for data sets produced by or derived from the profiling.

Presumably, processing activities involving qualifying ADM (and potentially certain profiling activities) will require a risk assessment and audit under the CPRA Regulations. As indicated in the statutory text, the risk assessments must be submitted to the CPPA “on a regular basis[.]” The sketch following Table 4 shows how these triggers might map onto an ADM inventory.

Table 4. Profiling and ADM: DPIAs.

| | GDPR | CPRA | VCDPA | CPA |
| --- | --- | --- | --- | --- |
| Data Protection Impact Assessment (“DPIA”) required? | Yes, generally for high-risk processing. | Yes, generally for high-risk processing. | Yes, for profiling that presents a risk of substantial injury to consumers. | Yes, for profiling that presents the risk of substantial injury to consumers and processing producing legal or similarly significant effects. |
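As a rough operational check, a business could map its ADM inventory against Table 4 along the following lines. This sketch is our own simplification, assuming an inventory record like the one sketched earlier; it is not a statement of what the final CPRA regulations will require.

```python
# Illustrative DPIA/risk-assessment trigger check, simplified from Table 4.
# "record" is assumed to carry the inventory fields sketched earlier.
from types import SimpleNamespace

def dpia_likely_required(record) -> dict[str, bool]:
    high_risk = record.involves_profiling and record.legal_or_significant_effects
    return {
        # GDPR Art. 35: systematic, extensive automated evaluation producing
        # legal or similarly significant effects is presumptively high risk.
        "GDPR": record.solely_automated and high_risk,
        # CPRA: "significant risk" to be fleshed out by regulations; we assume
        # high-risk profiling qualifies.
        "CPRA (assumed)": high_risk,
        # VCDPA/CPA: assessments for profiling presenting a risk of substantial
        # injury (approximated here by the same high-risk test).
        "VCDPA": high_risk,
        "CPA": high_risk,
    }

record = SimpleNamespace(involves_profiling=True, solely_automated=True,
                         legal_or_significant_effects=True)
print(dpia_likely_required(record))
# -> {'GDPR': True, 'CPRA (assumed)': True, 'VCDPA': True, 'CPA': True}
```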


  7. ADM Involving Sensitive Data

Under the GDPR, controllers may not use qualifying ADM to process EU special categories of personal data,[6] except where the data subject has consented, or where “processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law[.]”[7] Under the Colorado Privacy Act Regulations, controllers that use sensitive personal data for profiling producing legal or similarly significant effects must include a description of the impact of the use of such data in privacy notices. Additionally, data protection assessments must include the data elements to be considered in the profiling (including sensitive personal data), and such data must be described when requesting consent from consumers or denying requests to opt out of profiling which does not produce legal or similarly significant effects.

The CPRA introduced a new category of “sensitive personal information,” defined as:

[P]ersonal information that reveals (A) a consumer’s social security, driver’s license, state identification card, or passport number; (B) a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account; (C) a consumer’s precise geolocation; (D) a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership; (E) the contents of a consumer’s mail, email and text messages, unless the business is the intended recipient of the communication; (F) a consumer’s genetic data; and (2)(A) the processing of biometric information for the purpose of uniquely identifying a consumer; (B) personal information collected and analyzed concerning a consumer’s health; or (C) personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

Like the GDPR, the Agency may decide to more strictly regulate (or outright prohibit) qualifying ADM involving sensitive personal information. At the least, such processing will most likely be subject to the CPRA’s audit and assessment requirements.


  8. What else might we see in the ADM and Profiling Regs?

Will there be a separate, standalone profiling opt-out? Arguably not: the mandate provides only for opt-out rights for “automated decision-making technology, including profiling,” and as such does not seem to contemplate profiling separate and apart from ADM.

Will the CPPA go beyond an ADM opt-out as mandated, by, for example, prohibiting solely ADM (with certain exceptions, such as consent) like the GDPR? Doing so would seem to go beyond its mandate and regulatory authority.

Will the CPPA take a risk-based approach, as is the case under AI-specific legislation (such as the EU’s proposed AI Act)? As we discussed in a blog post on public comments regarding this topic, many commenters urged the Agency to do so, including by regulating only high-risk activities. That would almost seem necessary: many ADM processes are routine and innocuous, and requiring businesses to operationalize opt-out rights and prescriptive access rights for non-high-risk activities would be overly burdensome and would not further the purposes of the law.

Final decisions only? Notably, a number of submissions to the CPPA in the fall of 2021, in response to its solicitation for comments, urged the Agency to regulate only final decisions; most who put forth this argument further urged that regulation be limited to final decisions made without human involvement that produce legal or similarly significant effects. Many ADM processes involve a number of intermediate decisions before a final decision is made.

Given the broad mandate to issue regulations furthering the purpose of the CPRA, the CPPA could conceivably decide to wade into additional issues regarding ADM or profiling.


Takeaways

At the very least, we know the half-dozen areas that your data compliance program should begin to address with respect to automated decision-making and profiling. Taking clues from the Agency’s seemingly comparative approach, and using what we know from other frameworks such as the GDPR and the other state laws, we have proposed a framework that addresses what is likely to come out of the rulemaking mandate, much of which you should already have on hand for GDPR compliance (if applicable) or can begin to compile for compliance with the other 2023 state laws.

More broadly, AI and algorithmic decision-making are and will continue to be a focus of regulators and legislators in the US and abroad. Leading and progressive companies are already taking steps to ensure responsible and ethical use of AI, including by building out AI governance programs alongside their privacy compliance and data governance programs, to get ahead of future legislation and to address the already real risk associated with AI and algorithm-based systems.

We have experience in, and are actively working with a number of clients in, building out algorithmic and AI governance programs alongside their privacy and data governance programs. If you have any questions on these topics or would like to further discuss, please reach out to one of the authors or your SPB relationship attorney.

[1] Cal. Civ. Code § 1798.185(a)(16).

[2] Draft CPRA Regulations § 7001(h).

[3] CPA Rule 9.03(A).

[4] Cal. Civ. Code § 1798.185(a)(15)(A) and (B).

[5] See, e.g., Irish Data Protection Commission, List of Types of Data Processing Operations which Require a Data Protection Impact Assessment, available at https://www.dataprotection.ie/sites/default/files/uploads/2018-11/Data-Protection-Impact-Assessment.pdf.

[6] GDPR Article 9 lists several items of personal data viewed as particularly sensitive under the GDPR, which include “personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation[.]”

[7] GDPR Article 22(4).