Keypoint: The California Privacy Protection Agency continued its rulemaking efforts by releasing draft automated decisionmaking technology regulations although the Agency has yet to initiate the formal rulemaking process.
On November 27, 2023, the California Privacy Protection Agency (Agency) published draft automated decisionmaking technology regulations as well as revised draft risk assessment regulations. The draft regulations were released in connection with the Agency’s December 8 board meeting. Importantly, the draft regulations are only intended to facilitate Board discussion and public participation. The Agency has not yet started formal rulemaking.
This article focuses on how the two draft regulations address automated decisionmaking technology (ADMT). The risk assessment regulations contain additional provisions that are not addressed herein. In addition, given that these are only draft regulations, this article only provides a high-level summary and some takeaways. It does not provide an exhaustive analysis of the draft regulations.
- This rulemaking activity would position the Agency as a (if not “the”) primary regulator in the United States for businesses’ use of artificial intelligence for in-scope processing activities. In the absence of federal preemptive lawmaking, states continue to regulate in emerging areas. That said, these regulations are likely only the start, as a number of states (including California) are expected to introduce AI legislation when legislatures convene next year.
- Although there is some conceptual overlap with the Colorado Privacy Act’s Rules on the right to opt out of profiling and data protection impact assessments, the draft California regulations are much broader. For example, the California regulations apply to automated decisionmaking technology, which includes (but is not limited to) profiling.
- The draft regulations apply to the automated processing of certain types of employee information, such as data collected through keystroke loggers, productivity or attention monitors, and web-browsing monitoring tools. Businesses with California employees would need to take a close look at their employee data collection activities to ensure compliance.
- The draft regulations also apply to the automated processing of certain types of information collected at publicly accessible locations, such as through wi-fi and Bluetooth tracking and facial recognition/detection. Depending on the eventual scope of the regulations, a number of advertising and marketing practices may fall into scope.
- We are still a long way away from final regulations. These are draft regulations that still need Board input. Once final proposed regulations are ready, the Agency will need Board approval to initiate formal rulemaking, which will allow for further public comment and proposed changes.
Between the two sets of regulations, the Agency would regulate certain uses of ADMT by requiring businesses to provide consumers with (1) a notice of the business’s use of ADMT; (2) a right to opt out of certain uses of ADMT; and (3) a right to access certain information about the business’s use of ADMT (as well as an affirmative obligation for businesses to provide a notice to consumers of an adverse action using ADMT under certain circumstances). Businesses also would need to conduct risk assessments for certain types of ADMT uses.
The draft regulations define ADMT broadly to include “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.” In particular, the phrase “facilitate human decisionmaking” suggests the regulations will apply broadly to many different types of activities.
ADMT also is defined to include “profiling,” which, in turn, is defined as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”
The definition of profiling is similar, but not identical, to the definitions found in consumer privacy laws in states such as Colorado and Connecticut. For example, Colorado and Connecticut do not include “performance at work,” which makes sense given that those laws do not apply to employee data.
“Artificial intelligence” is defined as “an engineered or machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions that influence physical or virtual environments. Artificial intelligence includes generative models, such as large language models, that can learn from inputs and create new outputs, such as text, images, audio, or video; and facial- or speech-recognition or -detection technology.”
Right to Opt Out
The draft regulations allow consumers to opt out of a business’s use of ADMT in three situations: (1) for a decision that produces legal or similarly significant effects concerning the consumer; (2) profiling of a consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student; and (3) profiling a consumer while they are in a publicly accessible place.
The first situation is arguably conceptually similar to the right to opt out of profiling found in consumer privacy laws in states such as Colorado and Connecticut. However, as discussed, ADMT is defined more broadly than profiling is defined in those state laws.
Allowing employees to opt out of their employers’ use of ADMT for profiling is notable. The regulations define “profiling” to include, among other things, a “natural person’s performance at work.” The regulations specifically envision that the right would extend to things like keystroke loggers, productivity or attention monitors, and web-browsing and social-media monitoring tools.
Allowing consumers to opt out of profiling in public places is also notable. The regulations define a publicly accessible place as “a place that is open to or serves the public,” such as malls, stores, movie theatres, hospitals, streets and parks. For example, California residents would have the right to opt out of a business’s use of automated processing of their personal information to analyze or predict their “personal preferences” while at a store. This includes using wi-fi or Bluetooth tracking, facial recognition or detection, geofencing and automated emotion assessment, among others.
In addition, the draft regulations identify three other situations for Board consideration: (1) profiling of a consumer for behavioral advertising; (2) profiling a consumer that the business has knowledge is under 16 years of age; and (3) processing personal information to train automated decisionmaking technology.
Businesses would need to provide two opt out methods. Businesses could deny an opt out request if they have a good faith belief that it is fraudulent. In addition, and subject to certain qualifications, businesses could deny an opt out request if they have complied with regulation 7002 and their use of ADMT is for a specific purpose, such as to prevent or detect security incidents, resist fraudulent actions, or provide goods or services the consumer requested (under limited circumstances).
Businesses that must provide an opt out also must provide consumers with a “pre-use notice” that advises them of their rights to opt out and access information about the business’s use of ADMT. The pre-use notice must be provided before the business processes the consumer’s personal information using the automated decisionmaking technology.
The pre-use notice must include information such as a plain language explanation of the purpose for which the business proposes to use ADMT and a description of the rights to opt out and access. The notice also must direct consumers to a larger disclosure that includes additional information such as the logic used in the ADMT, the intended output and whether the business’s use of ADMT has been evaluated for “validity, reliability, and fairness, and the outcome of any such evaluation.”
Right to Access
Businesses that must provide an opt out also must provide consumers with a right to access information about the business’s use of ADMT. In responding to the request to access and subject to certain exemptions, a business must provide the consumer with (among other things): (1) the purpose for which the business used ADMT; (2) the output of the ADMT with respect to the consumer; (3) how the business used (or plans to use) the output to make a decision with respect to the consumer; (4) how the automated decisionmaking technology worked with respect to the consumer; (5) a method by which the consumer can obtain the range of possible outputs; and (6) instructions for how the consumer can exercise their other CCPA rights.
Notably, if a business makes a decision that results in the denial of certain goods or services that produces legal or similarly significant effects concerning the consumer (e.g., denial of an employment opportunity), the business must notify the consumer of the decision, including that the consumer has a right to access information regarding the business’s use of ADMT (and how to exercise that right) and how to complain to the Agency or Attorney General.
Risk Assessments
Businesses that use ADMT for any of the following four purposes also will need to conduct a risk assessment:
- For a decision that produces legal or similarly significant effects concerning a consumer;
- Profiling a consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student;
- Profiling a consumer while they are in a publicly accessible place; and
- Profiling for behavioral advertising.
The draft regulation identifies another option for the Board to discuss including in the regulation, which is the processing of consumer personal information to train ADMT or artificial intelligence for (1) the above four purposes; (2) establishing individual identity on the basis of biometric information; (3) facial, speech, or emotion detection; (4) the generation of deep fakes; or (5) the operation of generative models, such as large language models.
Risk assessments would need to follow extensive and detailed regulatory requirements and be completed before the business engages in the processing activity. Businesses would be required to submit a certificate of compliance and “risk assessment in abridged form” to the Agency within a to-be-defined time after the regulations go into effect and annually thereafter. The Agency also could request full risk assessments, which would need to be provided within five business days of the request.