Key point: Although these are only draft regulations and not yet part of the formal rulemaking process, the drafts demonstrate the Agency’s intent to create extensive obligations for businesses subject to these regulations.
In connection with its September 8, 2023 Board meeting, the California Privacy Protection Agency (“Agency”) published draft regulations on risk assessments and cybersecurity audits. The drafts were provided as meeting materials for a CPRA rules subcommittee update.
The drafts specifically state that they are intended “to facilitate Board discussion and public participation” and are “subject to change.” To that end, the drafts identify specific text for the Board to discuss and, in some instances, identify multiple options for Board consideration. The drafts also note that the Agency “has not yet started the formal rulemaking process for cybersecurity audits, risk assessments, or automated decisionmaking technology.”
Although these are only drafts, they nonetheless provide initial insight into the Agency’s thinking on these new and significant rulemaking topics. In short, the drafts indicate the Agency’s intent to create extensive obligations for businesses subject to these regulations. Below, we provide a high-level summary and analysis of some of the more notable parts of the drafts.
Risk Assessments

The Agency’s risk assessment draft regulations are conceptually analogous to the data protection impact assessments required by the GDPR and the data protection assessments required by the Colorado Privacy Act and similar state laws, but with important differences. In practice, the draft regulations would create extensive compliance obligations across a broad array of processing activities.
Businesses would be required to conduct risk assessments if their processing of consumers’ personal information “presents significant risk to consumers’ privacy.” The draft regulations go on to identify seven instances in which a risk assessment would be required:
- Selling or sharing personal information.
- Processing sensitive personal information.
- Using automated decisionmaking technology in furtherance of certain consequential decisions, such as the provision or denial of financial or lending services.
- Processing personal information of consumers that the business has actual knowledge are under 16 years of age.
- Processing personal information of employees, independent contractors, job applicants, or students using technology that monitors them, such as keystroke loggers, location trackers, and facial or speech recognition or detection.
- Processing personal information of consumers in publicly accessible places using technology to monitor their behavior, location, movements, or actions.
- Processing personal information to train artificial intelligence or automated decisionmaking technology.
The drafts provide definitions of “artificial intelligence” and “automated decisionmaking technology.”
The scope of potential risk assessment triggers appears to encompass the scope found in the Colorado Privacy Act and similar state laws. However, the Agency adds additional categories, such as the processing of certain types of information in public places and from employees. The draft also provides six helpful examples of when risk assessments would be required, although it is important to note that in prior rulemaking activities the Agency often changed or eliminated such examples.
Similar to the Colorado Privacy Act Rules, the draft regulations create extensive requirements for completing risk assessments. The Agency identifies thirteen (or potentially fourteen depending on which option the Board chooses) topics that businesses would need to consider. Many of these topics include multiple sub-topics.
The draft regulations also create additional requirements for (1) businesses using automated decisionmaking technology for processing that will be subject to the automated decisionmaking technology access and opt-out rights and (2) businesses processing personal information to train artificial intelligence or automated decisionmaking technology that they have made, or are making, available to other persons for their own use.
Based on this analysis, the business must satisfy itself that the benefits of the processing to the consumer, the business, other stakeholders, and the public outweigh the risks to consumers’ privacy. If the benefits do not outweigh those risks, the business “shall not” engage in that processing activity.
Relatedly, the draft regulations reference sections 7030 and 7031, which have not yet been published. Those sections will create new obligations for processing that is subject to the automated decisionmaking technology access and opt-out rights.
Submission of Risk Assessments to the Agency
Businesses would need to make risk assessments available to the Agency or the Attorney General upon request. Businesses also would be required to submit annually to the Agency (1) the business’s risk assessment in “an abridged form” and (2) a certification by a designated executive that the business has complied with the risk assessment requirements.
Cybersecurity Audits

The cybersecurity audit regulations seek to operationalize the CCPA’s information security provisions, including the requirement that businesses whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security must perform an annual cybersecurity audit.
Which businesses will need to complete cybersecurity audits is identified as one of the areas for Board discussion. The draft includes data brokers (i.e., businesses that derive 50% or more of their annual revenue from selling or sharing personal information) as one category of covered businesses. However, the draft identifies a number of additional potential categories based on the amount and type of personal information a business processes, its gross revenue, and its number of employees.
As with risk assessments, the draft creates extensive requirements for conducting cybersecurity audits. Businesses would be required to complete an audit “using a qualified, objective, independent professional . . . using procedures and standards generally accepted in the profession of auditing.” A business may use an internal auditor, but only under certain conditions. The draft contains roughly six pages of requirements for conducting these audits. Although too lengthy to summarize here, the takeaway is that businesses subject to the cybersecurity audit requirements will need to implement and maintain extensive information security measures if they are not already doing so.
Notice of Compliance
Businesses that are required to complete a cybersecurity audit would need to submit to the Agency either (1) a written certification that the business complied with the regulatory requirements during the 12 months that the audit covers or (2) a written acknowledgment that the business did not fully comply with the requirements, including identification of the areas of noncompliance as well as a remediation timeline. The written certification or written acknowledgment must be signed by a member of the board or governing body or, if none exists, the business’s highest-ranking executive with authority to bind the business.