
CPPA Publishes New Draft Regs Addressing AI, Risk Assessments and Cyber Audits

California’s privacy regulator, the California Privacy Protection Agency, has released draft regulations covering cybersecurity audit requirements and risk assessments. While the agency has not started the formal rulemaking process, it is floating the drafts to see what kind of comments it gets. The agency also discussed the drafts at its most recent board meeting.

The Draft Cybersecurity Audit Regulations propose both modifications and additions to the existing California Consumer Privacy Act (“CCPA”) regulations. The draft regulations:

  • Outline the requirement for annual cybersecurity audits for businesses “whose processing of consumers’ personal information presents significant risk to consumers’ security”;
  • Outline potential standards used to determine when processing poses a “significant risk”;
  • Propose options specifying the scope and requirements of cybersecurity audits; and
  • Propose new mandatory contractual terms for inclusion in Service Provider data protection agreements.

Again, none of this is mandatory for anyone yet, but some version of these requirements will eventually take effect.

The Draft Risk Assessment Regulations propose both modifications and additions to the existing CCPA regulations. The draft regulations:

  • Propose new and distinct definitions for Artificial Intelligence and Automated Decision-making technologies;
  • Identify specific processing activities that present a “significant” risk of harm to consumers, requiring a risk assessment. These activities include:
    • Selling or sharing personal information; processing sensitive personal information (outside of the traditional employment context); using automated decision-making technologies; processing the information of children under the age of 16; using technology to monitor the activity of employees, contractors, job applicants, or students; or
    • Processing personal information of consumers in publicly accessible places using technology to monitor behavior, location, movements, or actions.
  • Propose standards for stakeholder involvement in risk assessments;
  • Propose risk assessment content and review requirements;
  • Require that businesses that train AI for use by consumers or other businesses conduct a risk assessment and include with the software a plain statement of the appropriate uses of the AI; and
  • Outline new disclosure requirements for businesses that implement automated decision-making technologies.

You will likely notice that the categories of activities presenting a significant risk of harm include at least one activity your company engages in.

Credit: The Ballard Spahr law firm

