Balancing act: how California is navigating AI and privacy

Introduction 

2023 has been a landmark year for artificial intelligence (AI). AI innovation boomed in March with OpenAI’s release of GPT-4. Since then, other tech companies have thrown their hats into the ring, from Google’s Bard (a direct response to ChatGPT) to Meta’s LLaMA (a freely available AI model).

As the public formed its first impressions, the more privacy-conscious began to wonder: “What does this mean for my data?” Luckily, the California Privacy Protection Agency (CPPA) continues to lead the way on privacy protections in the U.S. by addressing the rise of AI. Let’s go over the agency’s plans to strike a balance between AI innovation and privacy preservation through transparency and proper risk assessment.

Transparency around automated decisions

Under the provisions of the California Privacy Rights Act (CPRA), the CPPA is spearheading efforts to draft regulations covering AI and automated decision-making. The draft regulations, though not yet finalized, outline requirements for businesses using AI or Automated Decision-making Technologies (ADT).

Automated Decision-making Technologies (ADT) – any system that processes personal information and uses computation to execute a decision, replace human decision-making, or substantially facilitate it.

One of the core aspects of the proposed regulations is transparency around ADT. The CPPA plans to require businesses to disclose the following (a hypothetical example of such a disclosure follows the list):

  • Why they are using ADT
  • The benefits of using ADT over manual processing
  • The appropriate use and limitations of ADT
  • What output will be generated
  • Steps taken to maintain the quality of personal information processed by ADT
  • How the underlying AI systems are trained
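To make these disclosure points concrete, here is a minimal sketch of what a machine-readable pre-use notice covering them might look like. The field names and values are purely illustrative assumptions; the draft regulations do not prescribe any particular format.

```python
# Hypothetical pre-use notice for an ADT, mirroring the disclosure points above.
# Field names and values are illustrative only, not taken from the draft regulations.
adt_pre_use_notice = {
    "purpose": "Screen rental applications for completeness and fraud signals",
    "benefit_over_manual_review": "Consistent criteria applied to every application",
    "appropriate_uses_and_limits": [
        "Flags applications for human review",
        "Not used as the sole basis for denial",
    ],
    "output": "A risk score from 0-100 plus the factors that drove it",
    "data_quality_steps": [
        "Quarterly deduplication and accuracy checks on applicant records",
    ],
    "training_summary": "Model trained on anonymized historical applications",
}
```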

A key focus in automated decision-making is detecting and mitigating biases that may unknowingly exist in ADTs, for example by comparing outcomes across demographic groups, as sketched below. Proper audits and a thorough understanding of these systems will go a long way toward earning public trust in the technology.
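As a rough illustration of what such an audit could involve, the sketch below computes per-group selection rates for an automated decision and flags any group that falls below a common disparate-impact rule of thumb. The data, group labels, and threshold are all hypothetical assumptions, not requirements from the draft regulations.

```python
from collections import defaultdict

def selection_rates(decisions):
    """Compute the approval rate per group from (group, approved) pairs."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def disparate_impact_check(decisions, threshold=0.8):
    """Flag groups whose approval rate is below `threshold` times the
    highest group's rate (the informal "80% rule" for disparate impact)."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: r / best for g, r in rates.items() if r / best < threshold}

# Hypothetical ADT outcomes: (demographic group, was the applicant approved?)
sample = [("A", True), ("A", True), ("A", False),
          ("B", True), ("B", False), ("B", False)]
print(selection_rates(sample))         # per-group approval rates
print(disparate_impact_check(sample))  # groups below the 80% rule of thumb
```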

Proper risk assessment and cybersecurity

The second key component of the recent draft regulations is a focus on in-depth risk assessments. These risk assessments are designed to make businesses weigh the security implications of working with personal data. To mitigate risk, the draft regulations recommend strong cybersecurity safeguards, including firewalls, encryption protocols, and intrusion detection systems (a minimal encryption sketch follows the list below). The open questions on this topic are:

  • The thresholds that trigger a cybersecurity audit (number of consumers, gross revenue, number of employees, etc.)
  • The scope of a cybersecurity audit
  • What counts as adequate safeguards
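As one concrete example of the kind of safeguard the draft regulations point to, the sketch below encrypts a personal-data field at rest using the Fernet recipe from the widely used cryptography package. The field and key handling are simplified assumptions; a real deployment would pull the key from a managed secrets store.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Assumption: in production the key would come from a secrets manager,
# not be generated inline like this.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a personal-information field before writing it to storage.
email = "jane.doe@example.com"
token = cipher.encrypt(email.encode("utf-8"))

# Decrypt only when the application actually needs the plaintext.
assert cipher.decrypt(token).decode("utf-8") == email
```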

Additionally, the CPPA proposes that a business’s incident response preparedness be audited. This ensures that proper actions, reporting, and remediation are in place to minimize damage in the event of a data breach.

Conclusion

The CPPA discussed these proposals at its most recent board meeting on September 8 and is slated to reconvene on December 8 of this year. Keep an eye on the agency and stay up to date on this ever-changing privacy landscape.