California Privacy Protection Agency
Seeks to Advance Proposed Regulations That Clearly Exceed the Actual Statutory Authority Granted to Agency by Voters Under Proposition 24
• In 2018, the Legislature enacted the California Consumer Privacy Act, creating eight core privacy rights for consumers.
• In 2020, voters approved the California Privacy Rights Act via Proposition 24, expanding upon those rights, including a new right to opt out of “sharing” personal information (PI), where sharing includes “cross-context behavioral advertising.” Voters did not create any right to opt out of behavioral advertising (or first-party advertising) more generally.
• Voters also approved the creation of the new California Privacy Protection Agency (CPPA) and expressly authorized the CPPA to adopt certain implementing regulations. These included provisions on issuing regulations requiring certain businesses to conduct cybersecurity audits and to submit regular risk assessments to the Agency with respect to their processing of PI, and a narrow provision on issuing regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technology (ADMT). Nowhere in the statute did voters grant the Agency authority over artificial intelligence (AI), nor did they have reason to believe that a privacy agency would have general authority over AI.
• Nonetheless, on November 8, 2024, the CPPA Board voted to advance to formal rulemaking a set of proposed regulations that far exceed the Agency’s authority, as recognized by Board member Alastair Mactaggart, who was also the proponent of Proposition 24. These regulations create significant uncertainty and pose a serious risk to California’s economy, harming businesses and consumers alike, while getting ahead of both the Legislature and the Governor in the process.
History of California Consumer Privacy Act
In 2018, the Legislature unanimously passed AB 375 (Chau et al., Chapter 55, Statutes of 2018), enacting the California Consumer Privacy Act (CCPA), to increase transparency and consumers’ control over the collection and sale of their personal information (PI), and to supplant a pending ballot measure, as discussed below.
Modeled in part on the European Union’s General Data Protection Regulation (GDPR), which took effect in May 2018, the CCPA was the first comprehensive, technology-neutral, and industry-neutral consumer privacy statute of its type in the United States, establishing eight general privacy rights, with limited exceptions. The law applies to businesses of all sizes (for example, not only capturing businesses with annual revenue above $25 million, but also those with revenue lower than $25 million that sell or share significant amounts of consumer PI), across all industries, irrespective of the specific technology (if any) used to collect or sell consumer PI — brick-and-mortar businesses and technology companies alike.
These new CCPA rights included: the right to be told certain information, including their CCPA rights and the categories of PI that a business collects about its consumers; the right to know and request access to the categories of PI that the business collected from the consumer, including the right to access specific pieces of information collected about that consumer; the right to request deletion; the right to opt out of the sale of their PI (or, for consumers under 16, an opt-in requirement); the right against discrimination for exercising their rights; a limited private right of action for certain data breaches; the right to know about, and be given an opportunity to opt out of, any further sale of PI that was sold to a third party; and a right of portability. (See Civil Code Section 1798.100 et seq.)
A major element of the deal that led to the passage of the 2018 legislation and agreement to pull the alternative measure from that year’s ballot was that the law would be subject to a single enforcement entity that would also be charged with establishing implementing regulations and have authority to provide guidance to businesses for compliance purposes: the Attorney General’s office.
Just two years later, that agreement was undone in a new initiative run by the same proponents. That new initiative, Proposition 24, created a new administrative enforcement entity and regulatory body for the privacy law, to be known as the California Privacy Protection Agency.
Proposition 24’s Changes to the CCPA and Creation of Privacy Agency
In November 2020, voters adopted Proposition 24, “the California Privacy Rights Act of 2020” (CPRA), which added to and otherwise revised various consumer privacy rights under the CCPA and established demanding new standards regarding the collection, retention, and use of consumer PI. Relevant to the Agency’s current rulemaking activities, the proposition expanded the consumer’s existing right to opt out of the sale of his or her PI (where “sale” included any form of disclosure for valuable consideration, not just disclosures made in exchange for monetary value) to also include the right to opt out of any “sharing” of their PI. “Sharing,” in turn, was specifically defined to include any form of dissemination or disclosure of PI for “cross-context behavioral advertising” — effectively, targeted advertising of a consumer across third-party platforms based on the consumer’s PI obtained from tracking their activity across businesses, websites, apps or services, as opposed to advertising from a (first-party) business to its own customers.
Notably, in passing Proposition 24, voters also established a new regulatory and administrative enforcement entity within state government vested with full administrative power, authority and jurisdiction to implement and enforce the CCPA, called the California Privacy Protection Agency.
Governed by a five-member board appointed by the Governor, Attorney General, Senate Rules Committee, and Assembly Speaker, and composed of Californians with expertise in privacy, technology, and consumer rights (Civil Code Section 1798.99.10), the Agency was given specific responsibilities, such as providing consumers guidance about their rights, providing technical assistance to the Legislature upon request with respect to privacy-related legislation, and monitoring relevant developments relating to the protection of PI and, in particular, the development of information and communication technologies and commercial practices.
Most notably, the Agency was charged with assuming rulemaking responsibilities from the Attorney General (Civil Code Section 1798.99.40) and was required to adopt a set of final regulations on or before July 1, 2022, that would flesh out and operationalize 15 new requirements imposed by the law, subject to a one-year moratorium that would have provided businesses time to ramp up implementation and come into full compliance before the law became enforceable (Civil Code Section 1798.185).
The Agency did not begin any formal rulemaking activities until July 8, 2022, and even then, it split its regulations into separate rulemaking packages. The first set of regulations was finalized and approved by the Office of Administrative Law (OAL) at the end of March 2023, but covered only eight of the 15 regulatory areas for which guidance was required, leaving regulations on topics such as cybersecurity audits, risk assessments, and automated decision-making technology (ADMT) still outstanding.
The express authority granted by voters, however, was limited to certain issues within those topics. In other words, the voters did not elect to give the Agency authority to adopt regulations on automated decision-making technology, risk assessments, or cybersecurity audits in general. Instead, they gave specific authority to regulate specific issues relating to automated decision-making, risk assessments, and cyber audits. For example, they granted authority to issue “regulations governing access and opt out rights with respect to businesses’ use of automated decision-making technology,” as opposed to “regulations governing automated decision-making technology.” By the same token, while the enumerated list is not necessarily exhaustive (as indicated by the “including, but not limited to” phrase), it is not unlimited. Any “implied” authority is limited to “regulations to further the purposes of the title” (the California Consumer Privacy Act).
Notably, nowhere in that title was there any mention of AI when the voters passed Proposition 24, creating the Agency and charging it with adopting regulations on issues relating to ADMT, risk assessments, and cyber audits. Even if personal information is somehow connected to AI, and even if ADMT involves AI, that does not give the Agency implied authority over AI. Imagine if the Agency were to argue that it has implied authority over the Department of Motor Vehicles (DMV), all cars, and California highways simply because DMV records can contain PI or because cars traveling on California highways may be subject to surveillance.
Proposed Cyber Audit, Risk Assessment and ADMT Regulations Advance to Formal Rulemaking Despite Significant Flaws, Including Lack of Statutory Authority
On November 22, 2024, the CPPA formally initiated a major rulemaking effort related to cybersecurity audits, risk assessments, and automated decision-making technology (ADMT). This rulemaking, alarmingly, includes regulations for which the CPPA has no legal authority (or, at best, questionable legal authority) and stands to devastate the California economy.
Since the Agency first circulated the draft regulations more than a year ago, the California Chamber of Commerce and the business community at large have testified on several key points ad nauseam, including that the Agency has far exceeded its authority, going beyond the bounds of the CCPA and even beyond what is commonly understood to be privacy regulation. In fact, at times the Agency has veered into issuing general AI regulations, getting ahead of the Legislature and Governor in doing so.
At other times the Agency has effectively rewritten the law, such as by creating overly broad requirements for first-party behavioral advertising when such requirements exist only for cross-context behavioral advertising under the plain letter of the law approved by voters. In doing so, the Agency is effectively creating a new right to opt out of first-party advertising between a business and its own consumers, as opposed to the type of targeted advertising that voters had in mind, which tracks a consumer across businesses, distinctly branded websites, apps or services “other than” the business, distinctly branded website, app or service that the consumer intentionally interacted with (Civil Code Section 1798.140).
And there is also, of course, the issue of the Agency ignoring the voters’ directives by issuing regulations that do not adhere to the parameters the voters specifically set. For example, the Agency fails to ensure that the regulations are focused on significant risk and that they take into account the size and complexity of a business and the nature and scope of its processing activities. Notably, similar concerns about the Agency exceeding the scope of its statutory authority have been raised for months by various CPPA Board members — including former board member Lydia de la Torre and sitting member Alastair Mactaggart, particularly when it comes to issues around AI.
Unfortunately, the Agency held meeting after meeting to hear these concerns, only to do nothing time and time again. This was true even after the July board meeting, when the Agency surprisingly decided not to proceed to formal rulemaking and gave the impression that staff would come back with options to address board member concerns at the next meeting. Once again, no changes were made by the time the board reconvened in November 2024.
At that meeting, the business community showed up in droves to testify, making it known in no uncertain terms that the Agency’s proposed rules had gone too far, would hurt businesses, and should not advance to rulemaking. The board’s response, however, was to dismiss the regulated community’s comments. Board members’ reactions ranged from confusion over why the public felt the rules were rushed when the Agency had been at this for so many months (Board member Jeffrey Worthe, missing the point that the Agency had sat on its hands for 90% of those months when it could have fixed these issues); to the assertion that the Agency has conducted “far more intense, careful, deliberative work here than [they] could ever expect [their] colleagues at the Legislature to do” (Board member Drew Liebert); to misrepresentations of law and fact, such as the claim that the Agency had no choice but to proceed to formal rulemaking (Chair Jennifer Urban: “and it is public record that we have in fact been sued on a theory that we have been too late in promulgating these regulations. So this is not a question of us just deciding to do this. This is a question of us being mandated to do it.”). (See CPPA Transcript, November 8 Board Meeting (November 8, 2024) pp. 92, 94, 127, available at Transcription, Audio 11-08-2024 as of December 10, 2024.)
Ultimately, the Agency voted 4-1 to advance the proposed rules to formal rulemaking, with Mactaggart voting against moving forward, and subsequently issued the notice of rulemaking to do so on November 22, 2024, starting the public comment period and formal rulemaking process.
Wildly Inaccurate Economic Impact Analysis
Since 2011, any major regulation subject to OAL review with an agency-estimated economic impact exceeding $50 million has required the proposing agency to conduct a Standardized Regulatory Impact Assessment (SRIA) consistent with regulations adopted by the Department of Finance. (See SB 617 (Calderon, Chapter 496, Statutes of 2011).)
In this case, the CPPA’s SRIA concluded that the regulations will result in direct costs to California businesses of $3.5 billion in the first full year; average annual business costs of $1.08 billion over the first 10 years; and employment losses peaking at 126,000 in 2030. Similarly, it estimated annual state revenue losses reaching $2.8 billion in 2028. And yet, the SRIA claims long-term benefits will exceed these costs.
An independent economic analysis by experts at Capitol Matrix Consulting, including a former Director of Finance, however, finds that the Agency’s SRIA is off by billions of dollars, having vastly underestimated the costs, overestimated the savings, and even included a mathematical error in its calculations. Commissioned by CalChamber, the report analyzes the anticipated savings detailed in the CPPA’s SRIA and concludes that businesses, consumers and governments in California will suffer net economic losses, translating into reduced jobs and tax revenues, from the Agency’s proposed rules. Specifically, it details errors in the SRIA that include:
• Underestimating external auditor and employee compensation rates paid by businesses;
• Excluding from its economic analysis out-of-state businesses that sell into California markets; and
• Ignoring the massive ongoing costs and business productivity losses resulting from behavioral changes by businesses and consumers following adoption of the regulations.
In addition, the SRIA overstates the savings from the proposed regulations by:
• Grossly overestimating baseline cybercrime losses due to an arithmetical error and other factors, including a flawed approach to estimating future cybercrime losses; and
• Overestimating savings from audits and risk assessments based on assumptions not supported by the literature, including articles listed in the SRIA.
The analysis warns that there are major implications for California jobs and state budget revenues from the Agency’s erroneous estimations.
The CalChamber submitted this report to the Agency and each of the CPPA Board members prior to their vote to advance the rules to formal rulemaking. Not surprisingly, one of the reasons given for not revising the draft regulations, despite a Board member’s objection that the regulations exceeded the Agency’s scope of authority, was that staff would have to redo the SRIA. (See CPPA Transcript at pp. 116-118: “MR. LE: So say we decide for example, say, we take out behavioral advertisement … What would happen if we did it now versus when we did it in formal rulemaking? […] MR. LAIRD: …If we do that now, we would need to update a number of these requirements. We would also need to update our standardized regulatory impact analysis.”)
At this point, it is painfully obvious that the Privacy Agency does not have sufficient checks on its authority. Furthermore, it is almost certain that the Agency will follow the same path it did for its last rulemaking — meaning it can be expected to amend its proposed regulations upon the conclusion of this formal comment period, release modified proposed regulations for comment but decline to make further revisions, and then seek approval and immediate effect from OAL. OAL’s review, unfortunately, provides little check on the Agency’s actions. And while the budget remains the Legislature’s primary method for exercising checks on agencies, such a check — while helpful — would not unwind or stop the damage done by these regulations.
CalChamber Position
It is becoming increasingly imperative that California’s elected officials rein in a state agency that is seemingly intent on writing public policy for the state on critical matters that far exceed its legal authority and have the ability to bring California’s economy to a grinding halt, all while downplaying the impact that its regulations will have on the regulated community. The California Privacy Protection Agency cannot be permitted to adopt regulations for which it has no legal authority or that otherwise extend beyond the explicit statutory authority voters granted the Agency in Proposition 24 — least of all where those regulations have the ability to damage the economy and prevent the state from harnessing technology for the benefit of society, undermining clear directives from the Governor, and where the Agency would be getting ahead of the Legislature on issues on which it knows elected officials intend to act in the immediate future.
The California Chamber of Commerce will continue to shine a light on the issue and participate in any available public process to represent the interests of members and to outline the ways in which the Agency has exceeded its express voter mandate and what is commonly understood to be privacy regulation. The CalChamber will also support legislative proposals or other actions to add checks and balances and otherwise prevent the Agency from usurping what is clearly within the bounds of legislative authority by regulating artificial intelligence (AI) more generally, and from effectively rewriting the California Consumer Privacy Act (CCPA) by substituting its judgment for the voters’ actual mandate.
The CalChamber supports proposals to address concerns over rogue agencies and to add additional layers of checks and balances on those agencies, such as additional review of certain major regulations that have a particularly significant economic impact, that could fall within the authority of multiple state agencies, or that address new issues not yet covered in statute, where the Legislature and Governor can first decide the appropriate regulatory authority. The CalChamber also supports proposals that would require independent review or analysis of certain economic assessments for major regulations.
February 2025
Staff Contact
Ronak Daylami
Policy Advocate
Privacy and Cybersecurity