Proposed privacy regulations being developed by the California Privacy Protection Agency (CPPA) Board are overly burdensome, insufficiently risk-based, and out of sync with other states' requirements and with the law voters passed (Proposition 24), the California Chamber of Commerce and other groups told the board earlier this month.
In fact, the draft rules go beyond the common understanding of privacy regulation and veer into rewriting the law, CalChamber Policy Advocate Ronak Daylami said in testimony to the CPPA Board on March 8.
Much of the discussion centered on whether the proposed regulations dealing with automated decision-making tools (ADMT) should be moved along to the next stage of the rulemaking process. Ultimately, the CPPA Board voted 3-2 to advance the regulations to the formal rulemaking stage.
Overly Broad
One way in which the draft regulations are overly broad is that they include “profiling a consumer for behavioral advertising,” Daylami noted.
As written, the draft captures even ads where businesses are advertising to their own customers, whereas California voters charged the CPPA with developing rules on “cross-context behavioral advertising” specifically. The latter focuses on sharing personal information for ads based on customer activities across multiple websites and services.
The proposed definitions of artificial intelligence (AI) and ADMT also are overly broad in that they encompass simple algorithms and commonplace tools such as spreadsheet software.
Including profiling and behavioral advertising in the regulations will have far-reaching, negative effects, creating opt-out requirements for situations in which AI isn’t making decisions, Daylami said. Besides hurting innovation, the rules will lead to a more frustrating customer experience by limiting personalization, she said.
By requiring detailed disclosures and assessments covering model testing, model logic, outputs, testing for fairness and validity, and the alternative technologies a business considered, the draft rules enter the realm of general ADMT regulation as opposed to privacy regulation, Daylami stated.
Employment Concerns
Reiterating concerns raised at the board’s December meeting, Daylami pointed out that the use of ADMT in employment raises unique considerations, given that existing laws already protect against the use of AI tools that directly or indirectly discriminate against job applicants and employees. Problems include:
• Requiring employers to provide an opt-out of ADMT even where it is unrelated to a significant employment decision or where the use is shown to be job-related and consistent with business necessity.
• Regulating the personal information used for training of ADMT, which isn’t a high-risk activity. Allowing an opt-out of training will result in inferior models and increase the risk of bias, to the detriment of consumers and innovation.
• Requiring an opt-out for training doesn’t protect consumers’ privacy and actually would require additional processing of their data because developers typically don’t identify individuals during the training process. Consumer data used in developing models is generic; developers rely on trends and patterns, not individualized data.
• Requiring businesses to provide risk assessments to the CPPA annually, as opposed to only where they are relevant to an investigation.
Daylami commented that the draft rules would result in the disclosure of substantial amounts of confidential or proprietary information, if not trade secrets, yet fail to include protections from public disclosure or to ensure that all applicable legal privileges are retained, safeguards that other state privacy laws provide. At a minimum, such protections should be added to the California rules, she said.
Board Divided
Comments by CPPA Board members at the hearing indicated the divergence of opinions about the latest draft ADMT regulations. For example, board member Alastair Mactaggart, proponent of the privacy initiative passed in 2020, described the definitions in the regulations as being “extraordinarily broad.”
He said, “I think we’re getting very far afield from privacy…No one’s going to argue that we need to have a more just and equitable society, but we’re talking about privacy.”
Board member Lydia de la Torre said she would find it very hard to back any finding about the regulation’s scope that wasn’t supported by Mactaggart because he “literally wrote the law…that we’re now trying to interpret.” She noted the board could face litigation that would be very challenging to defend against if board members disagree on whether the rules are within the scope of their assignment, and cited the many comments the board has received saying the draft rules are out of scope.
Other board members (Vinhcent Le and Chair Jennifer Urban) characterized the draft as a reasonable balance, saying the proposed exceptions and the potential opt-out when a human is involved in reviewing assessments would be helpful for covered companies implementing the regulations. Le theorized that issuing a broader draft could elicit comments that help the board narrow the regulation’s scope before it is finalized.
Jeffrey Worthe, the newest member of the board, said he thought it was time to move the discussion on the draft “to a wider audience” and ask for “feedback from stakeholders…in a very formal process.”