As companies begin to move beyond large language model (LLM)-powered assistants into fully autonomous agents—AI systems that can plan, take actions, and adapt without human-in-the-loop—legal and privacy teams must be aware of the use cases and the risks that come with them.

What is Agentic AI?
Agentic AI refers to AI systems—often built using LLMs but not limited to them—that can take independent, goal-directed actions across digital environments. These systems can plan tasks, make decisions, adapt based on results, and interact with software tools or systems with little or no human intervention.

Agentic AI often blends LLMs with other components like memory, retrieval, application programming interfaces (APIs), and reasoning modules to operate semi-autonomously. It goes beyond chat interfaces and can initiate real actions—inside business applications, internal databases, or even external platforms.

For example:

  • An agent that processes inbound email, classifies the request, files a ticket, and schedules a response—all autonomously.
  • A healthcare agent that transcribes provider dictations, updates the electronic health record, and drafts follow-up communications.
  • A research agent that searches internal knowledge bases, summarizes results, and proposes next steps in a regulatory analysis.

These systems aren’t just helping users write emails or summarize docs. In some cases, they’re initiating workflows, modifying records, making decisions, and interacting directly with enterprise systems, third-party APIs, and internal data environments. Here are a handful of issues that legal and privacy teams should be tracking now.

Continue Reading What is Agentic AI? A Primer for Legal and Privacy Teams

The rulemaking process on California’s Proposed “Regulations on CCPA Updates, Cybersecurity Audits, Risk Assessments, Automated Decisionmaking Technology, and Insurance Companies” (2025 CCPA Regulations) has been ongoing since November 2024. With the one-year statutory period to complete the rulemaking – or be forced to start anew – on the horizon, the California Privacy Protection Agency (CPPA) voted unanimously on May 1 to advance a revised set of draft regulations to public comment, which began May 9 and closes at 5 pm Pacific on June 2, 2025. The revisions cut back on the regulation of automated decision-making technology (ADMT), eliminate the regulation of AI, address potential constitutional deficiencies in the risk assessment requirements and somewhat ease cybersecurity audit obligations. The CPPA projects that this substantially revised draft will save California businesses approximately $2.25 billion in the first year of implementation, a 64% savings from the projected cost of the prior draft.

Continue Reading Revised Draft California Privacy Regulations Lessen Impact on Business

(Updated May 12, 2025)

Since January, the federal government has moved away from comprehensive legislation on artificial intelligence (AI) and adopted a more muted approach to federal privacy legislation (as compared to 2024’s tabled federal legislation). Meanwhile, state legislatures forge ahead – albeit more cautiously than in preceding years.

As we previously reported, the Colorado AI Act (COAIA) will go into effect on February 1, 2026. In signing the COAIA into law last year, Colorado Governor Jared Polis (D) issued a letter urging Congress to develop a “cohesive” national approach to AI regulation preempting the growing patchwork of state laws. Absent a federal AI law, Governor Polis encouraged the Colorado General Assembly to amend the COAIA to address his concerns that the COAIA’s complex regulatory regime may drive technology innovators away from Colorado. Eight months later, the Trump Administration announced its deregulatory approach to AI regulation, making federal AI legislation unlikely. At that time, the Trump Administration seemed to consider existing laws – such as Title VI and Title VII of the Civil Rights Act and the Americans with Disabilities Act, which prohibit unlawful discrimination – as sufficient to protect against AI harms. Three months later, a March 28 Memorandum issued by the federal Office of Management and Budget directs federal agencies to implement risk management programs designed for “managing risks from the use of AI, especially for safety-impacting and rights impacting AI.”

Continue Reading States Shifting Focus on AI and Automated Decision-Making

Companies in all industries take note: regulators are scrutinizing how companies offer and manage privacy rights requests, and are looking into the nature of vendor processing in connection with the application of those requests. This includes applying the proper verification standards and how cookies are managed. Last month, the California Privacy Protection Agency (“CPPA” or “Agency”) provided yet another example of this regulatory focus in a March 2025 Stipulated Final Order (“Order”) against a global vehicle manufacturer (referred to throughout this blog as “the Company”). We discuss the case in detail and provide practical takeaways below.

On the heels of the CPPA’s landmark case against the Company, various state AGs and the CPPA announced a formal agreement to promote collaboration and information sharing in the bipartisan effort to safeguard the privacy rights of consumers. The announcement by California Attorney General Bonta can be found here. The consortium includes the CPPA and State Attorneys General from California, Colorado, Connecticut, Delaware, Indiana, New Jersey and Oregon. According to an announcement by the CPPA, the participating regulators established the consortium to share expertise and resources and coordinate in investigating potential violations of their respective privacy laws. With the establishment of a formal enforcement consortium, we can expect cross-jurisdictional collaboration on privacy enforcement by the participating states’ regulators. On the plus side, perhaps we will see the promotion of consistent interpretation of these seven states’ various laws, which make up almost a third of the current patchwork of U.S. privacy legislation.

CPPA Case – Detailed Summary

In the case against the Company, the CPPA alleged that it violated the California Consumer Privacy Act (“CCPA”) by:

  • requiring Californians to verify themselves where verification is not required or permitted (the right to opt-out of sale/sharing and the right to limit) and provide excessive personal information to exercise privacy rights subject to verification (know, delete, correct);
  • using an online cookie management tool (often known as a CMP) that failed to offer Californians their privacy choices in a symmetrical or equal way and was confusing;
  • requiring Californians to verify that they gave their agents authority to make opt-out of sale/sharing and right to limit requests on their behalf; and
  • sharing consumers’ personal information with vendors, including ad tech companies, without having in place contracts that contain the necessary terms to protect privacy in connection with their role as either a service provider, contractor or third party.

This Order illustrates the potential fines and financial risks associated with non-compliance with state privacy laws. Of the $632,500 administrative fine lodged against the Company, the Agency clearly spelled out that $382,500 accounts for 153 violations – $2,500 per violation – alleged to have occurred in the Company’s consumer privacy rights processing between July 1 and September 23, 2023. It is worth emphasizing that the Agency lodged the maximum administrative fine available to it for non-intentional violations – “up to two thousand five hundred ($2,500)” – for each incident in which verification standards were wrongly applied to consumer opt-out/limit rights. It is unclear to what the remaining $250,000 in fines was attributed, but it is presumably for the other violations alleged in the Order, such as disclosing PI to third parties without contracts containing the necessary terms, confusing cookie and other consumer privacy request methods, and requiring excessive personal data to make a request. The number of incidents involving those infractions is unclear, but based on likely web traffic and vendor data processing, the fines reflect only a fraction of the personal information processed in a manner alleged to be non-compliant.

The Agency and the Office of the Attorney General of California (which enforces the CCPA alongside the Agency) have yet to seek the truly jaw-dropping fines that have become common under the UK/EU General Data Protection Regulation (“GDPR”). However, this Order demonstrates California regulators’ willingness to demand more than remediation. It is also significant that the Agency imposed the maximum administrative penalty on a per-consumer basis for the clearest violations – those that resulted in denial of specific consumers’ rights. This was a relatively modest number of consumers:

  • “119 Consumers who were required to provide more information than necessary to submit their Requests to Opt-out of Sale/Sharing and Requests to Limit;
  • 20 Consumers who had their Requests to Opt-out of Sale/Sharing and Requests to Limit denied because the Company required the Consumer to Verify themselves before processing the request; and
  • 14 Consumers who were required to confirm with the Company directly that they had given their Authorized Agents permission to submit the Request to Opt-out of Sale/Sharing and Request to Limit on their behalf.”

The fines would likely have been greater if applied to all Consumers who accessed the cookie CMP, or who made requests to know, delete or correct. Further, it is worth noting that many companies receive thousands of consumer requests per year (or even per month), and the statute of limitations for the Agency is five years; applying the per-consumer maximum fine could therefore result in astronomical fines for some companies.

Regulators also have injunctive relief at their disposal. Although the injunctive relief in this Order was effectively limited to fixing alleged deficiencies, it included “fencing in” requirements such as use of a UX designer to evaluate consumer request “methods – including identifying target user groups and performing testing activities, such as A/B testing, to access user behavior” – and reporting of consumer request metrics for five years. More drastic relief, such as disgorgement or prohibiting certain data or business practices, is also available. For instance, in a recent data broker case brought by the Agency, the business was barred from engaging in business as a data broker in California for three years.

We dive into each of the allegations in the present case further below and provide practical takeaways for in-house legal and privacy teams to consider.

Requiring consumers to provide more info than necessary to exercise verifiable requests and requiring verification of CCPA sale/share opt-out and sensitive PI limitation requests

The Order alleges two main issues with the Company’s rights request webform:

  • The Company’s webform required too many data points from consumers (e.g., first name, last name, address, city, state, zip code, email, phone number). The Agency contends that requiring all of this information forces consumers to provide more information than is necessary to exercise their verifiable rights, given its allegation that the Company “generally needs only two data points from the Consumer to identify the Consumer within its database.” The CCPA and its regulations allow a business to seek additional personal information if necessary to verify the consumer to the degree of certainty required under the law (which varies depending on the nature of the request, the sensitivity of the data and the potential harm of disclosure, deletion or change), or to reject the request and provide alternative rights responses that require lesser verification (e.g., treat a request for a copy of personal information as a request to know categories of personal information). However, the regulations prohibit requiring more personal data than is necessary under the particular circumstances of a specific request. Proposed amendments to Section 7060 of the CCPA regulations also demonstrate the Agency’s concern about requiring more information than is necessary to verify the consumer.
  • The Company required consumers to verify their Requests to Opt-Out of Sale/Sharing and Requests to Limit, which the CCPA prohibits.

In addition to these two main issues, the Agency also suggested (but did not directly state) that the consumer rights processes amounted to dark patterns. The CPPA cited the policy reasons behind the differential requirements for Opt-Out of Sale/Sharing and Right to Limit requests; i.e., that consumers should be able to exercise such requests without undue burden, in particular because there is minimal or nonexistent potential harm to consumers if such requests are not verified.

In the Order, the CPPA goes on to require the Company to ensure that its personnel handling CCPA requests are trained on the CCPA’s requirements for rights requests, which is an express obligation under the law, and to confirm to the Agency that it has provided such training within 90 days of the Order’s effective date.

Practical Takeaways

  • Configure consumer rights processes, such as rights request webforms, to only require a consumer to provide the minimum information needed to initiate and verify (if permitted) the specific type of request. This may be difficult for companies that have developed their own webforms, but most privacy tech vendors that offer webforms and other consumer rights-specific products allow for customizability. If customizability is not possible, companies may have to implement processes to collect minimum information to initiate the request and follow up to seek additional personal information if necessary to meet CCPA verification standards as may be applicable to the specific consumer and the nature of the request.
  • Do not require verification of do not sell/share and sensitive PI limitation requests (note, there are narrow fraud prevention exceptions here, though, that companies can and should consider in respect of processing Opt-Out of Sale/Sharing and Right to Limit requests).
  • Train personnel handling CCPA requests (including those responsible for configuring rights request “channels”) to properly intake and respond to them.
  • Include instructions on how to make the various types of requests that are clear and understandable, and that track what the law permits and requires.

Requiring consumers to directly confirm with the Company that they had given permission to their authorized agent to submit opt-out of sale/sharing and sensitive PI limitation requests

The CPPA’s Order also outlines that the Company allegedly required consumers to directly confirm with the Company that they gave permission to an authorized agent to submit Opt-Out of Sale/Sharing and Right to Limit requests on their behalf. The Agency took issue with this because under the CCPA, such direct confirmation with the consumer regarding authority of an agent is only permitted as to requests to delete, correct and know.

Practical Takeaways

  • When processing authorized agent requests to Opt-Out of Sale/Sharing or Right to Limit, avoid directly confirming with the consumer or verifying the identity of the authorized agent (the latter is also permitted in respect of requests to delete, correct and know). Keep in mind that what agents may request, and agent authorization and verification standards, differ from state-to-state.

Failure to provide “symmetry in choice” in its cookie management tool

The Order alleges that, for a consumer to turn off advertising cookies on the Company’s website (cookies which track consumer activity across different websites for cross-context behavioral advertising and therefore require an Opt-out of Sale/Sharing), the consumer must complete two steps: (1) click the toggle button to the right of Advertising Cookies and (2) click the “Confirm My Choices” button.

The Order compares this opt-out process to the process for opting back into advertising cookies following a prior opt-out. There, the Agency alleged that if consumers return to the cookie management tool (also known as a consent management platform or “CMP”) after turning “off” advertising cookies, an “Allow All” choice appears. This is likely a standard configuration of the CMP that can be modified to match the toggle-and-confirm approach used for opt-out. Thus, the CPPA alleged, consumers need take only one step to opt back into advertising cookies when two steps are needed to opt out, in violation of an express requirement of the CCPA that opting out take no more steps than opting back in.

The Agency took issue with this because the CCPA requires businesses to implement request methods that provide symmetry in choice, meaning the more privacy-protective option (e.g., opting-out) cannot be longer, more difficult or more time consuming than the less privacy protective option (e.g., opting-in).

The Agency also addressed the need for symmetrical choice in the context of “website banners,” also known as cookie banners, pointing to an example of insufficient symmetry in choice cited in the CCPA regulations – i.e., using “‘Accept All’ and ‘More Information,’ or ‘Accept All’ and ‘Preferences’” – which “is not equal or symmetrical” because it suggests that the company is seeking and relying on consent (rather than opt-out) for cookies, and where consent is sought, acceptance and refusal must be equally easy to choose. The regulations further explain that “[a]n equal or symmetrical choice” in the context of a website banner seeking consent for cookies “could be between ‘Accept All’ and ‘Decline All.’” Of course, under the CCPA, consent is not required even for cookies that involve a Sale/Share, but the Agency is making clear that where consent is sought there must be symmetry in acceptance and denial of consent.

The CPPA’s Order also details other methods by which the company should modify its CCPA requests procedures including:

  1. separating the methods for submitting sale/share opt-out requests and sensitive PI limitation requests from verifiable consumer requests (e.g., requests to know, delete, and correct);
  2. including the link to manage cookie preferences within the Company’s Privacy Policy, Privacy Center and website footer; and
  3. applying global privacy control (“GPC”) preference signals for opt-outs to known consumers consistent with CCPA requirements.

Practical Takeaways

  • It is unclear whether the company configured the cookie management tool in this manner deliberately or if the choice of the “Allow All” button in the preference center was simply a matter of using a default configuration of the CMP, a common issue with CMPs that are built off of a (UK/EU) GDPR consent model. Companies should pay close attention to the configuration of their cookie management tools, including in both the cookie banner (or first layer), if used, and the preference center, and avoid using default settings and configurations provided by providers that are inconsistent with state privacy laws. Doing so will help mitigate the risk of choice asymmetry presented in this case, and the risks discussed in the following three bullets.
  • State privacy laws like the CCPA are not the only reason to pay close attention and engage in meticulous legal review of cookie banner and preference center language, and proper functionality and configuration of cookie management tools.
  • Given the onslaught of demands and lawsuits from plaintiffs’ firms under the California Invasion of Privacy Act (“CIPA”) and similar laws – based on cookies, pixels and other tracking technologies – many companies turn to cookie banner and preference center language to establish an argument for a consent defense and thereby mitigate litigation risk. In doing so, it is important to bear in mind the symmetry-of-choice requirements of state consumer privacy laws. One approach is to make clear that acceptance is of the site terms and privacy practices, which include use of tracking by the operator and third parties, subject to the ability to opt out of some types of cookies. This can help establish consent to the use of cookies by using the site after notice of cookie practices, while not suggesting that cookies are opt-in or creating a lack of symmetry in choice.
  • In addition, improper wording and configuration of cookie tools – such as indicating an opt-in approach (“Accept Cookies”) when cookies in fact already fired upon the user’s site visit, or labeling a button “Reject All” when functional and necessary cookies remain “on” after rejection – present risks under state unfair and deceptive acts and practices (UDAP) and unfair competition laws, and make the cookie banner notice defense to CIPA claims potentially vulnerable since the cookies fire before the notice is given.
  • Address CCPA requirements for GPC, linking to the business’s cookie preference center, and separating methods for exercising verifiable vs. non-verifiable requests. Where the business can tie a GPC signal to other consumer data (e.g., the account of a logged in user), it must also apply the opt-out to all linkable personal information.
  • Strive for clear and understandable language that explains what options are available and the limitations of those options, including cross-linking between the CMP for cookie opt-outs and the main privacy rights request intake for non-cookie privacy rights, and explain and link to both in the privacy policy or notice.
  • Make sure that the “Your Privacy Choices” or “Do Not Sell or Share My Personal Information” link gets the consumer to both methods. Also make sure the opt-out process is designed so that the number of steps required to make those opt-outs is not more than the number required to opt back in. For example, linking first to the CMP, which then links to the consumer rights form or portal, rather than the other way around, is more likely to avoid the additional-steps issue just discussed.
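On the GPC point above, honoring the signal is ultimately a technical implementation detail for the business’s web stack. The sketch below, in TypeScript, shows one way a server might detect and act on the signal; only the `Sec-GPC: 1` request header and the `navigator.globalPrivacyControl` property come from the GPC proposal itself, while the function names and the specific suppression actions are illustrative assumptions, not anything from the Order or any particular framework:

```typescript
// Minimal server-side sketch of honoring a Global Privacy Control (GPC) signal.
// Per the GPC proposal, supporting browsers send a `Sec-GPC: 1` request header
// (and expose `navigator.globalPrivacyControl` client-side).

type RequestHeaders = Record<string, string | undefined>;

/** True when the request carries a valid GPC opt-out signal.
 *  The proposal defines only the value "1" as expressing the preference. */
function hasGpcOptOut(headers: RequestHeaders): boolean {
  return headers["sec-gpc"] === "1";
}

/** Illustrative handler: treat the signal as an opt-out of sale/sharing and,
 *  for a known consumer (e.g., a logged-in user), record the opt-out against
 *  all linkable personal information, as the CCPA requires. */
function applyPrivacyPreferences(
  headers: RequestHeaders,
  userId?: string
): string[] {
  const actions: string[] = [];
  if (hasGpcOptOut(headers)) {
    // Suppress third-party advertising cookies for this session.
    actions.push("suppress-third-party-ad-cookies");
    if (userId) {
      // Known consumer: persist the opt-out beyond the current session.
      actions.push(`record-opt-out-of-sale-sharing:${userId}`);
    }
  }
  return actions;
}
```

The key design point this illustrates is that the GPC check should run before any advertising tags fire, and that for authenticated users the opt-out must propagate to the consumer’s profile rather than living only in the browser session.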

Failure to produce contracts with advertising technology companies

The Agency’s Order goes on to allege that the Company did not produce contracts with advertising technology companies despite collecting and selling/sharing PI via cookies on its website to/with these third parties. The CPPA took issue with this because the CCPA requires a written contract meeting certain requirements to be in place between a business and PI recipients that are a CCPA service provider, contractor or third party in relation to the business. We have seen regulators request copies of contracts with all data recipients in other enforcement inquiries.

Practical Takeaways

  • Vendor and contract management are a growing priority of privacy regulators, in California and beyond, and should be a priority for all companies. Be prepared to show that you have properly categorized all personal data recipients and have implemented and maintain processes to ensure proper contracting practices with vendors, partners and other data recipients. This should include a diligence and assessment process to ensure that the proper contractual language is in place with each data recipient based on the recipient’s data processing role. To put it another way, it may not be proper as to certain vendors to simply put in place a data processing agreement or addendum with service provider/processor language; for instance, vendors that process for cross-context behavioral advertising cannot qualify as a service provider/contractor. This determination is also necessary to correctly categorize cookie and other vendors as subject to opt-out or not.
  • Attention to contracting is particularly important under the CCPA because the required contract terms differ depending on whether the data recipient constitutes a “third party,” “service provider” or “contractor.” Further, in California, the failure to include all of the required service provider/contractor contract terms will convert the recipient into a third party and the disclosure into a sale.

Conclusion

This case demonstrates the need for businesses to review their privacy policies and notices, and to audit their privacy rights methods and procedures, to ensure compliance with applicable state privacy laws, which differ materially from state to state. We are aware of enforcement actions in progress not only in California but also in other states, including Oregon, Texas and Connecticut, and these states are looking for clarity as to what specific rights their residents have and how to exercise them. Further, regulators can be expected to look beyond obvious notice and rights request program errors – potentially in multi-state actions, which have become common in other consumer protection matters – to data knowledge and management, risk assessment, minimization, and purpose and retention limitation obligations. Compliance with those requirements goes beyond “check the box” attention to public-facing privacy program elements and calls for a mature, comprehensive and meaningful information governance program.

If you have any questions, or for more information, contact the authors or your SPB relationship attorney.

Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only and is not intended to constitute or be relied upon as legal advice.

As reported previously, the California Privacy Protection Agency (“CPPA”) closed the public comment period for its proposed cybersecurity audit, risk assessment and automated decision-making technology (“ADMT”) regulations (the “Proposed Regulations”) in late February. In advance of the CPPA’s April 4 meeting, the CPPA released a new draft of the Proposed Regulations, which proposed relatively minor substantive changes, but pushed back the dates for when certain obligations would become effective. The Agency’s Board met on April 4, 2025, to discuss the new proposals and comments received, as well as the potential for some very different alternatives, especially related to ADMT. Members of the CPPA Board debated the staff’s approach and ultimately sent the staff back to narrow the scope of the Proposed Regulations, clarify what was in and out of scope with more examples, and to further consider how to reduce the costs and burdens on businesses. While it is unclear exactly what staff will come back with, the alternatives discussed provide some hints on what a more constrained approach may look like.

Continue Reading The Future for California’s Latest Generation of Privacy Regulations is Uncertain

As we have previously detailed here, the latest generation of regulations under the California Consumer Privacy Act (CCPA), drafted by the California Privacy Protection Agency (CPPA), has advanced beyond public comment and is closer to becoming final. These include regulations on automated decision-making technology (ADMT), data processing evaluation and risk assessment requirements, and cybersecurity audits. Recently, Privacy World’s Alan Friel spoke at the California Lawyers Association’s Annual Privacy Summit at UCLA in Westwood, California (Go Bruins!) on the evaluation and assessment proposals. Separately, Privacy World’s Lydia de la Torre, a CPPA Board Member until recently, spoke on artificial intelligence laws and litigation. A transcript of Alan’s presentation follows:

Continue Reading Data Processing Evaluation and Risk Assessment Requirements Under California’s Proposed CCPA Regulations

After what seems like forever, the most recent (and last?) public comment period for the draft California Consumer Privacy Act (CCPA) regulations finally closed on February 19, 2025. (Read Privacy World coverage here and here.) 

Following an initial public comment period on an earlier draft, the formal comment period for the current version of the proposed CCPA regulations (Proposed Regulations) began on November 22, 2024. The Proposed Regulations include amendments to the existing CCPA regulations and new regulations on automated decision-making technology, profiling, cybersecurity audits, requirements for insurance companies and data practice risk assessments. The California Privacy Protection Agency (CPPA) may either submit a final rulemaking package to the California Office of Administrative Law (OAL, which confirms statutory authority) or modify the Proposed Regulations in response to comments received during the public comment period.

Continue Reading Light at the End of the Tunnel – Are You Ready for the New California Privacy and Cybersecurity Rules?

On January 23, 2025, President Trump issued a new Executive Order (EO) titled “Removing Barriers to American Leadership in Artificial Intelligence” (Trump EO). This EO replaces President Biden’s Executive Order 14110 of October 30, 2023, titled “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” (Biden EO), which was rescinded on January 20, 2025, by Executive Order 14148.

The Trump EO signals a significant shift away from the Biden administration’s emphasis on oversight, risk mitigation and equity toward a framework centered on deregulation and the promotion of AI innovation as a means of maintaining US global dominance.

Continue Reading Key Insights on President Trump’s New AI Executive Order and Policy & Regulatory Implications

Nineteen states have followed the lead of California and passed consumer privacy laws.  Three went into effect this year and eight will become effective in 2025.  The remainder become effective in 2026.  Charts at the end of this post track effective dates (see Table 1) and applicability thresholds (see Table 2).  While there are many similar aspects to these laws, they also diverge from each other in material ways, creating a compliance challenge for organizations. In addition, there are other privacy laws pertaining specifically to consumer health data,[1] laws specific to children’s and minors’ personal data and not part of a comprehensive consumer privacy law,[2] AI-specific laws,[3] or laws, including part of overall consumer privacy laws, regulating data brokers[4] that enterprises need to consider. 

A recent article published by the authors in Competition Policy International’s TechREG Chronicle details the similarities and differences between the 20 state consumer privacy laws, and a chart at the end of this post provides a quick reference comparison of these laws (see Table 3).

Continue Reading Are You Ready for The Latest U.S. State Consumer Privacy Laws?

As we predicted a year ago, the Plaintiffs’ Bar continues to test new legal theories attacking the use of Artificial Intelligence (AI) technology in courtrooms across the country. Many of the complaints filed to date have included the proverbial kitchen sink: copyright infringement; privacy law violations; unfair competition; deceptive acts and practices; negligence; right of publicity, invasion of privacy and intrusion upon seclusion; unjust enrichment; larceny; receipt of stolen property; and failure to warn (typically, a strict liability tort).

A case recently filed in Florida federal court, Garcia v. Character Techs., Inc., No. 6:24-CV-01903 (M.D. Fla. filed Oct. 22, 2024) (Character Tech) is one to watch. Character Tech pulls from the product liability tort playbook in an effort to hold a business liable for its AI technology. While product liability is governed by statute, case law or both, the tort playbook generally involves a defective, unreasonably dangerous “product” that is sold and causes physical harm to a person or property. In Character Tech, the complaint alleges (among other claims discussed below) that the Character.AI software was designed in a way that was not reasonably safe for minors, parents were not warned of the foreseeable harms arising from their children’s use of the Character.AI software, and as a result a minor committed suicide. Whether and how Character Tech evolves past a motion to dismiss will offer valuable insights for developers of AI technologies.

Continue Reading Artificial Intelligence and the Rise of Product Liability Tort Litigation: Novel Action Alleges AI Chatbot Caused Minor’s Suicide