As we predicted a year ago, the Plaintiffs’ Bar continues to test new legal theories attacking the use of Artificial Intelligence (AI) technology in courtrooms across the country. Many of the complaints filed to date have included the proverbial kitchen sink: copyright infringement; privacy law violations; unfair competition; deceptive acts and practices; negligence; right of publicity, invasion of privacy and intrusion upon seclusion; unjust enrichment; larceny; receipt of stolen property; and failure to warn (typically, a strict liability tort).

A case recently filed in Florida federal court, Garcia v. Character Techs., Inc., No. 6:24-CV-01903 (M.D. Fla. filed Oct. 22, 2024) (Character Tech), is one to watch. Character Tech pulls from the product liability tort playbook in an effort to hold a business liable for its AI technology. While product liability is governed by statute, case law, or both, the tort playbook generally involves a defective, unreasonably dangerous “product” that is sold and causes physical harm to a person or property. In Character Tech, the complaint alleges (among other claims discussed below) that the Character.AI software was designed in a way that was not reasonably safe for minors, that parents were not warned of the foreseeable harms arising from their children’s use of the Character.AI software, and that, as a result, a minor committed suicide. Whether and how Character Tech evolves past a motion to dismiss will offer valuable insights for developers of AI technologies.

Continue Reading Artificial Intelligence and the Rise of Product Liability Tort Litigation: Novel Action Alleges AI Chatbot Caused Minor’s Suicide

Join us for our Data Privacy Thought Leadership Series, where we dive into the latest trends shaping AI, marketing, and data monetization. With new state privacy laws, evolving regulatory requirements, and AI procurement challenges, this series offers practical insights to help you navigate the complex data privacy landscape.

Learn how to manage privacy assessments, stay compliant, and strengthen your data governance strategies to keep your organization ahead of the curve.


State Privacy Law Roundup

📅Thursday, October 3 | 9 – 10 a.m. PT

Speakers: Julia Jacobson, Elizabeth Berthiaume, Kyle Dull

In the first half of 2024, seven new state consumer privacy laws were enacted and three state consumer privacy laws became effective (plus one on October 1, 2024). Eight more state consumer privacy laws will become effective in 2025, and the California Privacy Protection Agency (CPPA) continued its rulemaking activity. Plus, 2024’s American Privacy Rights Act could gain traction now that Congress is back in session after the August recess. Join us on October 3rd for a rundown on where we are and what’s ahead for 2025 in consumer privacy.


AI, Marketing, and Data Monetization: Understanding and Managing Consents, Opt-Outs, and Other Regulatory Requirements

📅Thursday, October 10 | Noon – 1 p.m. PT

Speakers: Kyle Fath, Niloufar Massachi, Gicel Tomimbang

The convergence of industry trends, business needs, and significant technology advances – particularly in AI, marketing, and data monetization – has led many companies to collect more personal data and do more with it. This comes at a time when regulators are actively and aggressively pursuing privacy enforcement and more than twenty states have passed comprehensive privacy laws, most of which impose consent obligations, opt-out rights, and even outright prohibitions with respect to specific activities or certain types of data.

Please join us for a discussion on consent, opt-out, and other regulatory requirements that are relevant to AI, marketing, and data monetization. Our goal is for you to leave this session armed with information that will help you identify risks, inform business decisions and strategy, and serve as a thoughtful and resourceful partner to your organization’s GC/CLO, business stakeholders, and C-suite.

Attend virtually or join us at our LA Office for further discussion and lunch.


Privacy Rulemaking and Enforcement

📅Thursday, October 17 | 9 – 10 a.m. PT

Speakers: Alan Friel, Lydia de la Torre

Join Squire Patton Boggs Global Data Chair Alan Friel and Of Counsel Lydia de la Torre, a former CPPA Board member, for a discussion on the next generation of CCPA regulations – including employment, ADM/profiling/AI, and risk assessments and security audits – as well as enforcement priorities and cooperation among regulators in the states that have enacted consumer privacy laws.


Privacy Assessments: A Discussion of Requirements and Risks and a Mock Assessment Exercise

📅Tuesday, October 22 | Noon – 1 p.m. PT

Speaker: Kyle Fath

State privacy laws already require, or will soon require, companies to carry out assessments – referred to as data protection assessments, risk assessments, or DPIAs. These requirements extend to “high-risk” activities or those that involve a “heightened risk of harm,” including, in most cases, targeted advertising, the sale of personal data, and certain other processing of personal data. The Colorado Privacy Act and proposed regulations under the California Consumer Privacy Act (CCPA) lay out detailed content requirements that companies must follow, including requiring significant input from both internal teams and external stakeholders, such as vendors and other recipients of personal data. In addition to these prescriptive content requirements, businesses should be aware of regulators’ ability to request copies of assessments under the state privacy laws, and of the proposed CCPA regulations that would require businesses to file certifications of compliance and abridged versions of their assessments with the California Privacy Protection Agency.

Join us for this event where we will:

  • Discuss privacy assessment requirements and risks
  • Carry out a mock assessment exercise, walking through the completion of various aspects of a privacy assessment, focused on use cases involving targeted advertising and the sale of personal data
  • Touch on available resources that you can use to carry out assessments more efficiently and effectively

AI in Action: AI Procurement

📅Wednesday, October 30 | 9 – 10 a.m. PT

Speakers: Julia Jacobson, Charles Helleputte

The same thing, only different. Procuring AI presents many of the same challenges as procuring any other technology: an organization seeks to harness the full potential of the technology under a supplier contract that minimizes risks. Two key issues distinguish AI procurement: AI systems are designed to continually learn and improve, and the legal framework governing AI is dynamic. Tune in for a trans-Atlantic view on adapting technology and data governance risk management for AI procurement.

SPB’s Gabrielle Martin authored a piece on the recently passed Illinois HB 3773. The bill amends the Illinois Human Rights Act to protect employees against discrimination from, and require transparency about, the use of AI in employment-related decisions. Head over to Employment Law Worldview for an in-depth discussion of the bill, including a contrast with Colorado’s Artificial Intelligence Act: Illinois Enacts New AI Legislation, Joining Colorado as the Only States Regulating Algorithmic Discrimination in Private Sector Use of AI Systems (US).

In a move that will be unwelcome to plaintiffs’ lawyers, Illinois has enacted an amendment to its biometric privacy law – the Biometric Information Privacy Act (“BIPA”) – providing that when a private entity, in more than one instance, discloses, rediscloses, or otherwise disseminates the same biometric identifier or biometric information from the same person to the same recipient using the same method of collection, without the required prior notice and written release, it commits only a single violation for penalty calculation purposes, regardless of the number of times the data was disclosed, redisclosed, or otherwise disseminated. This will significantly reduce the potential damages and lower the settlement value of BIPA claims.

The amendment also provides that an e-signature satisfies the written-release requirement. “Electronic signature” means “an electronic sound, symbol, or process attached to or logically associated with a record and executed or adopted by a person with the intent to sign the record,” thus clarifying that online “clickwrap” releases suffice. This amendment follows previous failed attempts at similar reforms to stem the tide of BIPA class action litigation that has plagued companies that adopted fingerprint time clocks or other biometric fraud and security measures without strictly complying with BIPA. Colorado recently enacted a BIPA-like biometrics law, but, like every state except Illinois, it does not provide a private right of action, and the law can be enforced only by the state. However, states are active in enforcing their privacy laws, as illustrated by a recent Texas settlement with a social media company over biometric consent claims that included a nine-figure civil penalty payment.

For more information, contact the author or your SPB relationship lawyer.


Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only and is not intended to constitute or be relied upon as legal advice.

Regulators in states without omnibus state privacy laws, like New York, are staking their claim over privacy regulation and enforcement. After months of investigating the deployment of tracking technologies and privacy controls on various websites, the New York State Attorney General (“NY AG”) published its guidance, Website Privacy Controls: A Guide for Business. The NY AG also published a companion guidance for consumers, A Consumer Guide to Web Tracking, which provides a high-level overview of how websites track consumers and what steps consumers can take to protect their privacy. Stay tuned for potential enforcement actions and big-figure settlements. Will New York follow Texas in this regard?

NY AG Investigation and Findings

Tracking technologies, like cookies and tags (i.e., pixels), are used by businesses to collect and assess information about how individuals interact with the business’ website or mobile app. While tracking technologies can provide valuable insights for businesses, they also raise privacy concerns: the collection, selling, and sharing of data; the creation of detailed profiles about individuals for use in targeted advertising; and cross-site tracking that builds a comprehensive picture of an individual’s interests and behavior without the individual’s knowledge or consent. The Federal Trade Commission (“FTC”) is attempting Magnuson-Moss rulemaking under Section 5 of the FTC Act to address these practices, which it calls “commercial surveillance.”

Continue Reading Businesses Beware: New York Eyeing Privacy Regulation and Enforcement Even Absent Omnibus State Privacy Law

We reported earlier that at the July 16th California Privacy Protection Agency (CPPA) Board meeting, the Board would consider a rulemaking package that staff prepared pursuant to the Board’s vote and direction in March. Copies of those documents are here. At the July 16th Board meeting, staff presented on the package and reported that it was still working on the required Standardized Regulatory Impact Assessment (SRIA), which must be approved by the California Department of Finance prior to publication for public comment and the commencement of the formal rulemaking process. The Board also debated the substance of the draft rules but did not vote on them. The Board asked staff to make clear certain alternatives to the drafts in the call for public comments, most notably whether risk assessments related to processing that results in consequential decision-making should be required for all such processing or only for processing using automated decision-making (ADM) technologies. Board Member Mactaggart raised several concerns about the current drafts, including:

Continue Reading California Privacy Regs Advance But Vote on Drafts Delayed

As we reported in our post about the Minnesota Consumer Data Privacy Act, the Rhode Island Data Transparency and Privacy Protection Act (RI-DTPPA) was passed by the state legislature on June 13th.  Governor McKee neither signed nor vetoed it but instead transmitted it to the Rhode Island Secretary of State, i.e., it became law without the Governor’s signature.

1. WHEN IS RI-DTPPA IN FORCE?

The RI-DTPPA effective date is January 1, 2026 – the same date as the consumer privacy laws in Indiana and Kentucky. 

Since Vermont’s consumer privacy law was vetoed, the RI-DTPPA makes 20 state consumer privacy laws.  The 19 state consumer privacy laws preceding RI-DTPPA (collectively, the State Consumer Privacy Laws) are in force as follows.

State | State Consumer Privacy Law Title | Effective Date
California | California Consumer Privacy Act (CCPA) | January 1, 2020; CCPA Regulations effective January 1, 2023
Colorado | Colorado Privacy Act | July 1, 2023
Connecticut | Connecticut Personal Data Privacy and Online Monitoring Act | July 1, 2023
Delaware | Delaware Personal Data Privacy Act | January 1, 2025
Florida | Florida Digital Bill of Rights | July 1, 2024
Indiana | Indiana Consumer Data Protection Act | January 1, 2026
Iowa | Iowa’s Act Relating to Consumer Data Protection | January 1, 2025
Kentucky | Kentucky Consumer Data Privacy | January 1, 2026
Maryland | Maryland Online Data Privacy Act | October 1, 2025
Minnesota | Minnesota Consumer Data Privacy Act | July 31, 2025
Montana | Montana Consumer Data Privacy Act | October 1, 2024
Nebraska | Nebraska’s Data Privacy Act | January 1, 2025
New Hampshire | Act Relative to the Expectation of Privacy | January 1, 2025
New Jersey | New Jersey Data Protection Act | January 15, 2025
Oregon | Oregon Consumer Privacy Act | July 1, 2024 (July 1, 2025, for in-scope non-profit organizations)
Tennessee | Tennessee Information Protection Act | July 1, 2025
Texas | Texas Data Privacy and Security Act | July 1, 2024
Utah | Utah Consumer Privacy Act | December 31, 2023
Virginia | Virginia Consumer Data Protection Act | January 1, 2023
Continue Reading Rhode Island Makes it an Even 20

In a final push before adjourning for the summer, state legislators across the country contemplated consumer privacy laws, and three legislatures made it to the finish line.  Minnesota’s state legislature passed the Minnesota Consumer Data Privacy Act on May 19th as part of an appropriations bill, which Minnesota’s governor signed on May 24th.  Of the other two, one is pending gubernatorial action, and the other was vetoed.

The Rhode Island Data Transparency and Privacy Protection Act (RI-DTPA) was passed by the state legislature on June 13th.  Before RI-DTPA becomes law, Governor McKee must sign it, take no action, or veto it.  If signed, RI-DTPA takes effect on January 1, 2026, like the Indiana Consumer Data Protection Act and Kentucky Consumer Data Privacy.

We are not, however, making assumptions about RI-DTPA’s passage.  This post was originally planned to cover the Minnesota Consumer Data Privacy Act and the Vermont Data Privacy Act, not the RI-DTPA.  On June 13th (the same day that RI-DTPA was passed), Vermont’s Governor Phil Scott vetoed the Vermont Data Privacy Act.  In his letter to Vermont’s General Assembly, Governor Scott noted that the Vermont Data Privacy Act created “big and expensive new burdens and competitive disadvantages for the small and mid-sized businesses Vermont communities rely on.”  He also noted that the private right of action is “a national outlier, and more hostile” than any other state privacy law, notwithstanding its limited scope and sunset.  He raised the possibility of a First Amendment challenge to the Age-Appropriate Design Code (Section 6), noting that “similar legislation in California has already been [preliminarily enjoined] for likely First Amendment violations.” (See here.)  A veto override was not successful.

The RI-DTPA already faces opposition from privacy advocacy organizations claiming that RI-DTPA is too weak (see, e.g., here).  Advertising associations also reportedly oppose RI-DTPA.  Nonetheless, we have highlighted some key elements of RI-DTPA in this post so you can decide for yourself, together with answers to FAQs about the Minnesota Consumer Data Privacy Act (MN-CDPA) and how it is similar to and different from the other state consumer privacy laws.

Continue Reading Minnesota Makes 19: Will Rhode Island’s Privacy Law Replace Vermont’s Vetoed Privacy Law as #20?

Please join us in New York, NY (or virtually) for the Association of National Advertisers (ANA) Law 1-Day Conference on June 26th. Team SPB will cover a variety of privacy topics affecting the advertising and marketing industry, including consumer privacy compliance, data assessments and advertising enforcement actions and class actions. Register soon because in-person space is limited.   

Team SPB panelists are Alan Friel, Julia Jacobson, Marisol Mork, Kristin Bryan, Stacy Swanson, Kyle Dull, and Sasha Kiosse, joined by industry leaders from Ankura Consulting Group, BECU, Curacity, and TikTok.

Use the code LAWCODE24 to receive complimentary registration.

When: June 26, 2024, 11:30 a.m. – 3:45 p.m. EST
Networking reception to follow, co-sponsored by Squire Patton Boggs and Ankura!

Where: ANA Headquarters, 155 E 44th Street, 8th Floor, New York, NY 10017 – or – Virtual
Continue Reading ANA Law One-day Conference – Join Us June 26 in New York City

Since its enactment in 1998, the Children’s Online Privacy Protection Act (COPPA) has been the cornerstone of protecting the personal data of minors under the age of 13 in the United States. COPPA imposes various requirements, including parental consent, notice and transparency, and data minimization, among other things, on online services that are “directed to children [under 13]” or are “mixed audience,” as well as on online services that have actual knowledge that they have collected personal data from a child under 13 online.

Many organizations that previously did not have to worry about COPPA, or COPPA-based standards as applied in state consumer privacy laws, should be aware of the trend in state privacy legislation to expand restrictions and obligations beyond COPPA’s under-13 standard to minors who are at least 13 and under the age of 18 (“Teens”). This trend began in 2020 with the California Consumer Privacy Act (CCPA), which requires consent for the “sale” of personal information of consumers at least 13 but under 16 years of age (the California Privacy Rights Act expanded that requirement to “sharing” as well). Consent must be given by the Teen or, if the consumer is under age 13, by the parent, using COPPA verification standards. Other relevant aspects of this trend, of which organizations should be aware, include:

Continue Reading Trending: Teens’ Data Subject to Heightened Restrictions Under Ten (and Counting?) State Privacy Laws