Today, the Illinois Supreme Court resolved the hotly disputed question of whether a one-year or five-year statute of limitations period applies to claims brought under the Biometric Information Privacy Act (“BIPA”). In Tims v. Black Horse Carriers, Inc., the Court conclusively held that a five-year statute of limitations period applies to BIPA claims, expanding the timeframe for a plaintiff to bring a claim in a plaintiff-friendly ruling.  

The Tims Decision 

Tims initially brought claims under Sections 15(a), 15(b), and 15(d) of BIPA against his former employer, Black Horse Carriers, Inc. Black Horse moved to dismiss the complaint as untimely, arguing that because the text of BIPA does not contain a statute of limitations period, the one-year statute of limitations for privacy actions under 735 ILCS 5/13-201 should apply. Plaintiff responded that the five-year catchall statute of limitations for civil actions under 735 ILCS 5/13-205 should apply instead. In a closely watched decision, the First District split the difference, holding that Section 13-201 applied to claims brought under Sections 15(c) and (d), while Section 13-205 applied to BIPA actions under Sections 15(a), (b), and (e).

On appeal, the Illinois Supreme Court affirmed in part and reversed in part, holding that the five-year statute of limitations in Section 13-205 applies to all BIPA claims. The Court agreed with the plaintiff’s assertion that the five-year limitations period should apply where a statute does not contain its own limitations period. Observing that Section 13-201 governs actions for the “publication of matter violating the right of privacy” (emphasis added), the Court looked to the plain text of BIPA and confirmed that Sections 15(a), (b), and (e) do not concern publication in any respect. Although the Court acknowledged that the terms “sell,” “lease,” “trade,” “disclose,” “redisclose,” and “disseminate” in Sections 15(c) and (d) could potentially be read as involving publication, it found that it would be “best” to apply the five-year statute of limitations to the entire statute in light of the intent of the legislature, the purposes of BIPA, and the absence of a limitations period in the law. The Court found that this approach would also further certainty and predictability in BIPA actions.

Analysis & Takeaways 

Expanded Scope of Potential Liability 

With the Tims decision, plaintiffs now have five years from the date of alleged non-compliance with Illinois’s biometric statute to file suit. More importantly, in addition to the extremely low bar for establishing cognizable claims in BIPA litigation set by the Illinois Supreme Court in Rosenbach v. Six Flags Ent. Corp., 2019 IL 123186, 129 N.E.3d 1197 (Ill. 2019), the Tims opinion now allows plaintiffs in BIPA disputes to broaden putative classes. Classes may now comprise all individuals who allegedly had their privacy rights violated due to BIPA non-compliance over a five-year period dating back from the time suit is filed, a significant expansion for BIPA putative class actions.

Continued Trend of Liberal Interpretations of BIPA’s Statutory Text 

As noted in Privacy World’s 2022 Biometrics and Artificial Intelligence Year-in-Review Report, one of the most significant trends in BIPA class action litigation last year was the broad, expansive interpretation of key aspects of Illinois’s biometric privacy statute in a number of BIPA decisions by both state and federal courts. The Illinois Supreme Court’s decision in Tims continues this trend and pushes the contours of Illinois’s biometric statute outward even further. Of note, the Tims Court readily acknowledged that Section 15(c) and (d) claims could arguably involve activities properly characterized as a “publication,” which would make Illinois’s shorter, one-year limitations period applicable. The Court nonetheless applied the longer, five-year period, reasoning that doing so was necessary to best safeguard the privacy interests of Illinois residents that BIPA was enacted to protect.

Importantly, the reasoning set forth in Tims demonstrates how courts heavily favor plaintiff-friendly, liberal interpretations of BIPA’s statutory text, often reasoning that these interpretations align with the stated intent and purposes of Illinois’s biometrics statute. Tims serves as a cautionary tale and a reminder of the significant risks and liability exposure associated with BIPA non-compliance. Moreover, the Illinois Supreme Court’s reliance on BIPA’s statutory intent and purposes as its main basis for applying a more plaintiff-friendly limitations period will likely be invoked by plaintiffs in subsequent class actions to support arguments designed to expand the contours and scope of Illinois’s biometrics statute even further as it relates to other key, unsettled aspects of the law.

The Illinois Supreme Court May Soon Expand Liability Exposure Even Further in Resolving the Question of Claim Accrual in BIPA Class Litigation 

Beyond Tims, the Illinois Supreme Court is set to render another much-anticipated opinion in Cothron v. White Castle Sys., No. 128004 (Ill. Sup. Ct.) in the near future, which will definitively resolve the currently unsettled issue of claim accrual in BIPA litigation. Depending on how the Court answers the question of whether every discrete failure to comply with BIPA’s requirements amounts to a separate, independent violation of the statute, the scope of liability exposure and damages underlying BIPA class actions may further increase for companies that leverage the benefits of biometrics in their day-to-day operations.

What to Do Now: Practical Compliance Tips

The forthcoming Cothron opinion will offer much-needed clarity regarding the scope of statutory damages at issue for purported BIPA violations. However, if the Illinois Supreme Court rejects a “one and done” theory of accrual, and instead applies the continuing violation theory to BIPA claims, the overall scope of potential damages—which is already significant—will further expand.

In the interim, companies should work closely with experienced biometric privacy counsel to conduct a thorough audit of their current compliance practices and to identify and remediate any gaps in advance of the Cothron decision and any resulting expansion in liability exposure. In particular, companies should assess their current compliance programs to ensure they encompass the following practices:

  • Maintain a Public Privacy Policy: Maintain a publicly available privacy policy which, at a minimum, establishes a retention schedule and guidelines for permanently destroying biometric data when the initial purpose for collecting or obtaining such data has been satisfied.
  • Permanently Destroy Biometric Data in a Timely Manner: Maintain practices and protocols to ensure that biometric data is permanently destroyed within BIPA’s mandated timeframes. As a general rule of thumb, biometric data should be permanently destroyed when it is no longer needed for the initial purpose for which it was originally collected (even where compliance with BIPA is not required). 
  • Supply Pre-Collection Notice: Provide notice to all individuals prior to the time biometric data is collected which, at a minimum, informs the individual: (1) that biometric data is being collected/stored; (2) the specific purpose for collecting the individual’s biometric data; and (3) the period of time over which the company will use and store such biometric data before it is permanently destroyed.
  • Obtain Pre-Collection Consent: Obtain consent from all individuals prior to the time biometric data is collected, allowing the company to collect, use, and store their biometric data, as well as permitting the company to share/disclose such data with the company’s vendors and service providers.
  • Maintain Security Measures to Safeguard Biometric Data: Store, transmit, and safeguard biometric data using reasonable security measures designed to prevent unauthorized access, disclosure, or acquisition of such data. Two security protocols that all companies should consider implementing whenever feasible are encryption and multi-factor authentication, both of which are extremely effective in safeguarding all types of sensitive personal information. At the same time, only those individuals with a business need for biometric data should be afforded access to such data. 
  • Strictly Prohibit Sales and Any Other Form of Profiting From Biometric Data: Strictly bar employees and vendors from selling or otherwise profiting from biometric data, which can be accomplished through the implementation and enforcement of an internal biometric data policy.
  • Vendor Compliance: Ensure that all of the company’s vendors and service providers are also fully compliant with the mandates of Illinois’s biometric privacy statute. 
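The destruction-timing rule behind the second bullet above can be sketched in code. This is a minimal, illustrative sketch only, not legal advice: the three-year outer limit reflects BIPA Section 15(a) (destruction when the initial purpose is satisfied or within three years of the individual’s last interaction with the company, whichever occurs first), and the function name and parameters are our own for illustration.

```python
from datetime import date, timedelta

# Illustrative only: BIPA Section 15(a) requires destruction when the initial
# purpose has been satisfied OR within 3 years of the individual's last
# interaction with the company, whichever occurs first.
BIPA_MAX_RETENTION = timedelta(days=3 * 365)

def destruction_due(purpose_satisfied: bool, last_interaction: date,
                    today: date) -> bool:
    """Return True if biometric data should already have been destroyed."""
    if purpose_satisfied:
        return True
    return today - last_interaction >= BIPA_MAX_RETENTION

# Example: purpose not yet satisfied, but last interaction was four years ago
print(destruction_due(False, date(2019, 2, 1), date(2023, 2, 3)))  # True
```

A real retention program would, of course, track these dates per data subject and per purpose; the point of the sketch is simply that the statute imposes two independent triggers and the earlier one controls.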

Our Squire Patton Boggs lawyers and contributors are well known for their thought leadership, here on Privacy World, as well as on other platforms and at renowned conferences and events. Two members of our team, Julia Jacobson and Kyle Fath, were selected to speak at the International Association of Privacy Professionals (IAPP) Global Privacy Summit in April, where over ten thousand privacy lawyers and professionals will converge in Washington, DC for the multi-day conference. This continues the tradition of speaking engagements by our team at IAPP conferences, where both Kyle and Julia, as well as Alan Friel (Partner, Chair of Global Data Group), Charles Helleputte (Partner, Chair of European Data Practice), Diletta De Cicco (Counsel, Brussels), and Stephanie Faber (Of Counsel, Paris), have spoken in the past.

The start of a new year always brings New Year’s resolutions. If privacy by design is one of yours (just months after the Irish watchdog announced a €265 million fine for a breach of this concept, it seems reasonable to have it on your radar), 2023 is off to a good start with a new “privacy by design” international standard. On January 31, 2023, the International Organization for Standardization (ISO) published ISO 31700, officially titled “Consumer protection – Privacy by design for consumer goods and services.” It consists of two parts: a list of requirements (31700-1) and use cases (31700-2). The standard is due to enter into force on February 8.

The new standard bears an obvious resemblance to “data protection by design and by default” – a concept that is well known to companies striving to comply with (and operationalize the requirements of) the General Data Protection Regulation (GDPR). It is, therefore, worth exploring whether the two have anything in common and, if so, whether the new regime brings any good news to those dealing with the GDPR.

A Quick Overview

ISO is a global network of national bodies tasked with setting standards in different areas to address, for example, technology or societal issues. In essence, an ISO standard is an internationally recognized way of doing “things.” Some standards allow businesses to (voluntarily) certify as operating at that level if they meet the prescribed specifications and pass appropriate reviews.

On the other hand, privacy by design is a concept calling for the integration of privacy into the design and architecture of systems and business practices. Initially developed in 2009 by the Information and Privacy Commissioner of Ontario, it became an express requirement under EU law following the adoption of the GDPR. Article 25 GDPR requires all data controllers to embed data protection by design (and by default, which is a complementary concept) into their processes from the design stage and throughout their life cycle.

“Data protection by design” means that controllers must apply appropriate technical and organizational measures to their processing of personal data. There is no exhaustive list of measures, and they may vary depending on the available technology, circumstances of the processing, costs and risk assessment. The bottom line is that any design must respect data protection principles and rights. “Data protection by default” builds upon this requirement and prevents controllers from using default settings that result in “excessive” processing. Further guidance on how to operationalize these obligations is provided by the European Data Protection Board (EDPB) guidelines. For example, in relation to transparency, EDPB clarifies that this would entail clear and plain language, accessibility, timeliness, etc.

Anything New?

ISO 31700 lays down 30 requirements for embedding data privacy into consumer products and services. Like the EDPB’s approach, it does not specify thresholds or steps but keeps the ruleset high-level and provides examples for better understanding.

The standard revolves around a few pillars, each consisting of several privacy requirements. For example, the “consumer communication” pillar instructs on how to provide consumers with privacy information, respond to inquiries and complaints or prepare a data breach communication. The “risk management” pillar addresses processes such as privacy risk assessments or third-party due diligence. Further, there is an entire pillar dedicated to “privacy controls” such as data breach management. ISO 31700 also covers many other requirements, including the enforcement of consumers’ privacy rights, the assignment of relevant roles and authorities and allowing for the determination of consumer privacy preferences.

ISO 31700 is not directly linked to the EU data protection framework, but some overlaps do exist. For example, it adopted a “GDPR-ish” definition of personal information, and many of its requirements overlap with those of the GDPR. The obligations to provide privacy information and to ensure the enforcement of privacy rights are just two examples. Also, the standard’s sources clearly reveal that both the GDPR and the EDPB’s guidelines were used in the preparation of ISO 31700.

So, what is the relationship between ISO 31700 and the GDPR’s privacy by design and by default requirement? For now, officially, none. Conformity with the ISO standard does not equate to complying with the GDPR (and vice versa), and businesses looking to adhere to the GDPR must still observe its requirements separately.

But …

By all means, ISO 31700 should prove helpful for organizations. It can serve as an inspiration for those developing technical and organizational measures and safeguards under the GDPR, a sort of “cheat sheet” with guidance and ideas. Also, the EDPB itself has encouraged controllers to make use of certifications and codes of conduct available on the market, which suggests that companies relying on international standards may find it easier to demonstrate their compliance to authorities or build trust with consumers, which could also prove to be a strategic advantage over competitors. Finally, it is worth remembering that the GDPR foresees the introduction of special certification mechanisms according to GDPR criteria. In providing guidance on this topic, the EDPB accepted that certification criteria may be drawn up in observance of ISO standards. There is certification for (almost) everything; here is another set of standards that could serve as a relevant compliance benchmark.

While Madison Square Garden might normally make headlines for musical artists or sporting events, the venue’s parent company, MSG Entertainment, has been in the spotlight following media and regulator attention regarding its use of facial recognition technology to ban certain individuals from its venues. Read on to learn more about the controversy and its implications for other uses of facial recognition technology.

First, some background.  MSG Entertainment’s use of biometric facial recognition came under scrutiny last December, when an attorney employed by a law firm engaged in litigation against MSG Entertainment was denied entry to the Radio City Christmas Spectacular, which she was attending with her child. She was stopped by the venue’s security staff, who knew her name and the firm she was associated with, and who purportedly informed her that she had been identified by the venue’s facial recognition system as part of an “attorney exclusion list.”

This was not the only instance in which an attorney was seemingly denied entry based solely on being personally engaged, or employed by a firm engaged, in litigation against MSG. Based on several news reports, the company has a policy of excluding from its venues not only attorneys representing parties engaged in litigation against MSG Entertainment, but also all attorneys employed by the firms handling those litigations, and uses software to identify those attorneys from their photos on the firms’ websites. For example, a Long Island attorney was banned from MSG before a Knicks-Celtics game after her law firm filed a suit on behalf of a fan who fell from a skybox at MSG during a Billy Joel concert, and another attorney was stopped from entering MSG for a Rangers game because the attorney was employed by a firm suing MSG.

At least two law firms filed suit against MSG Entertainment in December 2022 over the ban. Although these suits did not raise biometric or AI-based claims, they alleged violations of New York state civil rights laws and prima facie tort claims and requested declaratory judgment in addition to a temporary restraining order, preliminary injunction, and permanent injunction. The ban has also drawn judicial criticism, including from Chancellor Kathaleen McCormick of the Delaware Court of Chancery, who remarked that MSG Entertainment’s letter reinforcing the ban was “the stupidest thing [she’d] ever read.”

The debate over MSG Entertainment’s facial recognition software illustrates the divide between consumer perception of using facial recognition for authentication or verification purposes, which has generally become more accepted, versus using such technology for real-time surveillance or identification outside of the context of express consumer consent.

This shifting public perception of the various purposes for which facial recognition may be utilized is also congruent with recent legislative activity.  For example, in response to the recent events at MSG Entertainment’s venues, a bill was introduced in the New York state legislature to add “sporting events” to the list of public places of entertainment that are barred from refusing entry to individuals with a valid ticket.  New York State Senator Brad Hoylman-Sigal condemned MSG Entertainment’s policy, stating, “MSG claims they deploy biometric technology for the benefit of public safety when they remove sports fans from the Garden. This is absurd given that in at least four reported cases, the patrons who were booted from their venues posed no security threat and instead were lawyers at firms representing clients in litigation with MSG.”

Although the bill does not specifically address the use of facial recognition technology, it would nonetheless work to limit the ways in which such technology is used.  Similarly, New York Attorney General Letitia James penned a letter to MSG Entertainment warning that the ban could violate anti-discrimination laws and could chill attorneys from taking on certain types of litigation against the company.

Biometric technology has been a focus of state regulation for some time, most significantly with Illinois’ Biometric Information Privacy Act (“BIPA”); Texas’ Capture or Use of Biometric Identifier Act (“CUBI”); and Washington’s HB 1493. While BIPA is considered the most stringent of the three state statutes, each imposes certain requirements relating to notice, consent, and data security measures for biometric information or identifiers. BIPA also contains a private right of action permitting recovery of statutory damages, which has made it a frequent target for class action litigation. New biometric privacy bills have also recently been introduced in New York, Hawaii, Mississippi, and Maryland, which would similarly regulate the collection and use of all forms of biometric data.

Lawmakers have also enacted legislation at a local level to govern the use of facial recognition technology and, more specifically, to thwart potential improper uses of the technology. In late 2020, Portland, Oregon became the first U.S. jurisdiction to ban the use of facial recognition by the private sector, clarifying in the prefatory materials for the ordinance that lawmakers were primarily concerned with the use of facial recognition for surveillance purposes within physical spaces and its corresponding potential risks for misidentification and misuse. New York City has already enacted a municipal-level ordinance regulating the use of biometrics-powered technologies by “commercial establishments.”

As a result of certain high-profile incidents, including those discussed above related to MSG Entertainment, more states may be inclined to enact biometric privacy bills modeled after BIPA (or to take a more tailored approach that still provides certain protections addressing biometric privacy concerns). These developments may also encourage lawmakers in other jurisdictions who are contemplating regulating the use of this technology, but who have not yet introduced legislation and who lack an appetite for an outright ban, to push forward with additional biometric regulations.

MSG Entertainment is due to respond to Attorney General Letitia James’s Letter by February 13, 2023 “to state the justifications for the Company’s Policy and identify all efforts you are undertaking to ensure compliance with all applicable laws and that the Company’s use of facial recognition technology will not lead to discrimination.” For updates on MSG Entertainment’s response and other developments relating to facial recognition software in New York, Privacy World will be there to keep you in the loop.

On February 15th, our partner Scott Warren (Tokyo/Shanghai) will be speaking at the Global Legal ConfEx in Singapore on the topic of Data Privacy v. Data Security: Understanding the Distinction in Defending Your Data. The session will explore the need for both in order to create a robust and compliant data policy within your organization and to protect against cyberattacks. He will also moderate the final Leadership Round Table panel, addressing a number of pressing legal technology issues currently challenging companies.

Further event details can be found online at this link. Scott has a few free passes available, so if you would like to attend, please reach out to him.

In case you missed it, below are recent posts from Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

Privacy World’s Kristin Bryan talks to Bloomberg Law on the Supreme Court’s In re Grand Jury Dismissal | Privacy World

California AG Announces CCPA Compliance Sweep of Mobile Apps ahead of Data Privacy Day | Privacy World

California Federal Court Dismisses Session Replay Litigation Following Ninth Circuit Remand, Leaves Open Future Wiretap Claims | Privacy World

Kick Start Your Data Inventory Project in 7-Steps | Privacy World

Privacy World 2022 Year in Review: Biometrics and AI | Privacy World

2023 Global Legislative Predictions – Belgium | Privacy World

Potential Rulemaking on the Horizon: CPPA Board Announces February Public Meeting | Privacy World

Supreme Court Dismisses Case Involving Attorney-Client Privilege Issues, Notwithstanding Oral Argument Occurred Two Weeks Ago, With Potential Impact for Privacy Litigations Going Forward | Privacy World

California Federal Court Dismisses GPS Data Tracking Privacy Class Action in Ruling of First Impression For CIPA Claims Involving Devices Installed by Car Manufacturers | Privacy World

ABC News Interviews Kristin Bryan In Article on Biometric Privacy Litigation | Privacy World

Scott Warren Speaking at Tokyo Summit 2023 | Privacy World

Top Ten Privacy World Posts of 2022 | Privacy World

Are You Ready for the 2023 Privacy Laws? | Privacy World

Privacy World Authors Recognized as Lexology Legal Influencers in the Technology, Media and Telecommunications (TMT) Category | Privacy World

SEC Sues Law Firm for Refusing to Disclose List of Clients Affected by Cyberattack | Privacy World

LinkedIn’s Data Scraping Battle with hiQ Labs Ends with Proposed Judgment | Privacy World

SEC Accused of Violating FOIA Deadlines for Documents on Improper Database Access | Privacy World


2022 saw cases continue to be filed under the California Consumer Privacy Act (“CCPA”), although the number of CCPA-based claims declined, perhaps reflecting the plaintiffs’ bar’s increasing reliance on negligence and tort-based privacy claims concerning a defendant’s alleged failure to maintain “reasonable security.” Read on for Privacy World’s highlights of the year’s most significant events concerning the CCPA, as well as our predictions for what 2023 may bring.


The CCPA went into effect on January 1, 2020, with the vast majority of its provisions applying to entities that qualify as “businesses.”

As a recap, what entities qualify as a business under the CCPA? The statute defines a business as a for-profit, private entity that (1) collects “personal information”, (2) determines the purposes and means of processing that personal information, (3) does business in California, and (4) meets certain revenue thresholds (>$25 million global gross revenue annually) and/or data collection/selling/sharing thresholds.
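The definition above can be restated as a simple predicate. This is an illustrative sketch of the statutory elements only (the function and parameter names are our own, and the thresholds shown reflect the CCPA as amended by the CPRA); a real applicability analysis requires counsel, not a function call.

```python
# Illustrative sketch of the CCPA "business" definition (Cal. Civ. Code
# § 1798.140), with thresholds as amended by the CPRA. Not legal advice.
REVENUE_THRESHOLD = 25_000_000   # > $25M annual global gross revenue
CONSUMER_THRESHOLD = 100_000     # buys/sells/shares PI of 100k+ consumers/households

def is_ccpa_business(for_profit: bool, collects_pi: bool,
                     determines_processing: bool, does_business_in_ca: bool,
                     gross_revenue: float, consumers_or_households: int,
                     half_revenue_from_selling_sharing: bool) -> bool:
    # All four baseline elements must be present...
    if not (for_profit and collects_pi and determines_processing
            and does_business_in_ca):
        return False
    # ...plus at least one of the three alternative thresholds.
    return (gross_revenue > REVENUE_THRESHOLD
            or consumers_or_households >= CONSUMER_THRESHOLD
            or half_revenue_from_selling_sharing)

# Example: a for-profit CA company with $30M revenue qualifies
print(is_ccpa_business(True, True, True, True, 30_000_000, 0, False))  # True
```

Note how the baseline elements are conjunctive while the revenue/data thresholds are disjunctive; a small data broker with modest revenue can still be a covered “business” if it crosses the consumer-volume or revenue-share prongs.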

In addition to the numerous compliance obligations* the CCPA imposes, covered businesses are also subject to the law’s limited private right of action for certain security breaches.

*While the majority of this post focuses on the private right of action and enforcement-related issues, for those interested in the CCPA’s compliance obligations, the effectiveness of the California Privacy Rights Act (“CPRA”),* which substantially amends the CCPA and became effective as of Jan. 1 this year, the applicability of the CCPA to human resources and business-to-business data, and information on other state privacy laws, please see our recent post Are You Ready for the 2023 Privacy Laws? *References to CPRA in the remainder of this article mean the CCPA as amended by the CPRA, unless otherwise indicated.

Back to the private right of action, Section 1798.150(a)(1) of the CCPA provides a private right of action to “[a]ny consumer whose nonencrypted and nonredacted personal information … is subject to an unauthorized access and exfiltration, theft, or disclosure” due to a business failing to satisfy “the duty to implement and maintain reasonable security procedures and practices….” (emphasis supplied).

Damages available for a private right of action under Section 1798.150(a)(1) include a statutory amount of between $100 and $750 “per consumer per incident or actual damages, whichever is greater”, as well as injunctive or declaratory relief and “any other relief the court deems proper” (emphasis supplied).
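The statutory damages formula lends itself to a quick back-of-envelope exposure calculation. The sketch below is illustrative only (the function name and defaults are our own); it simply encodes the “greater of statutory or actual damages” rule from Section 1798.150(a)(1).

```python
def ccpa_statutory_exposure(consumers: int, incidents: int = 1,
                            per_violation: float = 750.0,
                            actual_damages: float = 0.0) -> float:
    """Greater of statutory damages ($100-$750 per consumer per incident)
    and actual damages, per Cal. Civ. Code § 1798.150(a)(1). Illustrative."""
    assert 100.0 <= per_violation <= 750.0, "statutory range is $100-$750"
    return max(consumers * incidents * per_violation, actual_damages)

# Example: a single incident affecting 10,000 consumers at the statutory maximum
print(ccpa_statutory_exposure(10_000))  # 7500000.0
```

Even at the statutory floor of $100 per consumer, class-wide exposure scales linearly with class size, which is why the private right of action drives so much of the litigation discussed below.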

CCPA Litigation Activity in 2022

Since the CCPA came into effect, nearly 300 cases have been filed by plaintiffs alleging violations of the statute.  The majority of these have been filed in California federal court (the Northern and Central Districts of California being the most favored jurisdictions for such filings), with some also brought in California state court and in other jurisdictions.

Although the number of CCPA filings declined from 2021, this may be due to the plaintiffs’ bar shifting towards alleging negligence and tort-based privacy claims in the wake of a data event.  This can be explained in part by the fact that such claims are typically (although not always) less burdensome to plead in a manner that survives the motion to dismiss stage.  By contrast, based at least on rulings thus far, courts appear to have construed the CCPA’s limited private right of action narrowly.

Courts have consistently dismissed CCPA claims when it is clear from the face of the complaint that the plaintiff’s allegations do not concern a security breach, as required to plead a civil cause of action under the CCPA.  Additional rulings this year reinforced the temporal requirements of the statute (the conduct at issue must arise on or after the CCPA’s effective date, not before) and held that the CCPA cannot be relied upon by a defendant as a basis for refusing to comply with its discovery obligations in litigation.  Although many CCPA litigations involve software-based claims and the tech industry in the wake of a data breach, healthcare and financial services entities, among others, have also been targeted.

CCPA Claims, Article III standing and Settlement Activity

As longtime readers of the blog are aware, Article III standing in the context of data privacy cases is in a constant state of flux—particularly in the Ninth Circuit.

When a CCPA claim is asserted in federal court, it must meet that “irreducible minimum,” as it is frequently described.  Article III standing requires (1) an actual or threatened injury (2) that is fairly traceable to the defendant and (3) that is likely to be redressed by a favorable decision.  The injury must be concrete, rather than abstract, and particularized, meaning that it affects the plaintiff in a personal and individual way.  Spokeo, Inc. v. Robins, 578 U.S. 330, 339 (2016).  But as the Supreme Court held in 2021, “an injury in law is not an injury in fact,” and a plaintiff must do more than show a bare statutory violation for a claim to exist.  TransUnion LLC v. Ramirez, 141 S. Ct. 2190, 2205 (2021).

In Kirsten, 2022 WL 16894503, the Central District of California addressed a defendant’s contention that a plaintiff lacked standing to pursue a CCPA claim, among others, because they could not fairly trace instances of identity theft, fraudulent credit card charges, and inability to access online accounts to the data breach at issue.  The court rejected the defendant’s argument, holding instead that past injury from misappropriated personal information gave rise to a substantial risk of threatened injury in the future.  Particularly notable is the court’s premising standing both on the actual injuries the plaintiffs experienced and the injuries they might experience in the future.

In Hayden v. Retail Equation, Inc., 2022 WL 2254461 (reconsidered and vacated in part on other grounds), the Central District of California addressed the specific requirements necessary to give rise to an injury under the CCPA.  Plaintiffs, retail consumers, sued a variety of retailers for their use of a “risk scoring” system that collected and shared individualized personal data with a vendor in order to assess the risk of fraud when a consumer attempted a product return or exchange.

Plaintiffs sued under Cal. Civ. Code § 1798.150(a), which required them to show that “nonencrypted and nonredacted personal information” was “subject to an unauthorized access and exfiltration, theft, or disclosure as a result of the business’s violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information.”  The Court found that Plaintiffs had not asserted a claim under the CCPA because the disclosure of their information was not the result of a failure to implement and maintain reasonable security procedures and practices; rather, it was “a business decision to combat retail fraud.”  Plaintiffs’ failure to allege a violation of specific duties under the CCPA, as opposed to a more generalized complaint about the misuse of their data, could not support their claim.  The Hayden court also found that non-California residents lacked standing to bring suit under the CCPA.

The most significant CCPA settlement of 2022 was the $350 million T-Mobile settlement to resolve multidistrict litigation brought by T-Mobile customers whose data was allegedly exposed in a 2021 data breach.  In August 2021, T-Mobile disclosed that it had been the victim of a cyberattack that resulted in the compromise of some current, former, and prospective customers' SSN, name, address, date of birth, and driver's license/ID information (the "Data Event").  By T-Mobile's account, no "customer financial information, credit card information, debit or other payment information" was exposed in the attack.  Nevertheless, over 40 putative class action claims were filed seeking damages for the improper disclosure of Plaintiffs' personal information.

On July 22, 2022, Plaintiffs in the T-Mobile case filed an unopposed motion for preliminary approval of a proposed settlement to the class.  As part of the settlement, T-Mobile agreed to fund a non-reversionary $350 million settlement fund to pay class claims for out-of-pocket losses or charges incurred as a result of identity theft or fraud, falsified tax returns, or other alleged misuse of a class member’s personal information.  The settlement fund will then make payments to class members on a claims-made basis with a $25,000 aggregate claims cap per class member.  The proposed settlement also contemplates attorneys’ fees of no more than 30% of the settlement fund, approximately $105 million, and $2,500 individual service awards to class representatives.

2022: Continued Enforcement Activity by California OAG

As we predicted at the end of last year, 2022 saw continued enforcement activity at the state level. Headlines were ablaze in August with California’s Office of the Attorney General announcing its first settlement of a CCPA enforcement action.

Readers of the blog will know that the CA OAG's CCPA enforcement efforts started in July 2020. While numerous cookie, "Do Not Sell" (DNS), and Global Privacy Control (GPC) cases were initially (and quietly) settled by the OAG without monetary penalty or public settlement, that all changed in August 2022, when the OAG announced a required payment of $1.2 million from a retailer to settle claims of alleged CCPA violations.

The settlement marks a new era of CCPA enforcement in which real repercussions, including monetary penalties, may be imposed. In addition to the settlement, the OAG released "illustrative examples" of other non-public enforcement cases, including the types of violations, the remediation activities carried out by the alleged violators, and the alleged violators' types of business or industry. The list surprised many who thought they were perhaps not on the OAG's radar for CCPA compliance, such as B2B-focused businesses and companies that are largely (but not fully) exempt from the CCPA, including healthcare, financial, and insurance businesses.  For detailed analysis of the OAG's settlement, see our blog post here.

Litigation and Enforcement in 2023 and Beyond


The CPRA's amendments to the CCPA brought some changes to the private right of action for certain security breaches, namely an expansion of the private right of action where a breach involves data in the form of an email address in combination with a password or security question and answer that would permit access to an account. In addition, the CPRA's amendments provide that post-breach remediation of vulnerabilities is an insufficient cure to preclude statutory damages.

There is not otherwise a private right of action for non-security breach related violations under the CPRA; however, the CPRA opens the possibility of enforcement by all California county district attorneys and the four largest city district attorneys (though that is up for debate). In addition, despite the clarity that the private right of action is limited to certain types of security incidents, it is conceivable that an incomplete or inaccurate response to a consumer request might also give rise to an independent deception claim, and plaintiffs’ lawyers are expected to otherwise test the scope of the limitation on private consumer and class action relief. There is no private right of action for violations of the Virginia Consumer Data Protection Act (“VCDPA”), Colorado Privacy Act (“CPA”), Utah Consumer Privacy Act (“UCPA”), or Connecticut Act Concerning Personal Data Privacy and Online Monitoring (referred to as the “CTPA” herein). Put another way, this means there is not a private right of action for security breaches or security-breach related violations under those laws.


The enforcement risk will certainly increase under the CPRA in 2023 with the California Privacy Protection Agency, or CPPA, enforcing the CPRA alongside the OAG starting on July 1, 2023. In addition to California, Virginia’s privacy law came into effect and was enforceable as of January 1, and privacy laws in Colorado, Connecticut, and Utah will become effective throughout the year (see chart below).

California (CPRA): effective Jan. 1, 2023; enforceable July 1, 2023. The 30-day notice and cure provision will remain in effect indefinitely for security breach violations only.
Virginia (VCDPA): effective Jan. 1, 2023; enforceable Jan. 1, 2023. The 30-day notice and cure provision will remain in effect indefinitely.
Colorado (CPA): effective July 1, 2023; enforceable July 1, 2023. The 60-day notice and cure provision will remain in effect until January 1, 2025.
Utah (UCPA): effective Dec. 31, 2023; enforceable Dec. 31, 2023. The 30-day notice and cure provision will remain in effect indefinitely.
Connecticut (CTPA): effective July 1, 2023; enforceable July 1, 2023. The 60-day notice and cure provision will remain in effect until December 31, 2024.

Enforcement of the CPRA is delayed until July 1, 2023 and, unlike the CCPA between its effective and enforcement dates, there is an explicit grace period between January 1 and July 1, 2023. However, the CCPA’s provisions (without the CPRA’s amendments) will remain effective and enforceable between January 1 and July 1, and the required 30-day cure period no longer exists. Importantly, this means that the full scope of the CCPA also currently applies to HR and B2B data, and there is no delay in enforcement with respect to the same.

Under the CPRA, both agencies can seek civil penalties of $2,500 for each violation or $7,500 for each intentional violation or violation involving the data of minors. Penalties may potentially be calculated based on each applicable piece of data or each consumer affected, and thus exposure could be substantial. The CCPA's existing requirement to provide notice of a violation and allow a 30-day cure period before bringing an enforcement action is eliminated by the CPRA, but the law permits the agencies to consider a business's good faith cooperation efforts when calculating a fine, and prosecutorial discretion is not limited. Further, CPPA actions are subject to a probable cause hearing prior to commencement of an administrative enforcement proceeding.
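Because penalties may be counted per consumer or per piece of data, even modest per-violation amounts compound quickly. A back-of-the-envelope sketch (all input figures are hypothetical, not drawn from any actual enforcement action) illustrates the arithmetic:

```python
# Hypothetical illustration of how per-violation CPRA penalties can compound.
# The consumer counts below are invented assumptions for illustration only.

STANDARD_PENALTY = 2_500      # per violation
INTENTIONAL_PENALTY = 7_500   # per intentional violation or violation involving minors' data


def potential_exposure(consumers_affected: int,
                       violations_per_consumer: int,
                       intentional: bool = False) -> int:
    """Worst-case exposure if each consumer/data element is counted as a violation."""
    rate = INTENTIONAL_PENALTY if intentional else STANDARD_PENALTY
    return consumers_affected * violations_per_consumer * rate


# A hypothetical incident touching 10,000 consumers, one violation each:
print(potential_exposure(10_000, 1))        # -> 25000000 ($25 million)
print(potential_exposure(10_000, 1, True))  # -> 75000000 ($75 million)
```

As the sketch shows, a single alleged violation per consumer across a modest consumer base already reaches eight-figure exposure, which is why the per-datum counting theory matters so much in practice.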

In Virginia, Utah, and Connecticut, the Attorney General has exclusive enforcement authority. The Virginia Attorney General may seek injunctive relief and civil penalties of $7,500 per violation. In Colorado, the state Attorney General or District Attorneys may bring an action for injunctive relief and civil penalties under the Colorado Consumer Protection Act, which provides for civil penalties of $500 per violation, actual damages, or three times actual damages if bad faith is shown. In Utah, the Attorney General may bring an action for actual damages to consumers and civil penalties of up to $7,500 per violation. In Connecticut, the Attorney General may treat a violation of the CTPA as an unfair trade practice under the Connecticut Unfair Trade Practices Act ("CUTPA"); however, the private right of action and class action provisions of CUTPA do not extend to violations of the CTPA. Nevertheless, remedies available for violations of CUTPA include restraining orders; actual and punitive damages, costs, and reasonable attorneys' fees; and civil penalties of up to $5,000 for willful violations and $25,000 for restraining order violations.

However, like the CCPA (but unlike the CPRA), the respective Attorneys General of Virginia and Utah must provide a controller or processor with 30 days' written notice of any violation of the VCDPA/UCPA, specifying the provisions that the Attorney General alleges have been violated. In Virginia and Utah, a controller or processor can avoid statutory damages if, within this 30-day cure period, it cures the noticed violation and provides the Attorney General with an express written statement that the alleged violations have been cured and that no further violations will occur. Under Connecticut and Colorado's laws, their respective AGs must provide violators with notice of alleged violations and an opportunity to cure any such violations within a 60-day period following delivery of the notice. The requirement to allow for a cure period in Colorado sunsets on January 1, 2025 (though the AG would almost certainly have prosecutorial discretion to allow for a cure). In Connecticut, the cure requirement becomes discretionary on January 1, 2025, as well.

Check back often for our continued updates on privacy litigation and enforcement trends and updates.  Privacy World will be there to keep you in the loop.

This week, Privacy World's Kristin Bryan was interviewed by Bloomberg Law regarding the Supreme Court's dismissal, as "improvidently granted," of a case involving an unnamed law firm seeking to prevent the U.S. government from accessing the records of a client accused of violating tax laws.  The Court's ruling has implications for frequently raised privilege issues in other data breach and cybersecurity cases, as covered by Bloomberg.  In re Grand Jury, Dkt. No. 21-1397.

In re Grand Jury presented the issue of whether a communication involving both legal and non-legal advice is protected by attorney-client privilege when obtaining or providing legal advice was one of the significant purposes behind the communication. The Supreme Court’s dismissal leaves intact the prior ruling of the Ninth Circuit from September 2021.  13 F.4th 710 (9th Cir. 2021).  There, the grand jury issued subpoenas related to a criminal investigation.   The district court held a law firm and its client (an unnamed company) in contempt after they failed to comply with the subpoenas.  The district court had ordered the law firm to produce documents to the government after redacting tax-related legal advice. The district court ruled that certain dual-purpose communications between the law firm and its client were not privileged because the “primary purpose” of the documents was to obtain tax advice, not legal advice.  Before the Ninth Circuit, the law firm and its client (collectively, “appellants”) argued that the district court erred in relying on the “primary purpose” test and should have instead relied on a broader “because of” test.  The Ninth Circuit, however, affirmed and concluded that the primary-purpose test governs in assessing assertions of attorney-client privilege for dual-purpose communications.

Disputes over privilege can arise in data breach and cybersecurity litigation, as forensic reports and communications related to a forensic report's findings are frequently sought by plaintiffs in discovery to buttress their claims and theories, as Privacy World previously covered.  Be sure to check out Bloomberg Law's analysis of this decision.

And for more, stay tuned.  Privacy World will be there to keep you in the loop.

California Attorney General Rob Bonta announced today an investigative sweep of mobile apps, focused on popular apps in the retail, travel, and food service industries that fail to comply with the California Consumer Privacy Act (CCPA). According to a press release, the sweep is focused on apps that allegedly fail to comply with consumer opt-out requests or do not offer any mechanism for consumers who want to stop the sale of their data. The press release also highlights enforcement in relation to handling of agent requests, namely through an agent service created by Consumer Reports called “Permission Slip.”

Continue Reading California AG Announces CCPA Compliance Sweep of Mobile Apps ahead of Data Privacy Day

Last summer, the Court of Appeals for the Ninth Circuit buoyed plaintiffs' lawyers' interest in "session replay" software when it revived a putative class action against a website operator and a session replay software provider for violations of the California Invasion of Privacy Act (CIPA).  Earlier this month, addressing issues the Ninth Circuit left for remand, the district court dismissed the same complaint as barred by the statute of limitations.  Javier, No. 3:20-cv-02860-CRB, 2023 WL 114225 (N.D. Cal. Jan. 5, 2023).  However, the district court's decision, in addition to giving the plaintiff an opportunity to refile, rejected the defendants' other arguments on the application of CIPA to session replay software.  Ultimately, the court's opinion may prove to bolster future plaintiffs' claims.

A short refresher on the technology at issue: session replay software captures certain aspects of a user's interactions on web applications (mouse movements, clicks, typing, etc.), along with underlying contextual user data, to help website operators enhance users' experiences.  Session replay software thus allows a website operator to recreate (or "replay") a visitor's journey on a website or within a mobile or web application.  Rather than tracking a user's activity after the user leaves a particular website, session replay software captures how the user interacts with that specific website.
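Conceptually, this works by recording a time-ordered stream of interaction events on a single site and later reconstructing the visit from that stream. The toy sketch below illustrates the idea only; the class names, event types, and fields are invented for illustration and do not reflect how TrustedForm or any real product is implemented:

```python
# Toy model of session replay: record interaction events, then "replay" them in order.
# All names and event shapes here are illustrative assumptions, not a real product's API.
from dataclasses import dataclass, field
from typing import List


@dataclass
class SessionEvent:
    timestamp_ms: int  # milliseconds since the page loaded
    kind: str          # e.g. "mousemove", "click", "keystroke"
    detail: str        # e.g. coordinates or the element interacted with


@dataclass
class SessionRecorder:
    events: List[SessionEvent] = field(default_factory=list)

    def capture(self, timestamp_ms: int, kind: str, detail: str) -> None:
        # Each user interaction on the monitored site is appended to the stream.
        self.events.append(SessionEvent(timestamp_ms, kind, detail))

    def replay(self) -> List[str]:
        # Reconstruct the visit by emitting events in chronological order.
        return [f"{e.timestamp_ms}ms {e.kind}: {e.detail}"
                for e in sorted(self.events, key=lambda e: e.timestamp_ms)]


rec = SessionRecorder()
rec.capture(120, "mousemove", "(34, 210)")
rec.capture(950, "click", "#submit-button")
print("\n".join(rec.replay()))
```

The key point for the wiretap analysis is visible even in this sketch: the event stream is a record of the communication itself, and the legal question is who holds and processes that record, not merely whether it exists.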

In the Javier case, the plaintiff brought claims under Section 631 of CIPA against a website operator and ActiveProspect, the maker of session replay software called "TrustedForm."  CIPA, like other state wiretap laws, provides for liability against third-party eavesdroppers, and against those who abet eavesdropping, when one party to a communication has not consented to the eavesdropping.  In reversing the previous order dismissing the case, the Ninth Circuit held that the plaintiff's allegations that he did not consent to having his session tracked before agreeing to the website's privacy policy were sufficient.  However, the court did not address three of the defendants' other defenses: (1) that the plaintiff gave "implied consent"; (2) that the software provider is not a "third party" under CIPA; and (3) that the statute of limitations had run.  On remand, the District Court rejected two of defendants' arguments and accepted, for now, the third.

Implied Consent

Defendants first argued that even before plaintiff consented to session replay via the website privacy policy, he gave implied consent “the moment he arrived at [the operator’s] website” and began filling out a webform.  The Court quickly dismissed this argument, holding that while the plaintiff’s use of the website may show that he consented to the website operator’s collection of his information, it provides no evidence that he consented to the third-party software provider’s collection of the information, which is the key question for wiretap claims.

Third-Party Eavesdropper

Defendants also argued that plaintiff's CIPA claims fail because the software provider was not a "third party" under the statute; instead, it was merely an "extension" or "tool" of a first party.  So far, this argument has divided courts in an intra-circuit split.

One side of the split has held that where the alleged third party (the session replay software provider) is doing only what the party to the communication (website operator) directs, and does not use the information for its own benefit, then that purported third party is nothing more than ‘an extension’ of the party and cannot be liable under a statute concerned only with non-party recording.  See Graham v. Noom, Inc., 533 F. Supp. 3d 823, 833 (N.D. Cal. 2021) (Beeler, J.); Johnson v. Blue Nile, Inc., No. 20-CV-08183-LB, 2021 WL 1312771, at *2 (N.D. Cal. Apr. 8, 2021) (Beeler, J.); Yale v. Clicktale, Inc., No. 20-CV-07575-LB, 2021 WL 1428400, at *3 (N.D. Cal. Apr. 15, 2021) (Beeler, J.); Williams v. What If Holdings, LLC, No. 22-cv-3780, 2022 WL 17869275 (N.D. Cal. Dec. 22, 2022) (Alsup, J.).

The other side of the split, and the winning side in Javier, holds that software providers are not “extensions” of participants to the conversation equivalent to inanimate tape recorders.  See Revitch v. New Moosejaw, LLC, No. 18-CV-06827-VC, 2019 WL 5485330, at *2 (N.D. Cal. Oct. 23, 2019) (Chhabria, J.); Yoon v. Lululemon USA, Inc., 549 F. Supp. 3d 1073, 1081 (C.D. Cal. 2021) (Holcomb, J.).  The court in Javier emphasized that there is no intentionality or “use” requirement in Section 631 and that such a requirement would render parts of the statute superfluous.

While the Court denied defendants' motion to dismiss on this argument, it left open for a later stage the factual question of whether "the ubiquity of services like ActiveProspect on the internet effectively renders it party" to the communication, such that ActiveProspect would not have been an unannounced third party.

Statute of Limitations

Finally, the defendants were successful, at least for now, on their statute of limitations argument.  CIPA has a one-year statute of limitations, and the plaintiff filed his complaint 14 months after his visit to the website.

Plaintiff argued that his claims were not barred by the statute of limitations because of the "delayed discovery doctrine," under which the limitations period does not begin to run until the plaintiff should have suspected that he suffered a legal injury caused by wrongdoing.  The Court held that plaintiff could not invoke the delayed discovery doctrine because he admitted to "assum[ing]" that the website operator would collect his data, and thus was on notice of his potential injury and that a third party may be aiding in the collection.  The Court pointed to the website's privacy policy, which states that it "may use third party vendors to assist" with "monitoring and analyzing Site activity."  The Court also noted that there were no plausible allegations that the website "surreptitiously" hid its use of session replay software.

However, the Court ultimately decided that “it is not clear that this defect cannot be cured by amendment” and granted plaintiff leave to amend as to the delayed discovery rule.


With the Court granting plaintiff leave to amend, there may still be more to come in Javier.  More importantly, the Court's rulings rejecting defendants' defenses can only bolster future plaintiffs' filings.

After its decision granting the motion to dismiss, the Court granted an administrative motion to relate another class action against the same session replay vendor and the parent company of the website operator for a related website.  In other words, there are no signs of session replay litigation slowing down.  Privacy World will be here to break down the developments.  Stay tuned.