Following unanimous votes by the California legislature and signature by the Governor, California enacted the Age-Appropriate Design Code Act (CAADCA) in September 2022 (codified at California Civil Code Section 1798.99.28-32), as a measure purportedly “aimed at protecting the wellbeing, data, and privacy of children [under 18] using online platforms.” Industry group NetChoice soon turned to federal court and sought an injunction to prevent the law from being enforced on the grounds, among others, that it violates the First Amendment and the dormant Commerce Clause of the United States Constitution and is preempted by other federal statutes addressing online child safety, including the Children’s Online Privacy Protection Act (COPPA).
In September 2023, a U.S. district court granted a preliminary injunction in favor of NetChoice, holding that the CAADCA likely violates the First Amendment. Specifically, the court reasoned that the law regulates expression by limiting the use and sharing of (personal) information and that California’s justifications did not rise to the level of state interest required to regulate expression under the U.S. Constitution. The Ninth Circuit upheld that decision, but only as to Data Protection Impact Assessments (DPIAs), and went further to find that such assessments are subject to strict scrutiny and are facially unconstitutional. See Netchoice, LLC v Rob Bonta, Atty General of the State of California (9th Cir., August 16, 2024) (“Netchoice I”). (We discuss the implications on all privacy assessment reporting or risk-of-harm publication requirements here and here.) The appellate court, however, reversed the district court as to the enjoining of other provisions of the CAADCA, such as restrictions on the collection, use, and sale of a minor’s personal data and on how data practices are communicated, and remanded to the lower court for further consideration as to whether the standards for a “facial challenge” could be met, or whether individualized analysis would be necessary to consider constitutionality as applied to specific plaintiffs on an “as applied” factual basis. NetChoice amended its facial challenges, the court again granted an injunction as to the more narrowly challenged provisions, and the state again appealed.
On March 12, 2026, the Ninth Circuit ruled on the “Netchoice II” appeal, holding:
- Netchoice is unlikely to succeed on a facial challenge to CAADCA’s coverage definition.
- Content-based restrictions on expressive speech must be narrowly tailored and serve compelling state interests. (A somewhat lower standard applies to commercial speech.)
- However, NetChoice brought a facial challenge to the law as applied to all covered companies, not just to certain content publishers or specific content or applications. As such, it must first show that the law imposes content-based restrictions on everything to which it applies, and then that compelling state interests fail to justify those restrictions on the same facial basis.
- The law applies to any “business that provides an online service, product, or feature likely to be accessed by children” and provides eight indicators of likely access. While the court found some indicators likely or potentially content-based, it found others, such as audience composition and web-traffic demographic data, to be content neutral.
- The court agreed with the state that ride-sharing services, ticket brokers, fitness products, and other content-neutral app and site operators could all potentially fall within the law’s coverage, and thus a facial challenge on content-regulation grounds was not viable.
- The court concluded that any one indicator triggers coverage, so, as a practical matter, a service with user demographics indicating “significant” minor usage would be covered even if some of the content-oriented coverage indicators might not pass constitutional muster in an “as applied” challenge.
This means any service provider with a “significant” number of minors using its service is covered. While there is no explicit diligence obligation in the statute, the court commented that “[t]he record establishes, and Netchoice nowhere denies that frequently, this is data that online businesses already have.” The court also cited its 2025 decision rejecting NetChoice’s facial challenge to age verification under a different California child safety law, which relied in part on the ability to conduct age verification in the background using tools that do not require user input, and stated “the same is true of determining whether ‘[a] significant amount of the audience’ is ‘children.’” See Netchoice, LLC v Bonta (Netchoice SB 976), 152 F4th 1002 (9th Cir. 2025). This suggests an expansive net bringing websites, mobile app operators, and other online service providers under the CAADCA’s provisions: if a regulator can show, “based on competent and reliable evidence regarding audience composition, [that a service is] routinely accessed by a significant number of [minors],” turning a blind eye to that data will not suffice to avoid coverage.
- Netchoice is unlikely to succeed on a facial challenge to CAADCA’s age estimation requirements.
This gives a green light to age verification obligations, at least if narrowly tailored, even if what can be required by way of privacy and safety protections for minors remains subject to debate.
- The CAADCA requires covered businesses to “[e]stimate the age of child [under 18] users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers.”
- The Court found that age verification on its own was content neutral and noted that companies could avoid age verification by applying the Act’s privacy and data protection provisions for children to all users. It further noted that Netchoice had not challenged about half of those provisions and that, while some might impact expressive activity and be subject to First Amendment scrutiny (which is not to say they might not survive that analysis), others more narrowly focused on data practices would not be.
- So, under this holding, online service providers can be required to verify age or to provide the highest level of protections to all users, though which protections amount to regulation of expression, and which of those can be justified by the state’s interest in protecting teens and children, remain subject to fact-based, “as applied” challenges. In addition, the court found that some of the data practice provisions could not be enforced as written due to vagueness about what they require, as discussed in the next section below.
- The court left the door open to challenges to age verification requirements that “prevent access to content or require data collection for compliance.”
The Court of Appeals also held that:
- Netchoice is likely to succeed on a facial challenge to certain data use restrictions as unconstitutionally vague.
- The CAADCA prohibits data use “in a way that a business knows, or has reason to know, is materially detrimental to the physical health, mental health or well-being of a child” and prohibits profiling unless both “it can demonstrate appropriate safety guidelines” and that the profiling is either necessary to provide a requested service or “in the best interest of children.”
- Netchoice contended, and the lower and appeals courts agreed, that the undefined terms “material detriment,” “best interests,” and “well-being” are overly vague as the “terms have no established meaning and the CAADCA provides no guidance.”
- Vagueness is cured by specificity, which “give[s] fair notice of conduct that is forbidden or required.” Accordingly, the legislature could pass more specific child and teen data usage and online safety requirements or authorize agencies to do so by regulation. Some of the newer state child and teen safety laws have attempted to do so.
- Netchoice is likely to succeed on a facial challenge to some dark pattern restrictions as unconstitutionally vague.
- The CAADCA applies the California Consumer Privacy Act’s definition of “dark pattern” as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.” This definition is based on long-established deceptive practices principles under state and federal consumer protection laws and was not challenged. Also not challenged was the CAADCA’s prohibition on using dark patterns to “lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service, product, or feature” or “to forego privacy protections.”
- What was challenged as void for vagueness was the prohibition on using dark patterns “to take any action that the business knows, or has reason to know, is materially detrimental to the child’s physical health, mental health or well-being.” As with the data use restriction, the court found that “the range of harms that could plausibly qualify as ‘materially detrimental’ is vast, spanning everything from financial exploitation, to sleep loss, distraction or hurt feelings.” Absent more statutory or regulatory guidance, “[b]usinesses of ordinary intelligence cannot reliably determine what compliance requires[,]” especially since the “prohibition’s use of the singular ‘child’ … suggests that it is actionable based on a single child’s response … whenever one of them experiences a harm that a regulator deems ‘material.’”
- Again, specificity cures vagueness, so the legislature is free to redraft with more detail as to what is required, especially where the requirements remain content neutral (but keep in mind that even content-restrictive requirements may still survive a First Amendment challenge).
So, what next? Here is where it gets even more procedurally and technically complicated. In Netchoice I, the Ninth Circuit upheld the lower court’s injunction of the CAADCA’s notice and cure provision as not grammatically severable from the enjoined assessment and reporting publication requirements, which the appeals court agreed were constitutionally infirm. In Netchoice II, the Ninth Circuit did not decide whether the notice and cure provision was also volitionally non-severable from the CAADCA provisions the appeals court found likely to survive the facial challenge. Instead, it remanded the issue to the lower court for a more directed analysis and vacated the injunction as to non-severability in the meantime. In other words, although the surviving provisions are not grammatically inseverable from the opportunity to cure in the way the DPIA provisions were found to be, a determination under state statutory interpretation principles must still be made as to whether the legislature would have passed the CAADCA provisions found otherwise valid, at least facially, had they not been subject to a notice and cure opportunity. The appeals court found the record simply insufficient at the current stage of litigation to enable such a finding. This, like the vagueness infirmities, could be fixed by legislative action.
So, while the court stated that California may “pursue compliance … of CAADCA’s valid remainder,” such enforcement will be subject to “as applied” Constitutional challenges. Moreover, absent legislative clarity through amendment, California may also face a volitional severability challenge arguing that, without a statutory opportunity to cure, the purportedly valid remainder is not actually valid. Subject to those bases for challenge, and setting aside the obligations that are arguably not neutral as to expressive content, the following CAADCA provisions find support for their validity in the Netchoice II decision:
- Estimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers.
- Provide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature.
- If the online service, product, or feature allows the child’s parent, guardian, or any other consumer to monitor the child’s online activity or track the child’s location, provide an obvious signal to the child when the child is being monitored or tracked.
- Enforce published terms, policies, and community standards established by the business, including, but not limited to, privacy policies and those concerning children.
- Provide prominent, accessible, and responsive tools to help children, or if applicable their parents or guardians, exercise their privacy rights and report concerns.
- Not:
- Collect, sell, or share any precise geolocation information of children by default unless the collection of that precise geolocation information is strictly necessary for the business to provide the service, product, or feature requested and then only for the limited time that the collection of precise geolocation information is necessary to provide the service, product, or feature.
- Use dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service, product, or feature to forego privacy protections.
- Use any personal information collected to estimate age or age range for any other purpose or retain that personal information longer than necessary to estimate age. Age assurance shall be proportionate to the risks and data practice of an online service, product, or feature.
The following requirement is less clearly valid, as it incorporates the “best interest” standard the Ninth Circuit found to be vague, though not specifically in this context:
- Configure all default privacy settings provided to children by the online service, product, or feature to settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children.
It should also be noted that the provisions of the California Consumer Privacy Act regarding children under 13 and those aged 13-16 (e.g., regarding opt-in to selling or sharing (for cross-context behavioral advertising)) are unaffected by the Netchoice challenges.
Add to the ongoing uncertainty regarding the exact scope of the CAADCA’s validity and enforceability the various constitutional challenges to other child and teen online safety legislation, and the various online product safety tort and consumer protection unfairness claims percolating through the nation’s courts, and the future of online minors’ safety standards in the U.S. remains unclear – at least from a legal compliance perspective. On the other hand, publishers, platforms, and other online service providers are free to adopt minor safety best-practice standards, and doing so could protect reputation and serve as a market differentiator.
For more information, contact the author, or your Squire Patton Boggs relationship partner.

