On March 31, 2026, the Office of the Australian Information Commissioner (OAIC) released its much-anticipated Exposure Draft of the Privacy (Children’s Online Privacy) Code (Draft Code). It introduces a number of novel concepts in addition to drawing from the UK Age-Appropriate Design Code (UK AADC), in an effort to “uplift privacy practices across entities more broadly” and keep children’s privacy safe in Australia. This post breaks down how it could impact businesses.

Background

  • The Draft Code would apply to providers of designated internet services, relevant electronic services and social media services (Online Service) but only if the Online Service is primarily concerned with the activities of children or likely to be accessed by children.
  • The Draft Code, therefore, intends to cover any online spaces that may still process large amounts of children’s personal information. This will capture Online Services which are not directed to children, such as update sharing apps used by daycares.   
  • The Draft Code may apply to organisations who provide an Online Service, even where those organisations are not typically considered technology platforms. For example, organisations who offer a subscription service to children (in addition to their broader offering) will need to ensure that that service complies with the Code.
  • The key features of the Draft Code, now well publicised and discussed below, are:
    • age assurance for all users;
    • assessing and building into an Online Service’s data practices the “best interests” of a child;
    • enhanced consent requirements;
    • unprecedented obligations when direct marketing to children; and
    • introducing a right to delete for children.
  • The Code is open for public consultation until 5 June 2026, with the opportunity for participants to attend virtual roundtables between 31 May and 5 June. While the Code must be registered by 10 December 2026, no commencement date has been announced. Participants are welcome to propose an appropriate commencement date (and transition period) in their submissions.
  • It is clear that the OAIC has been ambitious in both the scope and content of the Draft Code. We strongly encourage clients to consider the effect of the Draft Code – including those features which we have flagged as possible Issues for Implementation below – and make submissions on areas which may be of concern.
  • Many of the proposed inclusions in the Code would change the application of the Australian Privacy Principles (APP) and could hint at broader changes to come – which is another good reason to make your views known.

What is the Code?

  • The Code is intended to sit under the Privacy Act 1988 (Cth) (Privacy Act) as a form of delegated legislation.
  • Once registered, the Code will operate in addition to the requirements of the APPs. Organisations who are caught by the Code will demonstrate their compliance with the APPs (as they relate to children) through adherence to the Code.

Key Features

Age Assurance

Key Insights

  • Under the Draft Code, before personal information is collected from an individual, an Online Service must take reasonable steps to ascertain the end user’s age (Age Assurance).
  • Like the UK AADC, the form of Age Assurance is required to align with the “risk of harm” arising from the processing of personal information:
    • If an Online Service collects only a very small amount of information from children, it may be able to impose low-friction Age Assurance.
    • On the other hand, if an entity collects significant personal information about children and uses it for high-risk purposes (e.g. targeted advertising), more robust Age Assurance will be required.
  • It is open to an Online Service not to undertake Age Assurance if it applies the protections in the Code to all end-users, but this is unlikely to be a practical option for Online Services with a mixed audience.
  • Age assurance applies differently and more extensively under the Draft Code than under the Phase 2 Online Safety Codes, the social media pause or other countries’ online safety laws. This is because:
    • It applies before any personal information is collected from an end-user – regardless of whether that end-user has an account with the Online Service or not.
    • Unlike the online safety pause, Age Assurance involves ascertaining an end-user’s age. Presumably, this is because the Draft Code applies differently depending on a child’s age (e.g., some obligations only apply if a child is under 15). However, it means that, as drafted, an Online Service must ascertain an adult end-user’s age. This is disproportionate and contrary to the UK AADC, which requires instead that online services understand the “age range of children likely to access the service”.
    • Unlike Singapore’s Code of Practice for Online Safety for App Distribution Services, the OAIC has decided that it will not be the responsibility of a third party (like an app store provider) to conduct age assurance.

Issues for Implementation

  • Despite the slated effort to ensure consistency, the Draft Code is more prescriptive than the UK AADC: it requires entities to review and update, at least annually, their privacy processes (including for Age Assurance) and imposes strict limits on retention of facial age estimation information, which is treated as a category of sensitive information. This places a higher burden on businesses and organisations captured by the Code.
  • Further, unlike other Australian laws, Age Assurance will extend to registered and unregistered users. This will be a challenge for many Online Services, which will have to find a proportionate way of collecting and retaining such information.
  • Where an Online Service caters to differently-aged children and to both registered and unregistered users, there is real doubt as to what “reasonable steps” will look like, and it may be simplest for that Service to impose the highest threshold, regardless of risk.

Privacy by default

Key Insights

  • Under the Draft Code, an Online Service must implement measures which, by default, ensure that the entity collects, uses or discloses only such personal information as is strictly necessary to provide its service.
  • Children must be able to control any additional collection, use or disclosure of personal information that is not strictly necessary, via accessible and clear means.

Issues for Implementation

  • This proposed requirement is closely aligned with the UK AADC (which requires “high privacy by default”) but, unlike the GDPR, there is no express requirement for data minimisation in the Privacy Act. Nor is there any existing requirement for privacy by default. This obligation therefore sits as a wholly new requirement, and operates quite differently from APP 3, which imposes a reasonably lenient condition that personal information must be “reasonably necessary” before it is collected.
  • As drafted, the Draft Code would apply broadly, including to Online Services which are likely to be accessed by children. Many older children who access these Services may appreciate having a tailored, targeted online experience. They will no longer be able to get that – at least by default.

Best Interests of a Child

Key Insights

  • The Draft Code provides that children’s personal information must be collected, used and disclosed consistent with the ‘best interests of the child’ (Best Interests). The principle has been taken from the United Nations Convention on the Rights of the Child and the Explanatory Statement to the Draft Code suggests it will involve consideration of factors like:
    • the nature and extent of child exploitation risks;
    • the likely mental or physical impacts on the child; and
    • the extent to which children’s abilities may be affected.
  • The Best Interests test applies to almost all activities where personal information is handled about a child. The way that it has been framed in the Draft Code contrasts with the UK AADC, which requires that an online service consider instead “best interests of child users in…[the] design of an online service”.
  • In particular, the test applies in addition to existing permissions for using personal information (including where for a related secondary purpose), and to prescriptive requirements for obtaining a child’s consent under the Draft Code (including a requirement to refresh consent every 12 months).
  • In this way, it operates similarly to the “fair and reasonable” test that the OAIC has frequently espoused and which the Government accepted in principle in its Response to the Privacy Act Review (Government Review), which is equally broad and would apply on top of existing obligations under the APPs.

Issues for Implementation

  • Best Interests under the Draft Code go beyond what the Australian Government agreed to in principle in 2023, creating onerous implementation challenges for entities.
  • Instead of treating “best interests” of children as one factor that entities must have regard to when considering whether collection, use or disclosure is ‘fair and reasonable’ (as proposed in the Government Review), the Draft Code adopts an extremely strict approach by imposing it as a threshold condition for all collection, use and disclosure. In particular, it removes the flexibility of an organisation being able to rely on use/disclosure of personal information in a way which is consistent with an individual’s “reasonable expectations” and related to the primary purpose of collection.
  • Even though the Draft Code does, at times, recognise that different age ranges have different interests and needs, Best Interests is another example of where this distinction is not carried through:
    • The obligation to handle personal information consistently with a child’s best interests does not make any reference to different age ranges – or, as flagged above, the different levels of interaction that a child user may have with an Online Service.  
    • Acting consistently with a 17‑year‑old’s best interests is quite different from acting consistently with a 13‑year‑old’s best interests, yet the Draft Code does not provide detailed guidance on how entities should navigate these differences in practice.

Direct marketing

  • To compliantly market to a child, an Online Service must:
    • obtain personal information directly from the child;
    • obtain consent, which meets the higher thresholds discussed below (in Consent);
    • ensure that such use and disclosure of personal information is consistent with a child’s best interests; and
    • offer a simple means to opt out of direct marketing.
  • Similar to Ireland’s Fundamentals for a Child-Oriented Approach to Data Processing, the onus now lies with the Online Service to show that the targeting of advertising is in that child’s best interests.

Issues for Implementation

  • There is real doubt around the activities to which these provisions apply. In the Government Review, the Attorney-General draws a distinction between “direct marketing” and “targeted advertising”. However, the OAIC has previously indicated that “direct marketing” includes “online advertising”, such as displaying an advertisement based on cookie-based data.
  • The Explanatory Statement refers to “direct marketing newsletters” (suggesting a more conventional approach), but we will need clarity on what is meant and how these new rules are intended to operate.
  • It is difficult to see how an Online Service could ever successfully argue that direct marketing is in a child’s best interests, as the nature of such marketing is always to promote that Service’s brand over anything else. While the Explanatory Statement identifies that the Best Interests test does not prevent an entity from “pursuing its own commercial…interests”, the requirement that those interests not be incompatible with the best interests of a child makes this tricky.

Consent

Key Insights

  • The Draft Code sets the age of consent by a child to the collection, use or disclosure of their personal information at 15 years old, and requires parental consent for children under that age (with some exceptions).
  • There is acknowledgment that an Online Service must take reasonable steps to confirm that a person who gives consent holds parental responsibility, but still no clear indication of how to do so without collecting significant personal information.
  • When obtaining parental consent for children under 15, an Online Service must still issue a “consent notice” to children. In some circumstances (including if a child under 15 enables direct marketing), an Online Service must obtain both parental consent and seek a child’s “assent”. This is likely to be difficult to put in place.
  • Finally, the requirements for valid consent under the Draft Code exceed the definition of “consent” under the Privacy Act. According to the Draft Code, children’s consent must be:
    • voluntary (which means not bundled and not obtained by manipulative, deceptive or misleading practices);
    • informed, with the specific information required set out in the Draft Code;
    • current and refreshed every 12 months;
    • specific; and
    • able to be withdrawn.

Issues for Implementation

  • The requirement for 12-month consent validity, mandatory child-friendly notices and, in some cases, “double consent” (through child assent) introduces new concepts that many Online Services are not currently equipped to support. Online Services with a mixed audience will likely have to quarantine consents for children, to avoid the higher requirements for children’s consents infecting all of their consents.
  • There remains a question of whether child assent is meaningful, given children have varying levels of comprehension. There are further concerns about the effectiveness of relying on parental consent to protect children’s online data, as this assumes that parents are cognisant of and able to conceptualise all the varying forms of privacy dangers that children may be subject to.
  • Finally, like the social media pause, the requirements for consent under the Draft Code are higher than those under the APPs generally. Query also whether consent which is “manipulative, deceptive or misleading” has the potential to overlap with or, at worst, contradict proposed amendments to the Australian Consumer Law which seek to regulate just that.

Notice and consent fatigue

Key Insights

  • On our count, children may, in certain circumstances, have six documents available to them in relation to an Online Service’s privacy practices:
    • APP 1 website privacy policy;
    • APP 1 website privacy policy (children’s version);
    • APP 5 privacy notice;
    • age appropriate consent notice;
    • child-specific information about inquiries and complaints; and
    • in some cases, anything required to obtain a child’s assent.

Issues for Implementation

  • Especially for children, the above is a lot of information and seems to contravene the Draft Code’s focus on clarity, simplicity and ease. We query whether more information necessarily helps children understand how their personal information is handled.
  • Online Services will need to ensure that the information contained in each of these documents is consistent and does not contradict anything said elsewhere.

Right to delete

Key Insights

  • The Draft Code introduces a broad right for children to request the destruction of their personal information.
  • There are exceptions, but they are very limited (such as where the Online Service is required under Australian law to retain such personal information).
  • Even though it is directed only at children, this right is likely to influence broader privacy practice in Australia. Currently, under APP 11, the obligation to destroy or de-identify personal information is limited to taking “reasonable steps”.

Issues for Implementation

  • The Productivity Commission has rejected the proposed Tranche 2 right to erasure, warning that reforms “risk entrenching existing problems” and may “exacerbate the regulatory burden.”
  • While this is a significant uplift to Australian privacy law, the Commission’s concerns highlight uncertainty about whether such rights are proportionate or scalable for most entities.
  • There are also questions as to how effective a right to delete can be in preventing the aggregation and harvesting of children’s data. For instance, a right to delete alone cannot necessarily protect a child against material that has been screenshotted and shared by third-party end-users.
Note: the terms “designated internet service”, “relevant electronic service” and “social media service” are all defined under the Online Safety Act 2021 (Cth).

Please contact the author for more information.

The author thanks and acknowledges Navanitha Gajendran for research and editorial support.

Check back (subscribe here) for updates on AI, privacy and cybersecurity law developments.  Or, consider a subscription to Privacy Powered by SPB for access to comparative reference charts for the state consumer privacy laws.

Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only and is not intended to constitute or be relied upon as legal advice.

Connecticut Attorney General William Tong recently issued an advisory memorandum (“Advisory”) to all “State Officials, Agencies and Concerned Parties” about how existing Connecticut laws apply to artificial intelligence (“AI”).

In the Advisory, Attorney General Tong hints at enforcement priorities and offers businesses a roadmap for compliance in describing how Connecticut’s civil rights, privacy and data security, competition, and consumer protection laws apply to AI system use.  Businesses operating in Connecticut are reminded that, even without a statewide AI law, obligations under these laws regulate their AI system use.  Those Connecticut residents who read the Advisory are reminded of their rights and encouraged to report AI related harms to the Connecticut Office of the Attorney General (“OAG”).

The Advisory discusses how the AG views these laws as applied to AI:

  1. Civil Rights Laws
    • The Advisory notes that businesses deploy AI systems in several contexts in which civil rights violations can occur, including hiring, employment, healthcare, housing, insurance, and lending and credit decision-making.
    • Although the Trump administration is focused on AI deregulation, the Advisory notes that federal antidiscrimination laws remain in effect and the Connecticut OAG can enforce them (Conn. Gen. Stat. § 3-129g), as well as Connecticut’s own antidiscrimination laws.
  2. Privacy and Data Security Laws
    • Connecticut Data Privacy Act (“CTDPA”) (Conn. Gen. Stat. § 42-515, et seq.): The Advisory reminds businesses that personal data used in connection with an AI system is subject to the CTDPA’s data minimization, data protection assessment, notice, consent, sensitive data processing and other requirements.  Businesses also are reminded that consumer privacy rights apply to personal data ingested into an AI system model, which presents unique data deletion challenges (among others).
    • In June 2025, the CTDPA was amended to add notice provisions related to training data for an AI model.  Specifically, as of July 1, 2026, a business must include in its privacy notice a statement disclosing whether it “collects, uses or sells personal data for the purpose of training large language models” (Conn. Gen. Stat. § 42-520(b)(1)(H)).  In discussing this requirement, the Advisory also explains that an AI system developer, “integrator” (which refers to a business that “combines data from different sources”), or user that “buys datasets from third party controllers that contain Connecticut consumers’ personal information (i.e., data brokers)” must ensure that the Connecticut consumers whose personal data is included in the dataset received proper notice from the third party controllers (Advisory, page 5).  The Advisory also includes a reminder that, when a privacy notice is updated to cover “any retroactive material change” (such as use for training an AI model), Connecticut consumers must receive notification and a mechanism to withdraw previously granted consent (Conn. Gen. Stat. § 42-520(b)(3) (effective July 1, 2026)).
    • Connecticut’s Safeguards and Data Breach Laws (Conn. Gen. Stat. § 42-471, § 36a-701b): The Advisory reminds businesses of their obligations (i) to protect personal information when deploying an AI system and (ii) to notify individuals of unauthorized access to or acquisition of their personal information.
  3. Consumer Protection Laws
    • Connecticut Unfair Trade Practices Act (“CUTPA”) (Conn. Gen. Stat. § 42-110b, et seq.): The Advisory describes how the CUTPA can apply to AI system use. The Advisory provides a non-exhaustive list of examples of potential violations, such as using an AI system to advertise a product or service in a manner that misrepresents the price, quality or other characteristics of the product or service. The Connecticut Department of Consumer Protection and the Connecticut Office of the Attorney General enforce the CUTPA, with broad authority to investigate potential violations by demanding documents and records, compelling testimony and entering establishments. The CUTPA also provides a private right of action to any Connecticut consumer who suffers a measurable loss of money or property as a result of an unfair or deceptive act.  Penalties for violations of the CUTPA can include injunctive relief, civil penalties of up to $5,000 per violation, restitution and remediation.
    • Connecticut Antitrust Act (“CAA”) (Conn. Gen. Stat. § 35-24, et seq.): The Advisory provides a non-exhaustive list of examples of potential violations of the CAA associated with AI system use: using an AI system in coordination with competitors to fix prices, allocate markets, or rig bids for AI products or for other goods and services.

* * * * *

In addition, the Advisory flags recent enforcement actions by the OAG in which businesses were held accountable for “misusing algorithms” to deploy design features that purposefully addict children and teens and create monopolies in internet search, smart phone markets and live event ticketing.

Meanwhile, the Connecticut legislature also is considering the following bills related to the topics covered in the Advisory:

  • SB 4 – An Act Concerning Consumer Privacy and Protection, which addresses registration of data brokers; use of personalized algorithmic pricing; and CTDPA amendments defining “facial recognition technology” and its requirements, and preventing the sale, sharing, transfer or allowance of access to geolocation data (among other proposed amendments).
  • SB 5 – An Act Concerning Online Safety, which would require that an operator of an “artificial intelligence companion” include a protocol to “take reasonable efforts to detect and address any user expression indicating a risk of suicide, self-harm or imminent violence” and notice and other requirements similar to laws passed in California and Washington, as well as restrictions on how an “automated employment-related decision process” is deployed.
  • HB 5037 – An Act Promoting the Safety of Minors on Social Media Platforms, which includes restrictions on the times of day that a “covered platform” can send notifications to minors and requires that a covered platform track the number of minor users and display mental health warnings when a minor logs in and at specific intervals throughout an online session.
  • SB 435 – An Act Concerning Automated Decision Systems Protections For Employees, which, like SB 5, relates to deployment of an “automated employment-related decision process.”

All of the bills are in the early stages of the legislative process and seem unlikely to pass before the legislative session ends on May 6, 2026.

* * * * *

Please contact the authors for more information.

The authors are grateful to Mary Aldrich, Paralegal, New York, for her assistance.

On March 20, 2026, Oklahoma Governor Stitt signed the first new comprehensive state privacy law of 2026. The “Act relating to data privacy” takes effect on January 1, 2027. Below, we compare the new Oklahoma privacy law to the other 20 state consumer privacy laws already in force.

Continue Reading Oklahoma’s New Privacy Law Sweeps In

The countdown is on—the IAPP Global Privacy Summit is only six days away, and we’re looking forward to seeing so many of you in Washington, DC, for another energizing and insightful week in the privacy community. If you’ll be in town, we’d love to connect.

Continue Reading We Hope to See You at the IAPP Global Privacy Summit!

Following unanimous votes by the California legislature and signature by the Governor, California enacted an Age-Appropriate Design Code Act (CAADCA) in September 2022 (codified at CA Civil Code Section 1798.99.28-32), as a measure purportedly “aimed at protecting the wellbeing, data, and privacy of children [under 18] using online platforms.” Industry group NetChoice soon turned to federal court and sought an injunction to prevent the law from being enforced on the grounds, among others, that it violates the First Amendment and the dormant Commerce Clause of the United States Constitution and is preempted by other federal statutes addressing online child safety, including the Children’s Online Privacy Protection Act (COPPA).

Continue Reading The Future of the CA Age-Appropriate Design Code Act: What Remains, What’s Still Open to be Contested, and What Companies Must Consider for Minors’ Online Safety

In its press release on the Court of Justice of the European Union (CJEU) judgment of 10 February 2026 in Case C-97/23 P, the CJEU confirmed that the action brought by an organization against a Binding Decision of the European Data Protection Board (EDPB) is admissible.

With this decision, the CJEU has clarified that organizations have a right of direct appeal against binding decisions of the EDPB on which a national authority’s decision against them is based.

Continue Reading EDPB Binding Decisions Can Be Challenged Directly by Organizations Before EU Courts

PrivacyWorld is pleased to present the latest edition of SPB’s Advertising Media and Brands Updates.  Topics include:

  • AI and intellectual property
  • Agentic AI
  • US tax changes for prize promotions
  • EU Digital Networks Act
  • Anti-counterfeiting

Stay Ahead on Consumer Privacy News

Not a subscriber yet? Subscribe here to be among the first to receive timely updates on the fast-moving world of data privacy, security, and innovation—delivered straight to your inbox.

Looking for deeper insights and expert analysis? You can also subscribe here to our privacy attorneys’ marketing communications for thought leadership and rich content when you need a more comprehensive perspective.

In case you missed it, below are recent posts from Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

CalPrivacy Update: Shifting to Structural Compliance and Auditing

Towards a Contextual Concept of Personal Data Under the GDPR: the Commission Moves Forward, the EDPB and EDPS Push Back

A Timely Look at HR Data and AI Regulation Trends: Webinar Recording Available

Join Us at the IAPP Global Privacy Summit in Washington, DC!

Privacy compliance has entered a new phase—one defined not only by high-profile enforcement actions but by the growing expectation that organizations implement and maintain mature information governance programs capable of validating true, system-level technical compliance rather than merely projecting the appearance of it.  A spate of recent California enforcement actions makes clear that companies must be prepared to validate how privacy controls function across systems, platforms, and data flows, making thoughtful, system-oriented self-assessment an increasingly important tool for aligning policy commitments with operational reality—before regulators do it for them.  SPB helps clients self-assess, identify gaps and remediate issues under the cloak of privilege.

Continue Reading CalPrivacy Update: Shifting to Structural Compliance and Auditing

The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) adopted on 10 February 2026 a joint opinion (Joint Opinion 2/2026) on the European Commission’s Digital Omnibus initiative (described by the Commission as “a set of technical amendments to a large corpus of digital legislation, selected to bring immediate relief to businesses, public administrations, and citizens alike, and to stimulate competitiveness”).

Although both bodies welcome (and largely endorse) the Commission’s proposals set out in the initiative (subject to certain caveats), the opinion expresses marked unease with the proposed approach to redefining personal data, which would be recalibrated to align with the CJEU’s most recent interpretation of the concept (Case C-413/23 (EDPS v SRB)).

Continue Reading Towards a Contextual Concept of Personal Data Under the GDPR: the Commission Moves Forward, the EDPB and EDPS Push Back