EU-U.S. Data Privacy Framework
On July 10, 2023, U.S. businesses received some welcome news: the U.S. had fulfilled the European Commission’s requirements for the EU-U.S. Data Privacy Framework (DPF), a new mechanism for cross-border transfers of personal data from the European Economic Area to the U.S. This announcement came almost three years to the day after the EU-U.S. Privacy Shield framework was invalidated by the Court of Justice of the European Union in Case C-311/18, known as “Schrems II”.
In brief, the DPF allows a U.S. business that has completed the self-certification process to undertake trans-Atlantic transfers of personal data regulated by the General Data Protection Regulation (GDPR) without the need for Standard Contractual Clauses or Binding Corporate Rules. DPF certification also means a certified business can dispense with the written risk assessment – known as a transfer impact assessment – analyzing the impact on privacy when personal data is transferred from the EEA to a jurisdiction outside the EEA that is not deemed ‘adequate’ by the European Commission. The UK/Gibraltar extension to the DPF (UK Extension) and the Swiss-U.S. Data Privacy Framework (Swiss DPF) were concurrently developed for personal data transfers to the U.S. from the UK and Gibraltar and from Switzerland, respectively. On September 21, 2023, the UK government announced its approval of the adequacy decision for the UK Extension, effective October 12, 2023. While certification to the Swiss DPF has been available since July 17, 2023, a certified business cannot rely on the Swiss DPF until the Swiss government issues its official adequacy decision (which it had not done as of September 25, 2023).
On July 17, 2023, the U.S. Department of Commerce (DoC) launched the Data Privacy Framework Program website through which U.S. businesses can complete the DPF certification process (DPF Website). The certification process requires a U.S. business to: (i) commit to adhere to seven Principles and sixteen Supplemental Principles (collectively, the Principles), (ii) develop and publish privacy policies explaining how the business protects privacy on par with European privacy laws, and (iii) implement policies and procedures designed to ensure that the business keeps its commitment to the Principles, including ongoing audits, contracts governing personal data sharing, procedures for honoring privacy rights requests, and dispute resolution procedures.
Time is of the essence for a business that maintained an active certification under the EU-U.S. Privacy Shield Framework (Privacy Shield). A Privacy Shield-certified business is automatically part of the DPF – as long as the business’ privacy policies and procedures are updated to reflect the Principles by October 10, 2023, or, for the Swiss DPF, by October 17, 2023. A business listed as “Inactive” on the DPF Website – whether because the business withdrew or did not complete the annual re-certification – can use its DoC account to complete the DPF certification process.
Members of our Data Privacy, Cybersecurity & Digital Assets Practice have been hard at work helping U.S. businesses prepare for and complete the DPF certification process. Answers to common questions our clients have asked since the DPF Website was launched are available here. We welcome your questions, too – contact us online. We expect to add new and updated FAQs in the coming weeks, so please check back.
Privacy/Biometric Litigation
The United States does not have comprehensive federal privacy legislation with a private right of action. Rather, a patchwork of federal and state sectoral laws regulates the collection, processing, disclosure, and security of personal information (PI), depending on the industry of the organization, the nature of the data in question, and other criteria. The net effect of this patchwork system is that data privacy and biometric litigation is constantly in a state of flux. Click through for an overview of privacy and biometric litigation in the US. Our data privacy and cybersecurity litigators stay abreast of these developments to maintain a complete understanding of the most up-to-date arguments and strategies to defeat data privacy and biometric class actions and related disputes.
Privacy – US
The US lacks an omnibus data protection regime; data privacy and security law in the US is a patchwork of federal and state laws, regulations, and industry self-regulatory programs, enforced by a myriad of authorities and organizations. Data privacy and security is increasingly regulated in the US, with several new paradigm-shifting state laws having recently gone into effect or becoming effective in 2023, and a half dozen more going into effect in 2024 and 2025. Click through for an overview of data regulation in the US. The US team of our Global Data Practice helps clients understand the evolving legal landscape, develop and operationalize information governance, risk and compliance programs, and protect and maximize the value of their data and digital assets.
Privacy – Europe
In the European Union (EU), the legal framework for privacy and data protection centers around the General Data Protection Regulation (GDPR) and the Directive on Privacy and Electronic Communications (ePrivacy Directive, also known as the “Cookie Directive”).
Both the GDPR and the ePrivacy Directive (as implemented at national level) apply to the European Economic Area (EEA), which includes all 27 EU Member States, as well as Iceland, Liechtenstein and Norway.
Please note that although the GDPR and ePrivacy Directive do not apply in Switzerland, Swiss laws are in the process of being harmonized with the legislative requirements of the GDPR and the ePrivacy Directive.
Following the UK’s departure from the EU, the GDPR has been transposed into UK law (please see ‘UK GDPR’ below). The UK has additionally implemented the ePrivacy Directive through the Privacy and Electronic Communications Regulations (PECR). While the obligations stemming from the GDPR and the UK GDPR are nearly identical, it remains to be seen whether the UK will eventually deviate from the EU data protection rulebook to pursue its own regulatory path.
Privacy – Asia-Pacific
The challenge for anyone doing business in the Asia Pacific region is the ever-expanding number of countries introducing data privacy/cybersecurity requirements, some with significant penalties for failure to comply. It would be one thing if these laws aligned perfectly with the GDPR, but each seems to have its own flavor, unique requirements and purpose. Several impose standard GDPR-style obligations, such as data subject notifications, consent requirements, and retention and security requirements. But several have unique features, such as:
- The lack of legitimate interests as a legal basis for processing, such as in China, Vietnam and India
- Broader restrictions on outbound transfers of personal data
- Local language specifications for notices and consents, as well as local representative requirements
- Heightened concerns over collection of national identification information
- The application of data privacy laws to the personal data of citizens living abroad
- Jurisdiction-specific definitions of what constitutes a data breach and when, if, and to whom it is notifiable
Below, we have prepared a comparison of the regional data privacy/cybersecurity laws across a set of consistent categories, such as:
- Obligations on collecting/handling/transporting data
- A data subject’s right to query/modify
- Cross-border obligations
- Breach notification requirements
- Penalties
We have also included whether a jurisdiction allows discovery and/or class action litigation, as that can factor into risk considerations.
Artificial Intelligence
After OpenAI introduced ChatGPT on November 30, 2022, a deluge of information about artificial intelligence (AI) was released, together with predictions about how AI will transform the world. The OECD AI Policy Observatory (OECD.AI)* reports that at least 69 countries have undertaken AI policy initiatives.
But what exactly is AI? The precise meaning of “AI” is still the subject of debate. AI has evolved as a catch-all term for a broad range of automated decision-making systems – from algorithms that process large amounts of data to achieve a specific outcome far faster than a human could complete the same task, to so-called artificial general intelligence (AGI), a man-made intelligence that is indistinguishable from the human mind. Most experts agree that AGI is still out of reach, but between task-specific algorithms and AGI are increasingly powerful AI systems ‘trained’ on large data sets to draw inferences and achieve outcomes – whether intended or unintended.
The large data sets on which current AI systems are developed, trained and tested invariably include personal data. These large data sets combined with increasingly powerful computing capacity create privacy compliance risks: the risk of re-identification of de-identified data; the complexity of honoring privacy rights after the AI has already ‘learned’ from personal information in training data sets; the illusion of meaningful consent for complex AI that learns to process personal data in new ways and for different purposes than originally explained to individuals; uncertainty about when a privacy risk assessment is needed; and the attractiveness of large data sets to hackers and other bad actors.
In addition to these privacy risks, the proliferation of AI systems has raised numerous other legal concerns: intellectual property protection; biased and discriminatory outcomes from AI system use; safety and reliability of AI systems; competition; and national security. AI also raises ethical concerns about AI system control, accountability and surveillance. While most governments around the world agree on the need to regulate AI system development and use, their approaches differ. AI system developers, operators and end-users face the challenge of navigating a complex legislative framework spanning multiple jurisdictions.
We are curating our AI content in this InfoCenter to make it easier to find what you want. We also welcome your questions and topic ideas – contact us here.
*OECD.AI (2021), powered by EC/OECD (2021), database of national AI policies, accessed on October 20, 2023, https://oecd.ai.