In case you missed it, below are recent posts from Consumer Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

Online Safety in Digital Markets Needs a Joined Up Approach with Competition Law in the UK

China’s Didi Fined Over $1B by Chinese Data Regulators

Federal Privacy Legislation Advances in House

CPW’s David Oberly to Participate in Panel Presentation on Tsunami of BIPA Class Action Suits & Potential Impact of Recent Decisions on Employer Biometrics Practices at ABA Section of Labor & Employment Annual Conference

Artificial Intelligence (AI) and the Risk of Bias in Recruitment Decisions

On the Speaking Circuit – Tokyo/Shanghai Partner Scott Warren Joins Faculty of Cyber/Digital Crime Legal Forum

California Moves Closer to Enacting More Stringent Online Privacy Protections for Children

Is the Dutch Data Protection Authority’s Restrictive Approach to Legitimate Interests an Eccentricity or a Trend?

Watch Now – “The Perfect Storm: New Privacy Laws & the Cookie-less Future”

Federal Court Rejects Attempt to Impose FCRA Executive Liability on CEOs of CRAs

Federal Court Refuses to Dismiss Biometric Claims Brought by Trucker Against Facial Recognition Company

Federal Court Dismisses Colonial Pipeline Cybersecurity Litigation

FTC Emphasizes Commitment to Protection of Highly Sensitive Data

Federal and State Actions to Protect Robocall Invasion of Consumer Privacy

China Publishes New Measures and Draft Standard Contract on Data Export

There is increasing public pressure on internet companies to moderate content, particularly to tackle disinformation, harmful speech, copyright infringement, sexual abuse, automation and bias, terrorism and violent extremism. The new UK Online Safety Bill (introduced in the UK Parliament on 17 March 2022) is the British response to that public demand. Continue Reading Online Safety in Digital Markets Needs a Joined Up Approach with Competition Law in the UK

On Thursday, July 21st, the Cyberspace Administration of China fined Didi, China’s largest ride service, CNY 8 billion (approximately US$1.2 billion) for violations of its data privacy, data security and cybersecurity laws. The fine reportedly amounts to more than 4% of the company’s total revenue for last year. The regulator also fined the company’s Chairman (Cheng Wei) and President (Jean Liu) CNY 1 million (approximately US$150,000) each as the individuals responsible for the company’s violations. Regulators claimed that, since July 2015, Didi had collected nearly 12 million screenshots, 107 million pieces of passenger facial recognition data and more than 167 million records of location data, among other information, causing serious national security risks to the country’s critical information infrastructure and data security. Didi has posted on its social media account that it has ‘sincerely’ accepted the decision. It is reported that the government will now ease the restrictions it had placed on Didi, which included a ban on registering new users and the removal of its apps from online stores in China.

It should be noted that Didi initially listed on the New York Stock Exchange in June 2021, a move that was not well received by Chinese regulators, who launched a probe two days after the listing, a probe that included raids on the company’s facilities. China subsequently issued several regulations to quickly close loopholes in its cybersecurity/data protection legal regime, such as the Cybersecurity Review Measures, which require Internet platforms holding the data of more than one million Chinese individuals to pass a cybersecurity review before listing overseas. Didi subsequently delisted in June 2022. It is reported that this resolution may now pave the way for Didi to list in Hong Kong. (Note: the Hong Kong Stock Exchange is not considered “foreign”.)

This matter shows the importance of knowing what data you are collecting in China and ensuring compliance with local laws, many of which are new (such as China’s far-reaching Personal Information Protection Law and Data Security Law, both implemented in the Fall of last year). Not only can the fines and penalties be substantial, but the disruption of services during any investigation can be just as serious. This is especially true if the data collected may be deemed important to national, political or economic security.

SPB Partner Beth Goldstein also contributed to this post.

With the powerful Committee on Energy and Commerce having approved a comprehensive, bipartisan privacy bill by a vote of 53-2, the US House of Representatives is one step closer to approving historic privacy legislation after over a decade of debate. Before formally reporting the legislation to the full House, the Committee adopted a substitute amendment that addressed concerns that had been raised in Subcommittee a few weeks ago. Among other provisions, the substitute amendment included the following changes:

  • The amended ADPPA provides an explicit right for the California Privacy Protection Agency (“CPPA”) to enforce the law. This is likely in response to calls by California Governor Newsom and the CPPA itself this week to eliminate the bill’s would-be preemption of the California Consumer Privacy Act (including as amended by the California Privacy Rights Act) (“CCPA”). Notably, however, preemption of the CCPA remains.
  • The definition of “third party” has been amended to provide that affiliated companies are considered a single covered entity if consumers reasonably expect them to share information with one another.
  • The substitute amendment provides a number of additional changes with respect to targeted advertising, including:
    • The FTC has the authority to establish global privacy control or “unified opt-out mechanisms” to allow individuals to opt out from targeted advertising.
    • The ADPPA retains its ban on targeted ads to individuals under 17, and still considers information relating to such individuals as sensitive covered data, but has introduced a tiered knowledge approach with respect to an individual’s age.
    • Internet browsing history over time and across third party websites or online services is now considered sensitive data.
  • Sensitive covered data has been further expanded to include race, color, ethnicity, religion, and union membership. Video data as a category of sensitive covered data has been clarified to include information showing the video content requested or selected by users of consumer-generated media.

The leadership of the Committee appears to have found the sweet spot on the two major issues that have bedeviled legislators for years: how and to what extent to preempt state law, and the extent to which consumers can vindicate their rights through a private right of action. The substitute amendment, for example, shortened the period consumers must wait before suing over alleged privacy violations from four years to two years after the date of enactment. In addition, the substitute amendment limited forced arbitration agreements with respect to claims made by individuals facing domestic violence. With preemption and the private right of action now largely resolved, only a few additional minor issues, plus further changes to the arbitration provision, appear to stand in the way of likely bipartisan House passage of the bill in September, if not before the August recess begins.

David J. Oberly will participate in the panel presentation Biometrics Are Back! (And So Are the Lawsuits) at the American Bar Association’s 16th Annual Section of Labor and Employment Law Conference, the Section’s signature event of the year, from November 9-12, 2022, in Washington D.C. 

Program Description

Technology, and employers’ increasingly broad access to data and personal information concerning their employees, have continued to be at the forefront of the employer-employee relationship for two-plus years, since the wave of local and state shutdowns in March 2020 forced employers and employees to adapt to telework.

As some employers continue to rely on biometric technology to collect employee information—like fingerprints, retina scans, and facial geometry—others may be reconsidering the use of such technology due to the recent uptick in data privacy and biometrics litigation and the recent plaintiff-friendly decisions that followed. The recent settlements of biometrics litigation in California and Illinois and the multi-million dollar lawsuit filed by the Texas Attorney General provide ample evidence that biometrics are still a pressing issue. 

In addition, several states are moving to enact biometric privacy laws similar to the Illinois Biometric Information Privacy Act (“BIPA”), including Kentucky, Maine, Maryland, Massachusetts, and New York, among others. 

This panel will discuss the recent spike in class action litigation under BIPA and how recent decisions may impact how employer data collection and employee privacy policies are shaped going forward. The panel will also discuss the similarities and differences among the various privacy laws being considered in addition to the states that have already enacted such laws and the extent to which emerging laws in this area should shape the development of workplace privacy policies. 

For more information and to register, please visit the ABA 16th Annual Labor and Employment Law Conference event webpage.

As part of the UK data protection authority’s new three-year strategy (ICO25), launched on 14 July, UK Information Commissioner John Edwards announced an investigation into the use of AI systems in recruitment. The investigation will have a particular focus on the potential for bias and discrimination stemming from the algorithms and training data underpinning AI systems used to sift recruitment applications. A key concern is that training data could be negatively impacting the employment opportunities of those from diverse backgrounds.

Bias is a particular risk in AI or machine learning systems designed not to solve a problem by following a set of rules, but instead to “learn” from examples of what the solution looks like. If the data sets used to provide those examples have bias built in, then an AI system is likely to replicate and amplify that bias. For example, if successful candidates reflected in the training data share certain characteristics (such as gender, demographic profile or educational profile) then there is a risk of excluding candidates whose profiles do not match those criteria.
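The amplification mechanism described above can be illustrated with a deliberately simple sketch (the data and the naive “model” below are hypothetical, for illustration only): a system that learns from biased historical hiring outcomes ends up recommending candidates along exactly the same biased lines.

```python
# Illustrative sketch with hypothetical data: group "A" was historically
# favoured in hiring decisions, independent of merit.
# Each record is (candidate group, hired in the past?).
history = ([("A", True)] * 80 + [("A", False)] * 20
           + [("B", True)] * 30 + [("B", False)] * 70)

def hire_rate(records, group):
    """Fraction of past candidates from `group` who were hired."""
    outcomes = [hired for g, hired in records if g == group]
    return sum(outcomes) / len(outcomes)

def recommend(group):
    """A naive 'learned' rule: recommend candidates whose group's historical
    hire rate exceeds 50%. The rule has simply memorised the bias."""
    return hire_rate(history, group) > 0.5

print(hire_rate(history, "A"))            # 0.8
print(hire_rate(history, "B"))            # 0.3
print(recommend("A"), recommend("B"))     # True False
```

Real recruitment systems are far more complex, but the failure mode is the same: when protected characteristics (or proxies for them) correlate with past outcomes in the training data, the model reproduces and can amplify that correlation.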

The ICO also plans to issue refreshed guidance for AI developers on ensuring that algorithms treat people and their information fairly. However, even where algorithms and training data reflect ethical guidance, it will remain best practice to retain meaningful human involvement in decision-making. In effect, AI systems should produce recommendations for human review, rather than decisions. Under EU and UK GDPR Article 22, decisions based solely on automated processing, including profiling, which produce legal effects concerning the data subject or similarly significantly affect them, are restricted unless they are:

  • necessary for entering into or performance of a contract between an organisation and the individual;
  • authorised by law (for example, for the purposes of preventing fraud or tax evasion); or
  • based on the individual’s explicit consent.

The making or withholding of employment offers would clearly constitute legal or similarly significant effects.

Where special category personal data is involved, decisions based solely on automated processing are permissible only:

  • with the individual’s explicit consent; or
  • where the processing is necessary for reasons of substantial public interest.

In addition, because decisions based solely on automated processing are considered to be high risk, UK GDPR requires a Data Protection Impact Assessment (DPIA), showing that risks have been identified and assessed, and how they are addressed. From there, compliance obligations include:

  • giving individuals specific information about the processing;
  • taking steps to prevent errors, bias and discrimination; and
  • giving individuals rights to challenge and request a (human) review of the decision.

The ICO’s indication that investigating AI in the context of recruitment will be one of its priorities over the next three years is significant. AI and machine learning tools are an increasingly valuable resource, but they carry compliance obligations that are likely to come under intense scrutiny from the ICO as the UK’s data protection authority. To learn more, or to discuss the practicalities of compliance, please contact the authors.

On July 21st, from 4:15-4:40 pm (Singapore time), our Tokyo/Shanghai partner, Scott Warren, will be speaking at the ‘Singapore: Technology – Effects on Arbitration and Corporate Crime in SE Asia’ conference. His topic, ‘Cybersecurity and Digital Crime: Preparing for WHEN We Are Hacked’, will provide practical tips for effectively dealing with a multi-jurisdictional cybersecurity incident. The virtual webinar is free for in-house counsel and will be conducted in English.

To register, please contact: Bettina.yan@legalplus-asia.com.

For years now, California has led the way by setting the standard for privacy and data protection regulation in the United States. Recently— and as calls for greater controls over the addictive nature of social media grow louder—legislators in the Golden State have moved closer toward enacting a new, first-of-its-kind privacy law that would prohibit the development and utilization of “addictive” features by social media platforms. At the same time, state legislators also advanced a second bill that would put in place stringent online privacy protections for minors.

Businesses should monitor the progress of these bills closely, as their enactment—combined with an increased focus on children’s privacy by both federal lawmakers and the Federal Trade Commission (“FTC”)—may have a ripple effect in other states and municipalities, with legislators following close behind to enact similar children’s online privacy laws.

Continue Reading California Moves Closer to Enacting More Stringent Online Privacy Protections for Children

In case you missed it, below are recent posts from Consumer Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

Federal Court Dismisses Colonial Pipeline Cybersecurity Litigation

Federal Court Refuses to Dismiss Biometric Claims Brought by Trucker Against Facial Recognition Company

Federal Court Rejects Attempt to Impose FCRA Executive Liability on CEOs of CRAs

Watch Now – “The Perfect Storm: New Privacy Laws & the Cookie-less Future”

Is the Dutch Data Protection Authority’s Restrictive Approach to Legitimate Interests an Eccentricity or a Trend?

CPW’s Stephanie Faber Speaks at French Association of Personal Data Protection Correspondents Annual Meeting

Future Uncertain for the American Data Privacy and Protection Act

Online Webinar Now Available: Kristin Bryan and Kyle Fath Discuss AI and Biometrics Privacy Trends and Developments

Ninth Circuit District Court Finds No Standing for Alleged Lost Commercial and Proprietary Data in Privacy Litigation

FTC Signals Intention to Begin Rulemaking on Privacy and AI, Hints at Areas of AI Focus in Congressional Report

Heated Debate Surrounds Proposed Federal Privacy Legislation

Motion for Preliminary Approval of Accellion Data Breach Settlement Filed in California Federal Court

Following Court Win on BIPA Claims, FTC Ramps Up Investigation into Company’s Sharing of Biometric Data

Third Circuit Affirms Law Student’s Cyberstalking Plea, Holding Federal Criminal Cyberstalking Statute Does Not Violate Constitution

Fourth Circuit Grants Summary Judgment to Defendant in Driver Privacy Litigation

CPPA Holds First Public Meeting Following Publication of First Draft of Proposed Regulations and Initial Statement of Reasons

Congress Proposes Federal Privacy Legislation to Preempt Certain State Privacy Laws, Hearing Scheduled for Next Week

SEC Cyber Regulation Efforts: A Mid-Year Review

Federal Court Stays BIPA Litigation While Applicable Statute of Limitations is Still in Question

Updates to Automatic Renewal Laws with New Consent, Notice, and Cancellation Requirements in the United States and Germany

The EU Commission has expressed concerns about the Dutch data protection authority’s strict interpretation of “legitimate interests”, considering it to be “not in line with the GDPR, the guidelines of the Article 29 Working Party/EDPB and the case law of the European Court of Justice (CJEU)”. Those concerns focus on guidance issued by the Autoriteit Persoonsgegevens (“AP”) in 2019, stating that purely commercial purposes, such as maximization of profits, would not be considered a “legitimate interest”. Now, in addition to the EU Commission’s expression of concern, the Netherlands’ highest administrative court (the Council of State) is preparing to decide on the AP’s appeal in VoetbalTV. At stake is the consistent interpretation and enforcement of the GDPR across EU member states. As the EU Commission pointed out, it is of the utmost importance that national guidelines are in line with the case law of the CJEU and with the guidelines adopted at the European Data Protection Board (EDPB) level.

Some background

VoetbalTV is a social and video platform for amateur football. It makes video recordings of amateur football matches on behalf of football clubs and allows its members to interact and share information through an app. The AP found, on 16 July 2020, that VoetbalTV’s video recordings and their subsequent distribution and processing by analytics tools via this app (activities carried out without the consent of the data subjects, many of them minors) were in breach of Article 6(1) of the GDPR, and sanctioned VoetbalTV with a €575,000 fine.

The AP concluded that commercial interests cannot be regarded as legitimate interests because they lack an urgent “legal” character. On that view, the processing was unlawful outright, without any assessment of the necessity and proportionality of the processing.

The Central Netherlands District Court overturned the AP’s decision on the basis that the a priori exclusion of certain legitimate interests has been “specifically prohibited” by the CJEU in repeated case law (a ruling that has been appealed by the AP and on which the Council of State will be deciding in the coming months).

Deeper into the AP’s stance

According to the Central Netherlands District Court’s decision, the AP’s position is that “a legitimate interest is an interest that is designated as a legal interest in (general) legislation or elsewhere in the law. It must therefore be an interest that is also protected in law, that is considered worthy of protection and that in principle must be respected and can be ‘enforced’. For an interest to qualify as a legitimate interest, this interest must have a more or less urgent and specific character that follows from a (written or unwritten) legal rule or principle; it must in a certain sense be unavoidable that these legitimate interests are served. Purely commercial interests and the interest of profit maximization are not specific enough and lack an urgent ‘legal’ character, so that they cannot be regarded as legitimate interests.”

Some views on the legal concern

A move away from the AP’s restrictive view of legitimate interests would not necessarily change the outcome for VoetbalTV on the particular facts of their case. Even if purely commercial factors such as profit maximization were to be recognised as potentially legitimate interests, VoetbalTV’s interests might not be considered sufficiently compelling to override the rights and freedoms of data subjects. However, the broader implications of such a change would be significant for other organisations as it would mean that commercial interests could, subject to passing the balancing test, be established as a legitimate interest rather than being ruled out as a blanket matter by the AP’s current approach. As the EU Commission pointed out:

it also has to be borne in mind that the freedom to conduct a business, including pursuing pure commercial interests such as profit maximisation, is a human right enshrined in Article 16 of the Charter of Fundamental Rights of the European Union (EU Charter). Recital 4 of the GDPR underlines that the right to protection of personal data is not an absolute right and it has to be balanced against other fundamental rights, such as the freedom to conduct a business. The strict interpretation of the Dutch DPA does not allow an appropriate balance to be struck between the rights at issue, as the right to data protection is given precedence by virtue of the fact that certain interests rooted in the freedom to conduct a business are categorically considered illegitimate.

It is worth noting that the AP is not alone in its stance. The Portuguese data protection authority (CNPD) has also proven to be very rigid in respect of the scope of the processing activities that can be carried out on the basis of legitimate interests, endorsing only those specifically recognized in the law.

We look forward to the ultimate decision of the Council of State and, in particular, whether its views coincide with those of the AP or with the European Commission’s statement that “there is nothing in the jurisprudence of the CJEU which allows one to conclude that the CJEU is of the opinion that economic interests cannot be considered legitimate under Article 6(1)(f) GDPR”.