Context

Businesses are under pressure from a range of internal and external stakeholders to create and maintain genuinely diverse and inclusive workplaces. Consequently, more and more businesses want to collect and track Diversity and Inclusion (“D&I”) data about their staff. This may include information about gender, sexual orientation, race, ethnic origin, religion, socio-economic background, health, and disability. This information may help organizations better understand the current profile of their workforce, assess the impact of their equal opportunities policies, determine what steps they may need to take to address any barriers to change, and measure progress against any objectives or targets they have set.

However, in some countries, the collection and tracking of such data are regulated by various laws, and it may be socially and culturally inappropriate to ask certain questions in this area.

In France, various regulations and case law, including the EU General Data Protection Regulation (“GDPR”), restrict the collection of such data. There is particular sensitivity around origin/race/ethnicity data, as notably illustrated by a decision of the French Constitutional Council of November 15, 2007, which censured the collection of such data in this context.

Draft recommendation

To guide organizations wishing to implement diversity measurement surveys, the CNIL is submitting a recommendation for public consultation until September 13, 2024 (the “Draft Recommendation”).

It notably includes GDPR-specific recommendations that were not in the guide “Measuring to progress towards equal opportunities” that the CNIL had published with the Defender of Rights twelve years ago (the “Guide”).

The recommendation addresses the following issues in relation to diversity surveys.

Continue Reading Measuring Diversity at Work in France: the CNIL Launches a Public Consultation on a Draft Recommendation

Malaysia’s Personal Data Protection Act (PDPA) was enacted in 2010 and came into force in November 2013, making Malaysia the first country in the Association of Southeast Asian Nations (ASEAN) to enact comprehensive privacy legislation.

On July 31, 2024, the Personal Data Protection (Amendment) Bill 2024 (PDP Bill) was passed by the Dewan Negara (Malaysia’s Senate). It is expected to receive royal assent and thereafter come into force on a date to be appointed by the Minister of Digital by notification in the Gazette.

The PDP Bill introduces significant amendments to the PDPA, including specific definitions, new obligations on data controllers and stricter penalties for non-compliance. These amendments align the PDPA with internationally recognized standards, positioning Malaysia alongside its regional peers in Asia-Pacific, including Singapore, Indonesia, the Philippines, Thailand and Vietnam.

According to Malaysia’s Digital Minister, Gobind Singh Deo, these changes are driven by rapid technological advancements that necessitate society’s reliance on digital platforms for business, coupled with an expectation of protection. His comments come in response to a recent rise in complaints regarding the misuse and breach of personal data, an increase in personal data breaches, and a growing number of online fraud cases.

We outline below key changes brought about by the PDP Bill and its impact on businesses:

Continue Reading Malaysia Pushes Out Groundbreaking Amendment to Personal Data Protection Act – Impact on Businesses

In a move that will be unwelcome to plaintiffs’ lawyers, Illinois has enacted an amendment to its biometrics privacy law – the Biometric Information Privacy Act (“BIPA”) – providing that when a private entity, in more than one instance, discloses, rediscloses, or otherwise disseminates the same biometric identifier or biometric information from the same person to the same recipient using the same method of collection, without the required prior notice and written release, it commits only a single violation for penalty calculation purposes, regardless of the number of times the data was disclosed, redisclosed, or otherwise disseminated.  This will significantly reduce the potential damages and lower the settlement value of BIPA claims.

The amendment also provides that an e-signature satisfies the written requirement for the release.  “Electronic signature” means “an electronic sound, symbol, or process attached to or logically associated with a record and executed or adopted by a person with the intent to sign the record,” thus clarifying that online “clickwrap” releases suffice.

This amendment follows previous failed attempts at similar reforms to stem the flood of BIPA class action litigation that has plagued companies that implemented fingerprint time clocks or other biometric fraud and security measures without strictly complying with BIPA.  Colorado recently enacted a BIPA-like biometrics law, but unlike Illinois’ law, and like the biometrics laws of every other state, it does not include a private right of action and can be enforced only by the state.  States are nonetheless active in enforcing their privacy laws, as illustrated by a recent Texas settlement with a social media company over biometric consent claims that included a nine-figure civil penalty payment.

For more information, contact the author or your SPB relationship lawyer.


Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only and is not intended to constitute or be relied upon as legal advice.

Regulators in states without omnibus state privacy laws, like New York, are staking their claim over privacy regulation and enforcement. After months of investigating the deployment of tracking technologies and privacy controls on various websites, the New York State Attorney General (“NY AG”) published its guidance, Website Privacy Controls: A Guide for Business. The NY AG also published a companion guidance for consumers, A Consumer Guide to Web Tracking, which provides a high-level overview of how websites track consumers and what steps consumers can take to protect their privacy. Stay tuned for potential enforcement actions and big-figure settlements. Will New York follow Texas in this regard?

NY AG Investigation and Findings

Tracking technologies, like cookies and tags (i.e., pixels), are used by businesses to collect and assess information about how individuals interact with the business’s website or mobile app. While tracking technologies can provide valuable insights for businesses, they also raise privacy concerns, including around data collection, selling, and sharing; the creation of detailed profiles about individuals for use in targeted advertising; and cross-site tracking that builds a comprehensive picture of an individual’s interests and behavior without the individual’s knowledge or consent.  The Federal Trade Commission (“FTC”) is pursuing Magnuson-Moss rulemaking in this area under the FTC Act, targeting what it calls “commercial surveillance.”

Continue Reading Businesses Beware: New York Eyeing Privacy Regulation and Enforcement Even Absent Omnibus State Privacy Law

Six years after the Brazilian General Data Protection Law (Lei Geral de Proteção de Dados Pessoais (LGPD)) was enacted, and four years after it entered into force, the Brazilian Data Protection Agency (Autoridade Nacional de Proteção de Dados (ANPD)) issued, on July 17, 2024, a regulation developing the LGPD and clarifying the regulatory framework for Data Protection Officers (DPOs) in Brazil (ANPD Resolution No. 18/2024, the “Resolution”).

Article 41 of the LGPD establishes that data controllers must appoint a data protection officer (DPO), details the DPO’s main responsibilities, and requires that the DPO’s identity be made public. It also invites the ANPD to establish complementary rules on the definition and duties of the person in charge, including cases of exemption from the appointment requirement, depending on the nature and size of the entity or the volume of its data processing operations.

Continue Reading New ANPD Resolution on the Statute of Data Protection Officers in Brazil

Shortly after the publication of the Artificial Intelligence (AI) Act, the EU Commission published the AI Pact’s draft commitments with a view to anticipating compliance with the high-risk requirements for AI developers and deployers.

Publication and timeline for the AI Act

The EU AI Act was published in the Official Journal of the European Union on July 12, 2024, as “Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonized rules on artificial intelligence.”  We have presented the main provisions and purposes of the AI Act in our publication here.

The EU AI Act will enter into force across all 27 EU Member States on August 1, 2024, but its provisions apply after transition periods of varying length depending on the relevant part of the AI Act: prohibited AI practices must be withdrawn from the market by February 2, 2025, and enforcement of the majority of its provisions commences on August 2, 2026.

The EU Commission’s call for participation in the AI Pact

In this context, the EU Commission issued a press release on July 22, 2024, promoting the “AI Pact”, seeking the industry’s voluntary commitment to anticipate the AI Act and to start implementing its requirements ahead of the legal deadline.  The press release can be found here.

The AI Pact was first launched in November 2023, obtaining responses from over 550 organizations of various sizes, sectors, and countries.

The AI Office has since initiated the development of the AI Pact, which is structured around two pillars:

Continue Reading The EU Commission’s Draft AI Pact anticipating compliance with newly published AI Act

In case you missed it, below are recent posts from Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

FCC to Consider Formal Rules for Use of Artificial Intelligence and Robocalls | Privacy World

California Privacy Regs Advance But Vote on Drafts Delayed | Privacy World

New CCPA Regs Prepared for Public Comment | Privacy World

A data processing agreement is not always enough. | Privacy World

Rhode Island Makes it an Even 20 | Privacy World

Building on last year’s Notice of Inquiry, the Federal Communications Commission (“FCC” or “Commission”) is poised to consider a draft Notice of Proposed Rulemaking (“NPRM”) at its August 7, 2024 Open Meeting that would further address the use of artificial intelligence (“AI”) in generating automated calls. Specifically, the FCC proposal would “define AI-generated calls and propose new rules that would require callers to disclose to consumers when they receive an AI-generated call.”  The background framework for the agency’s proposed actions is the consent and other relevant requirements of the Telephone Consumer Protection Act of 1991 (“TCPA”), which the FCC is responsible for implementing.

The Commission has previously declared that AI technologies that generate human voices are covered by the TCPA, and it has already responded to potential harms associated with the use of AI in automated calling.

In releasing the draft NPRM, Chairwoman Jessica Rosenworcel, who has prioritized the issue of robocall regulation, commented that in light of those potential harms, “[w]e want to put in place rules that empower consumers to avoid this junk and make informed decisions.” To that end, the NPRM would now seek comment on the following key proposed rules:

Continue Reading FCC to Consider Formal Rules for Use of Artificial Intelligence and Robocalls

In case you missed it, below are recent posts from Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

California Privacy Regs Advance But Vote on Drafts Delayed | Privacy World

New CCPA Regs Prepared for Public Comment | Privacy World

A data processing agreement is not always enough. | Privacy World

Rhode Island Makes it an Even 20 | Privacy World

We reported earlier that at the July 16th California Privacy Protection Agency (CPPA) Board meeting, the Board would be considering a rulemaking package that staff prepared further to the Board’s vote and direction in March.  Copies of those documents are here.  At the July 16th Board meeting the staff presented on those documents and reported that it was still working on the required Standardized Regulatory Impact Assessment (SRIA), which must be approved by the California Department of Finance before publication for public comment and the commencement of the formal rulemaking process.  The Board also debated the substance of the draft rules but did not vote on them.  The Board asked staff to make certain alternatives to the drafts clear in the call for public comments, most notably whether risk assessments for processing that results in consequential decision-making should be required for all such processing or only for processing using automated decision-making (ADM) technologies.  Board Member Mactaggart raised several concerns about the current drafts, including:

Continue Reading California Privacy Regs Advance But Vote on Drafts Delayed