The Federal Trade Commission (FTC) has released a staff report, Bringing Dark Patterns to Light, which discusses misleading and manipulative design practices—dark patterns—in web and mobile apps. These design choices take advantage of users’ cognitive biases to influence their behavior and prevent them from making fully informed decisions about their data and purchases. Dark patterns are employed to get users to surrender their personal information, unwittingly sign up for services, and purchase products they did not intend to buy. The consequences of dark patterns have been increasingly noticed in the regulatory and legislative sphere, both in the United States and Europe.

Continue Reading Dark Patterns under the Regulatory Spotlight Again

Dark patterns are top of mind for regulators on both sides of the Atlantic. In the United States, federal and state regulators are targeting dark patterns as part of both their privacy and traditional consumer protection remits. Meanwhile, the European Data Protection Board (EDPB) is conducting a consultation on proposed Guidelines (the “Guidelines”) for assessing and avoiding dark pattern practices that violate the EU General Data Protection Regulation (GDPR) in the context of social media platforms. In practice, the Guidelines are likely to have broader application to other types of digital platforms as well.

Continue Reading “Dark Patterns” Are Focus of Regulatory Scrutiny in the United States and Europe

This month, CPW’s Kyle Fath, Kristin Bryan, Christina Lamoureux & Elizabeth Helpling explained how data privacy and cybersecurity were Federal Trade Commission (“FTC”) priorities. As they wrote, there were “three key areas of interest to consumer privacy that are now in the FTC’s spotlight, as well as their relation to state privacy legislation and their anticipated impact to civil litigation.” One area of interest they identified was deceptive and manipulative conduct on the Internet (including so-called “dark patterns”). Today, the FTC announced that it is ramping up enforcement against illegal dark patterns that trick consumers into subscriptions. Read on to learn more about this announcement and what it means going forward.

First, some background. The term “dark patterns” refers collectively to manipulative design techniques that can impair consumer autonomy and create traps for online shoppers (think, for instance, of cancellation flows that require multiple clicks to unsubscribe). As CPW previously explained, “[e]arlier this year, the FTC hosted a workshop called ‘Bringing Dark Patterns to Light’ and sought comments from experts and the public to evaluate how dark patterns impact customers.” The genesis for this workshop was the FTC’s concern with the harms caused by dark patterns and how dark patterns may take advantage of certain groups of vulnerable consumers.

Notably, the FTC is not alone in its attention to this issue, as California’s Attorney General previously announced regulations that ban dark patterns and require disclosure to consumers of the right to opt out of the sale of personal information collected through online cookies. Dark patterns have also been targeted in civil litigation. This year, the weight-loss app Noom faced a class action alleging deceptive acts through Noom’s cancellation policy, automatic renewal schemes, and marketing to consumers.

Building off these prior developments, today the FTC announced a new enforcement policy statement “warning companies against deploying illegal dark patterns that trick or trap consumers into subscription services.” As the FTC cautioned, “[t]he agency is ramping up its enforcement in response to a rising number of complaints about the financial harms caused by deceptive sign up tactics, including unauthorized charges or ongoing billing that is impossible to cancel.”

As summarized in the FTC’s press release announcing this development, businesses going forward must follow three key requirements in this area or run the risk of an enforcement action (including potential civil penalties):

  • (1) Disclose clearly and conspicuously all material terms of the product or service: This includes disclosing how much a product and/or service costs, “deadlines by which the consumer must act to stop further charges, the amount and frequency of such charges, how to cancel, and information about the product or service itself that is needed to stop consumers from being deceived about the characteristics of the product or service.”
  • (2) Obtain the consumer’s express informed consent before charging them for a product or service: This means “obtaining the consumer’s acceptance of the negative option feature separately from other portions of the entire transaction, not including information that interferes with, detracts from, contradicts, or otherwise undermines the consumer’s ability to provide their express informed consent.”
  • (3) Provide easy and simple cancellation to the consumer: Marketers must also “provide cancellation mechanisms that are at least as easy to use as the method the consumer used to buy the product or service in the first place.”

This development is likely only one of many anticipated to be rolled out in light of the FTC’s continued focus on data privacy and cybersecurity. For more on this, stay tuned—CPW will be there to keep you in the loop.

As Rosa Barcelo, Matus Huba, Lucia Hartnett and Bethany Simmonds discuss in greater detail here, “[t]he European Data Protection Board (“EDPB”), a body with members from all EEA supervisory authorities (and the European Data Protection Supervisor), has recently established a taskforce to coordinate the response to complaints concerning compliance of cookie banners filed with several European Economic Area (“EEA”) Supervisory Authorities (“SAs”) by a non-profit organization NOYB. NOYB believes that many cookie banners, including those of ‘major’ companies, engage in “deceptive designs” and “dark patterns”. The EDPB taskforce is established in accordance with Art. 70(1)(u) of the GDPR, which states that the EDPB must promote the cooperation and effective bilateral and multilateral exchange of information and best practices between SAs. The aim of this taskforce is to harmonize and coordinate the approach to investigating and responding to cookie banner complaints from NOYB. It remains to be seen how this will actually be done in practice and whether EDPB will limit the harmonization to a procedural approach to the complaints, or whether it will also attempt to ensure consistent application of the underlying substantive rules.”

They provide a detailed analysis at the Security Privacy Bytes blog and comment that “the development of the taskforce could have a significant impact in streamlining the handling of the complaints it is set to investigate and could help companies better understand what is an acceptable pan-EU approach to cookie banners.”

The European Data Protection Board (“EDPB”), a body with members from all EEA supervisory authorities (and the European Data Protection Supervisor), has recently established a taskforce to coordinate the response to complaints concerning compliance of cookie banners filed with several European Economic Area (“EEA”) Supervisory Authorities (“SAs”) by a non-profit organisation NOYB. NOYB believes that many cookie banners, including those of ‘major’ companies, engage in “deceptive designs” and “dark patterns”.

Continue Reading EDPB Establishes Cookie Banner Taskforce, Which Will Also Look Into Dark Patterns and Deceptive Designs

Yesterday, Utah’s Social Media Regulation Act (“SMRA”) was signed into law by Gov. Spencer Cox.

The SMRA applies to businesses that provide a social media platform with at least five (5) million account holders worldwide. The definition of “social media platform” is broad but includes 24 exceptions that generally narrow the SMRA’s scope to a lay-person’s typical understanding of a social media platform.

It goes into effect on May 3, 2023, with numerous compliance requirements and prohibitions for social media platforms coming into force beginning March 1, 2024.

Continue Reading Utah’s Social Media Regulation Act Signed by Governor

Several months ago, you may have seen social media filled with artistic renditions of your connections as paintings, cartoons, or other artistic styles. These renditions came from Lensa, an app that lets users upload “selfies” or other photos, which it then processes to generate artistic images of the user. Lensa, which is owned by Prisma Labs, Inc., is the latest subject of a putative class action brought under the Illinois Biometric Information Privacy Act (“BIPA”).

In Flora, et al., v. Prisma Labs, Inc., No. 5:23-cv-00680 (N.D. Cal.), Plaintiffs—a group that includes a minor child—are residents of Illinois who used the Lensa app to create artistic images of themselves. Plaintiffs allege that they used Lensa in December 2022, after the app exploded in popularity in November 2022 due to the launch of the “magic avatars” feature, which requires users to upload at least eight images of themselves (and up to 20 images) to create artistic, stylized “avatars” of the user’s face. The app can also be used to upload images of others and create avatars based on those images. Plaintiffs allege that Lensa’s privacy policy as of December 2022 did not inform users that their facial geometry would be collected to create the avatars, and that several oblique references to Lensa’s use and processing of users’ images led users to believe that their biometric data is “anonymized” and does not leave the user’s device—which seemingly contradicts Lensa’s model of collecting users’ images and generating avatars based on those images. The Complaint also alleges that Lensa’s privacy policy temporarily disclosed that “face data” would be used to “train” its “neural network algorithms,” but that the provision was subsequently removed and never explained how that data would be protected or disclosed.

Based on the allegations in the Complaint, Plaintiffs seek to represent a class of “All persons who reside in Illinois whose biometric data was collected, captured, purchased, received through trade, or otherwise obtained by Prisma, either through use of the Lensa app or otherwise.” Plaintiffs bring seven causes of action under Sections 15(a), 15(b)(1), 15(b)(2), 15(b)(3), 15(c), 15(d), and 15(e) of BIPA, as well as an additional claim for unjust enrichment based on Lensa’s paid subscription service.

The Complaint also raises additional concerns about Lensa’s business model and methods of generating images. For example, upon downloading the app, a user is prompted to begin a seven-day trial subscription with Lensa; the Complaint alleges that the app uses dark patterns to prompt users to choose this option rather than closing out of it and declining the trial subscription. The Complaint also alleges that Lensa generates images using Stable Diffusion, an open-source AI model trained on over 2 billion images, including images that are protected by copyright. As alleged in the Complaint, the system could violate the intellectual property rights of artists who own the copyrights in the images used to train the AI model.

Flora is similar to past BIPA class actions brought against apps that allow users to virtually “try on” makeup, clothing, or other beauty items, as well as class actions brought against entities that use images to “train” AI models. Plaintiffs are represented by Loevy & Loevy, which notably prevailed in the first BIPA case to go to trial, Rogers v. BNSF Railway Company. Privacy World will continue to keep an eye on this case for you as it develops.

746 years. That is the total amount of prison time to which criminal defendants have been sentenced in consumer fraud cases the Federal Trade Commission (FTC) has referred to prosecutors over the past five years. Indeed, the FTC’s Bureau of Consumer Protection Criminal Liaison Unit highlighted these figures in its recently published Criminal Liaison Unit Report. Notably, the report emphasized the FTC’s growing enforcement concern over the use of deceptive negative option marketing (or dark patterns) and its aim to push egregious cases to prosecutors in the future. The Criminal Liaison Unit Report (the Report) is consistent with the FTC’s November 4, 2021 Enforcement Policy Statement Regarding Negative Option Marketing, and the Report outlines four key takeaways for companies going forward.

Continue Reading FTC Signals More Criminal Referrals for Negative Option Fraudsters

On October 17, 2022, the California Privacy Protection Agency (“CPPA” or “Agency”) published the Modified Text of Proposed Regulations (“Modified Regs”) and Explanation of Modified Text of Proposed Regulations (“Explanation of Modified Regs”). The CPPA’s review of the Modified Regs has been postponed; the Modified Regs are now scheduled to be considered at the October 28-29, 2022 public meeting.

Recall that earlier this year, on May 27, 2022, the CPPA published the first draft of the proposed CPRA Regs and initial statement of reasons. The Agency commenced the formal rulemaking process to adopt the Regs on July 8, 2022, and the 45-day public comment period closed on August 23, 2022. The comments submitted in response to the first draft of the Regs are available here.

Continue Reading Revised Proposed CPRA Regs To Be Considered At October 28, 2022 Meeting

In case you missed it, below are recent posts from Consumer Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

Passage of Federal Privacy Bill Remains Possible This Year, Remains a Continued Priority

Webinar Registration Open: Mitigating Cybersecurity Class Action Litigation Risks: Policies, Procedures, Service Providers, Notification, Damages

Kyle Fath appointed to Connecticut Privacy Legislation Working Group

FCC Adopts Rulemaking Proposal to Protect Consumer Privacy From Invasion by Unwanted Text Messages

Update on the California Privacy Protection Agency: Still No Date Certain for the CPRA Regulations

“Delaware Ruling Highlights Challenges Of Data Breach Biz Disputes” Article, Co-Authored by CPW’s Kristin Bryan, Jesse Taylor and Caroline Dzeba, is Published on Law360

Third Circuit Announces Standard for Determining Accuracy of Credit Reports Under FCRA

2023 State Privacy Laws: How to Assess and Ensure Readiness by Year-end

Malcolm Dowden and Niloufar Massachi Discuss Vendor Contracting Requirements Under New US Privacy Laws and the GDPR

New topic for EDPB’s coordinated enforcement action: the DPO

Dark Patterns under the Regulatory Spotlight Again

CPW’s Shea Leitch and Kyle Dull to Speak at ACC South Florida’s 12th Annual CLE Conference

CPW’s David Oberly Examines Recent Major Changes to Consumer Privacy Legal Landscape in Latest Issue of the Cincinnati Bar Association’s CBA Report Magazine

CPW’s Kristin Bryan Discusses Session Replay Software Litigation Trends With The Seattle Times

Office of Management and Budget Takes Action to Enhance the Security of Software Supply Chain

CPW’s Kristin Bryan, Jesse Taylor and Shing Tse Co-Author Chapter for Lexis Practical Guidance on Privacy, Cybersecurity and Data Breach Litigation: Key Laws and Considerations

Data Protection and Digital Information Bill Delayed – Aspects to Consider While We Wait

CPW’s David Oberly Analyzes the FTC’s Largest FTC Contact Lens Rule Settlement to Date in Law360