The Federal Trade Commission (FTC) has made it clear: data privacy and cybersecurity are now a priority, and will be for years to come. In the wake of PrivacyCon 2021, the FTC’s sixth annual privacy, cybersecurity and consumer protection summit, held this summer, the FTC took official and sweeping action on privacy and cybersecurity. In particular, the Commission recently designated eight key areas of focus for enforcement and regulatory action, three of which directly implicate privacy, cybersecurity, and consumer protection. Below, we discuss the FTC’s action and what it means for businesses, the three key areas of interest to consumer privacy that are now in the FTC’s spotlight, their relation to state privacy legislation, and their anticipated impact on civil litigation. Full details on PrivacyCon 2021 and the FTC’s resolutions following the summit can be found on the FTC’s website.

The FTC’s Actions and Areas of Focus

In mid-September, the FTC voted to approve a series of resolutions, directed at key enforcement areas, including the following, each discussed in further detail below:

  • Children Under 18: Harmful conduct directed at children under 18 has been a source of significant public concern; FTC staff will now be able to expeditiously investigate allegations in this important area.
  • Algorithmic and Biometric Bias: Allows staff to investigate allegations of bias in algorithms and biometrics. Algorithmic bias was the subject of a recent FTC blog post.
  • Deceptive and Manipulative Conduct on the Internet: This includes, but is not limited to, the “manipulation of user interfaces,” such as dark patterns, also the subject of a recent FTC workshop.

The approval of this series of resolutions will enable the Commission “to efficiently and expeditiously investigate conduct in core FTC priority areas.” Through the passage of the resolutions, the FTC has now directed that all “compulsory processes” available to it be used in connection with COPPA enforcement. This omnibus resolution mobilizes the full force of the FTC for the next ten years and gives FTC staff full authority to conduct investigations and commence enforcement actions in pursuit of this goal. The FTC has offered very little elaboration, however, on how it will use such “compulsory processes,” which include subpoenas, civil investigative demands, and other demands for documents or testimony.

What does seem clear, however, is that the FTC is buckling down on the enforceability of its own actions. Previous remarks by Chair Lina M. Khan before the House Energy and Commerce Committee expressed frustration at the frequent hamstringing of the agency at the hands of courts in its past enforcement efforts. With this declaration of renewed energy, the FTC is summoning all the power it can to do its job, and we should expect to see an energized FTC kick up its patrol efforts in the near future. Businesses whose activities implicate these areas of renewed focus should be aware of the FTC’s penchant for investigations and enforcement in them.

Children Under 18

The FTC’s mandate to focus on harmful conduct directed at children under 18 is a signal that the Commission plans on broadening and doubling down on its already active enforcement efforts in this area. Areas of the Commission’s prior and current focus on children include marketing claims, loot boxes and other virtual items that can be purchased in games, and in-app and recurring purchases made by children without parental authorization. Most importantly, the FTC is the main arbiter of children’s online privacy through its enforcement of the Children’s Online Privacy Protection Act (“COPPA”), but that law only applies to children under 13 (i.e., 12 and under). With this new directive to focus on children under 18, we can certainly expect the FTC to focus on consumer privacy issues, broader than COPPA, for children from ages 13 to 17 as well.

Algorithmic and Biometric Bias

The FTC already has enforcement capabilities to regulate the development and use of artificial intelligence (“AI”) and its associated algorithms. These include Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices”; the Fair Credit Reporting Act, which rears its head when algorithms impact lenders’ decisions to provide credit; and the Equal Credit Opportunity Act, which prohibits the use of biased algorithms that discriminate on the basis of race, color, sex, age, and other protected characteristics when making credit determinations. In using these tools, the FTC aims to clarify how algorithms are used and how the data that feeds them contributes to algorithmic output, and to bring to light issues that arise when algorithms don’t work or feed on improper biases.

Bias and discrimination arising from use of biometrics will also now be a focus of the FTC. Interestingly, much recent research and criticism has pointed out that algorithms and biometric systems are biased against faces of color. This has arisen in many contexts, from the iPhone’s FaceID feature to the 2020 remotely-administered bar exam that threatened to fail applicants of color because their webcams could not detect their faces. These are just some of the issues that arise when companies turn to algorithms to try to create heuristics in making business decisions. The FTC has not let these concerns go by the wayside, and after preliminarily addressing them in an April 2021 blog post, has now reestablished that algorithmic and biometric bias is a new focus for the upcoming years.

Notably, AI and other automated decision-making, particularly that which results in legal and/or discriminatory effects, will also become regulated under omnibus privacy legislation in California, Virginia, and Colorado, forthcoming in 2023.

Deceptive and Manipulative Conduct on the Internet (Including “Dark Patterns”)

The sinisterly-nicknamed practice of “dark patterns” happens constantly to online consumers, albeit in ways that tend to seem benign. For example, shoppers contemplating items in their cart may be pressured to complete the sale if they receive a notification like, “Hurry, three other people have this in their cart!” More annoyingly, online consumers who wish to unsubscribe from newsletters or email blasts may find themselves having to click through multiple pages just to free their inboxes, rather than being offered an easily-identifiable and quickly-accessible “unsubscribe” button. “Dark patterns” is the term coined for these sorts of techniques, which impair consumers’ autonomy and create traps for online shoppers.

Earlier this year, the FTC hosted a workshop called “Bringing Dark Patterns to Light,” and sought comments from experts and the public to evaluate how these dark patterns impact consumers. The FTC was particularly concerned with harms caused by these dark patterns, and how dark patterns may take advantage of certain groups of vulnerable consumers. The FTC is not alone in its attention to this issue; in March, California’s Attorney General announced regulations that banned dark patterns and required disclosure to consumers of the right to opt out of the sale of personal information collected through online cookies. These regulations also prohibit companies from requiring consumers who wish to opt out to click through myriad screens before achieving their goal. On the opposite coast, the weight-loss app Noom now faces a class action alleging deceptive acts through Noom’s cancellation policy, automatic renewal schemes, and marketing to consumers.

With both public and private entities turning their eyes toward dark patterns, the FTC has now declared the agency will put its full weight behind seeking out and investigating “unfair, deceptive, anticompetitive, collusive, coercive, predatory, exploitative, or exclusionary acts or practices…including, but not limited to, dark patterns…” Keeping an eye on this work will be important—just as important as keeping an eye on which cookies you accept, and which are best to just let go stale.

In addition to being in the crosshairs of the FTC, dark patterns are also a focus of regulators across the globe, including in Europe, and will be regulated under the forthcoming California Privacy Rights Act.

Anticipated Litigation Trends

With the FTC declaring its intent to vigorously investigate these three aforementioned areas, we now turn to what the agency’s new enforcement priorities mean for civil litigation. As practitioners in this field already know, the new priorities are unlikely to result in an influx of new litigation on their own. The FTC’s enforcement authority exists pursuant to Section 5(a) of the FTC Act, which outlaws “unfair or deceptive acts or practices in or affecting commerce” but does not contain a private right of action – so plaintiffs cannot bring new suits based on the new enforcement priorities, as they have no private right to enforce those priorities.

However, these areas of focus could influence broader trends in civil litigation, even if, on their own, they do not create any new liability. Successful enforcement actions by the FTC could bring about new industry standards with respect to algorithmic bias, dark patterns, and other areas of focus. These standards, in turn, could be cited in consumer privacy class action complaints. New civil actions could also stem from enforcement actions by the FTC and the information revealed in settlements resulting from such actions. For example, the FTC announced a settlement with fertility-tracking app Flo Health Inc. in January; this month, a consolidated class action complaint was filed against Flo Health, stemming from seven proposed class actions filed against it this year, alleging that the app unlawfully shared users’ health information with third parties.

Although the FTC’s new enforcement priorities seem ambitious, recent developments may impede its capability to bring enforcement actions in these areas. The agency was dealt a blow in April of this year, when the Supreme Court ruled in AMG Capital Mgmt., LLC v. FTC that the agency lacks power to seek monetary recovery under Section 13(b) of the FTC Act. Legislation to restore this power to the agency passed the House, but is awaiting a Senate vote. More recently, the Senate voted to advance the nomination of Rohit Chopra, currently a Democratic Commissioner, to lead the Consumer Financial Protection Bureau. The White House announced that President Biden will nominate Alvaro Bedoya, a privacy scholar with expertise in surveillance and data security, to fill Commissioner Chopra’s seat. As Commissioner, likely priorities for Bedoya include the FTC’s enforcement of various privacy laws, including the Fair Credit Reporting Act and the Gramm-Leach-Bliley Act, which could further impact litigations brought under those statutes.