For years now, California has led the way by setting the standard for privacy and data protection regulation in the United States. Recently, as calls for greater controls over the addictive nature of social media grow louder, legislators in the Golden State have moved closer to enacting a new, first-of-its-kind privacy law that would prohibit social media platforms from developing and deploying “addictive” features. At the same time, state legislators also advanced a second bill that would put in place stringent online privacy protections for minors.

Businesses should monitor the progress of these bills closely, as their enactment—combined with an increased focus on children’s privacy by both federal lawmakers and the Federal Trade Commission (“FTC”)—may have a ripple effect in other states and municipalities, with legislators following close behind to enact similar children’s online privacy laws.

Continue Reading California Moves Closer to Enacting More Stringent Online Privacy Protections for Children

Last week, the Federal Trade Commission (“FTC”) held an open meeting focused on issues related to children’s privacy and the use of endorsements and testimonials in advertising. In the meeting, the FTC adopted a new policy statement targeting data collection practices in educational technology. Further, the FTC proposed amendments to the Guides Concerning the Use of Endorsements and Testimonials in Advertising (“Endorsement Guides”) that would target child-directed marketing. Of note, one amendment would recognize that children may react to advertising practices differently than adults, and that the FTC may therefore treat advertising practices directed at children differently than those directed at adults. Continue Reading FTC Targets Children’s Privacy and Stealth Advertising Directed at Children

The Federal Trade Commission (“FTC”) announced its next open meeting will focus on issues related to children’s privacy and those pertaining to the use of endorsements and testimonials in advertising. Continue Reading FTC to Discuss Children’s Privacy, Endorsement Guides at Next (Virtual) Open Commission Meeting: May 19, 2022, 1PM ET

As readers of CPW know, the Federal Trade Commission (“FTC”) has made it clear that privacy and security will be top-of-mind issues for the Commission for the foreseeable future. Recently, the FTC announced its settlement with WW International, Inc.—formerly known as Weight Watchers (“Weight Watchers”)—over claims the company violated the Children’s Online Privacy Protection Act (“COPPA”) by collecting children’s personal information without providing notice or obtaining parental consent.

The settlement requires the company to pay a $1.5 million penalty, delete personal information that was improperly collected from children, and destroy any models or algorithms developed with the use of that data. Importantly, the settlement illustrates the FTC’s increased focus on children’s privacy, as well as the Commission’s increased reliance on the disgorgement remedy in privacy and security enforcement actions—including in the AI context.

I.     Factual Background & FTC Allegations

By way of background, COPPA requires that websites, apps, and online services that are child-oriented or knowingly collect personal information from children notify parents and obtain their consent before collecting, using, or disclosing personal information from children under 13. It was passed in 1998 amid rising concerns regarding children’s privacy online. Unlike some other federal regulatory regimes, both the FTC and state attorneys general have concurrent jurisdiction to enforce COPPA (meaning, as a practical matter, private entities face potential regulator scrutiny at both the state and federal level for alleged COPPA violations).

Weight Watchers marketed a health and wellness app and website to both adults and children that allowed users to track their food intake, activity, and weight. The app also collected personal information, including names, email addresses, and birth dates. Up until late 2019, users could sign up for the app by indicating (1) they were a parent registering their child or (2) a child over the age of 13 signing up for themselves.

Weight Watchers’ registration flow presented a non-neutral age gate, signaling to younger users that they could sign up without a parent simply by falsely claiming to be at least 13. Indeed, hundreds of users circumvented the age gate by creating an account and later revising their profiles to reflect their true age. Despite this, these users were still permitted to access the app without parental involvement. Further, while the company implemented a new age gate in late 2019 that removed any reference to being “at least 13” and indicated that individuals under the age of 13 needed parental permission to use the app, Weight Watchers’ screening mechanism still failed to ensure that users who selected the parent signup option were truly parents—and not children attempting to bypass the age restriction.
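For context, a “neutral” age gate simply asks for a date of birth without hinting at the cutoff, and routes under-13 users into a parental consent flow rather than silently rejecting them (which invites a retry with a fake age). A minimal sketch in Python follows; the function names and flow are our own illustration, not Weight Watchers’ actual implementation or a compliance template:

```python
from datetime import date
from typing import Optional

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

def compute_age(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    age = today.year - birth_date.year
    # Subtract a year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        age -= 1
    return age

def screen_registration(birth_date: date, today: Optional[date] = None) -> str:
    """Neutral age gate: asks only for a birth date and never hints
    that "at least 13" is the qualifying answer.

    Under-13 users are routed to a parental-consent flow instead of
    being blocked outright, which would invite a retry with a fake age.
    """
    today = today or date.today()
    if compute_age(birth_date, today) >= COPPA_AGE_THRESHOLD:
        return "allow"
    return "require_parental_consent"
```

A production flow would also need to remember rejected devices or sessions so a child cannot simply reload the form and enter a different birth date.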

According to the FTC, Weight Watchers violated COPPA by failing to provide a mechanism to prevent children from using the parent registration option to bypass the age restriction, and by failing to comply with COPPA’s notice and data retention provisions.

II.     The Settlement Terms and Key Takeaways

The Weight Watchers settlement consists of three primary components, all of which carry significant implications for potential FTC enforcement actions going forward.

  • First, the company must pay a $1.5 million penalty.
  • Second, the company must destroy all personal information that was collected in a manner that failed to comply with COPPA.
  • Finally, the company must destroy all models or algorithms developed in whole or in part using improperly collected personal information.

     A.     FTC’s Continued Focus on Children’s Privacy 

There are three major takeaways from the Weight Watchers settlement. The first pertains to the FTC’s increased activity in the children’s privacy space. The Weight Watchers settlement comes on the heels of several other FTC enforcement actions against companies that ran afoul of COPPA. In December 2021, advertising platform OpenX Technologies agreed to pay a $2 million penalty to resolve similar FTC allegations that it collected children’s personal information without parental consent. And in July of last year, online coloring book app Kuuhuub agreed to a $3 million penalty to settle COPPA allegations as well.

Relatedly, during his State of the Union address President Joe Biden urged Congress to strengthen children’s privacy protections and clamp down on companies that improperly collect children’s personal information.

Taken together, companies that market their online products or services to children—or otherwise collect children’s personal information—are well-advised to review their compliance with COPPA’s requirements to mitigate the heightened legal risk posed by the FTC’s increased emphasis on children’s privacy.

     B.     Utilization of Disgorgement Remedy

The second major takeaway pertains to the requirement that Weight Watchers destroy any models or algorithms developed through the use of personal information that was improperly collected from minors in violation of COPPA.

Importantly, the Weight Watchers matter marks the first time that the FTC has utilized this enforcement tool—known as disgorgement—in a COPPA case. This is part of a larger shift by the FTC to prioritize “meaningful disgorgement” as a remedy in privacy and security enforcement actions. The FTC first used disgorgement in its settlement with photo developer Everalbum, Inc., the agency’s first enforcement action specifically targeting improper facial recognition practices. As part of the settlement, Everalbum was forced to delete not only all photos and other user data that had been improperly collected and/or retained, but also all facial recognition algorithms that were developed with Everalbum’s ill-gotten data.

Shortly after the Everalbum settlement—during remarks at the 2021 Future of Privacy Forum—the FTC’s then-Acting Chairwoman, Rebecca Kelly Slaughter, noted that where companies unlawfully collect and/or use consumers’ personal information, the FTC would seek disgorgement of both the improperly collected data, as well as any benefits from that data—pointing to Everalbum as an example of how the FTC could leverage disgorgement in privacy and security matters.

     C.     Algorithmic Disgorgement As New Normal In Near Future?

Third, the Weight Watchers settlement not only represents a continuation of the disgorgement remedy trend in FTC enforcement actions, but also indicates that algorithmic disgorgement may soon become a standard component in future FTC settlements. This may have a particularly outsized impact on developers of artificial intelligence and related technologies which rely heavily on the development of advanced algorithms.

This settlement is yet another example of the FTC’s focus on the impact AI can have in relation to consumer privacy and related issues.  In December the FTC issued a notice (“Notice”) that it was “considering initiating a rulemaking under Section 18 of the FTC Act to curb lax security practices, limit privacy abuses, and ensure that algorithmic decision-making does not result in unlawful discrimination.”

The Notice previews a range of privacy, cybersecurity, and AI issues that the FTC may seek to regulate, should internal disagreement at the agency not stall the effort in 2022.  For instance, as seen in an April 2021 release, the FTC has increasingly cautioned that AI may “inadvertently introduc[e] bias or other unfair outcomes” in medicine, finance, business operations, media, and other sectors.  In addition, the FTC declared algorithmic and biometric bias a focus of enforcement in resolutions passed in Fall 2021.

For more on this, stay tuned.  CPW will be there to keep you in the loop.

On Friday, February 25, 2022, the Utah Senate unanimously passed SB 227, or the Utah Consumer Privacy Act.

Controllers and Processors Beware

SB 227 is an omnibus privacy bill that shares similarities with the Virginia Consumer Data Protection Act and the Colorado Privacy Act.  For instance, the bill imposes different obligations on a covered business depending on whether the business is acting as a controller (one who determines the purposes for processing data, alone or in coordination with others) or processor (one who processes data on behalf of a controller).

Controllers are responsible for transparency, purpose specification, and data minimization.  They must also obtain the consumer’s consent for any secondary uses, and must honor consumer rights (generally within 45 days of receipt of the consumer’s request).  Controllers are also responsible for safeguarding data privacy and security, non-discrimination, non-retaliation, and non-waiver of consumer rights.  Controllers are prohibited from processing certain data qualifying as “sensitive data” without first presenting the consumer with clear notice and providing an opportunity to opt-out of processing.

Processors must follow a controller’s instructions and must enter into a contract that incorporates certain enumerated requirements (e.g., requirements pertaining to duty of confidentiality and data privacy and security safeguards) before processing data on behalf of the controller.

Applicability

The bill applies to:

  1. Businesses that (a) conduct business in Utah or produce a product or service targeted to consumers who are Utah residents; (b) have annual revenue of $25,000,000 or more; and (c) satisfy one or more of certain enumerated thresholds (e.g., controlling or processing the personal data of 100,000 or more consumers, or deriving over 50% of gross revenue from the sale of personal data);
  2. “Personal Data,” which is information that can be linked (or is reasonably linkable to) an identified or identifiable individual, with exclusions; and
  3. “Biometric data,” which is “automatic measurements of an individual’s unique biological characteristics” that can identify a specific individual, excluding, among others, photographs or video recordings (or data derived from either).
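As a rough illustration only (not legal advice), the cumulative three-prong applicability test summarized above can be sketched as a boolean check. The function and parameter names are ours, and the thresholds mirror this summary’s simplified examples rather than the statute’s full text:

```python
def ucpa_applies(
    does_business_in_utah: bool,
    targets_utah_consumers: bool,
    annual_revenue: float,
    consumers_processed: int,
    pct_revenue_from_data_sales: float,
) -> bool:
    """All three prongs must be satisfied for the bill to apply."""
    nexus = does_business_in_utah or targets_utah_consumers        # prong (a)
    revenue_floor = annual_revenue >= 25_000_000                   # prong (b)
    volume = (consumers_processed >= 100_000                       # prong (c)
              or pct_revenue_from_data_sales > 50.0)
    return nexus and revenue_floor and volume
```

Note that, unlike the CCPA, the revenue floor here is conjunctive: a business with a Utah nexus that processes data at scale still falls outside the bill if it earns under $25 million annually.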

The bill does not apply to, among others:

  1. Government entities;
  2. Business entities that are covered entities or business associates pursuant to the Health Insurance Portability and Accountability Act (“HIPAA”); and
  3. Information subject to HIPAA, the Fair Credit Reporting Act (“FCRA”), the Gramm-Leach-Bliley Act (“GLBA”), or the federal Driver’s Privacy Protection Act (“DPPA”).

Consumer Rights

The bill protects “consumers,” defined as individuals who are Utah residents acting in an individual or household context, not in an employment or commercial context.  Consumers would have rights of access, correction, deletion, and portability, as well as a right to opt out of certain processing, including the “sale” of personal data.

The parents or legal guardians of consumers who are children (under 13 years old) may exercise consumer rights on behalf of the child.  The personal data of children is considered “sensitive data” under the Utah Consumer Privacy Act.  The bill as currently drafted requires controllers to process the personal data of known children according to the requirements of the federal Children’s Online Privacy Protection Act (“COPPA”).

No Right of Private Action

The bill as currently drafted does not grant a private right of action and explicitly precludes consumers from using a violation of the Act to support a claim under other Utah laws, such as laws regarding unfair or deceptive acts or practices.

Risk of Enforcement Action

The Utah Consumer Privacy Act grants exclusive enforcement authority to the Utah Attorney General.  However, before the Attorney General initiates an enforcement action, the Attorney General must first provide the allegedly non-compliant business with (1) written notice (30 days before initiating enforcement action) and (2) an opportunity to cure (30 days from receipt of the written notice).

Prior Legislative History

The Utah Consumer Privacy Act was previously introduced in 2021 (as S 200) and in 2020 (as S 429).  In 2021, S 200 passed the first and second Senate floor readings, but failed to get a third Senate floor reading despite a substitute bill and fiscal note being distributed.  The Utah legislative session closes on March 4, 2022.

Update as of March 3, 2022

On March 3, 2022, the Utah Senate passed the House Amendments to SB 227 and returned SB 227 to the House for the signature of the Speaker.  The amended version of SB 227 passed with 22 yea votes, 0 nay votes, and 4 absentees. This means the bill has cleared the concurrence process. Once the bill is signed by the Speaker, it moves on to the ‘enrolling process’ and then will be delivered to the Governor, in accordance with the Utah legislative process.

What’s Next

In Utah, if a chamber passes a bill with amendments, “the bill is sent back to originating [chamber] for concurrence of the amendment.”  Here, SB 227 passed in the Senate (where it was first introduced), then passed in the House with amendments, and afterwards was sent back to the Senate for concurrence.

If the Senate accepts the House amendments, SB 227 will be delivered to the Governor for action.  The Governor has 20 days from adjournment to (1) sign the bill, or allow it to become law without a signature; or (2) veto the bill, in which case it does not become law unless the Governor’s veto is overridden by the legislature.

Utah is inching closer to passing the Utah Consumer Privacy Act.  CPW will be here to keep you in the loop.


The Federal Trade Commission (FTC) has made it clear: data privacy and cybersecurity are now a priority, and will be for years to come. In the wake of PrivacyCon 2021, the FTC’s sixth annual privacy, cybersecurity and consumer protection summit, held this summer, the FTC finally took official and sweeping action on privacy and cybersecurity. In particular, the Commission recently designated eight key areas of focus for enforcement and regulatory action, three of which directly implicate privacy, cybersecurity, and consumer protection. Below, we discuss the FTC’s action and what it means for businesses, the three key areas of interest to consumer privacy that are now in the FTC’s spotlight, and their relation to state privacy legislation and anticipated impact on civil litigation. Full details on PrivacyCon 2021 and the FTC’s resolutions following the summit can be found on the FTC’s website, linked here for your convenience.

The FTC’s Actions and Areas of Focus

In mid-September, the FTC voted to approve a series of resolutions, directed at key enforcement areas, including the following, each discussed in further detail below:

  • Children Under 18: Harmful conduct directed at children under 18 has been a source of significant public concern; FTC staff will now be able to expeditiously investigate allegations in this important area.
  • Algorithmic and Biometric Bias: The resolution allows staff to investigate allegations of bias in algorithms and biometrics. Algorithmic bias was the subject of a recent FTC blog.
  • Deceptive and Manipulative Conduct on the Internet: This includes the “manipulation of user interfaces,” such as dark patterns, also the subject of a recent FTC workshop.

The approval of this series of resolutions will enable the Commission “to efficiently and expeditiously investigate conduct in core FTC priority areas.” Through the passage of the resolutions, the FTC has now directed that all “compulsory processes” available to it be used in connection with COPPA enforcement. This omnibus resolution mobilizes the full force of the FTC for the next ten years and gives FTC staff full authority to conduct investigations and commence enforcement actions in pursuit of this goal. The FTC has offered very little elaboration, however, on how it will use such “compulsory processes,” which include subpoenas, civil investigative demands, and other demands for documents or testimony.

What does seem clear, however, is that the FTC is buckling down on the enforceability of its own actions. Previous remarks by Chair Lina M. Khan before the House Energy and Commerce Committee expressed frustration that courts have frequently hamstrung the agency’s past enforcement efforts. With this declaration of renewed energy, the FTC is summoning all the power it can to do its job, and we should expect an energized FTC to step up its patrol efforts in the near future. Businesses whose activities implicate these renewed areas should be aware of the FTC’s focus and penchant for investigations and enforcement in such areas.

Children Under 18

The FTC’s mandate to focus on harmful conduct directed at children under 18 is a signal that the Commission plans on broadening and doubling down on its already active enforcement efforts in this area. Areas of the Commission’s prior and current focus on children include marketing claims, loot boxes and other virtual items that can be purchased in games, and in-app and recurring purchases made by children without parental authorization. Most importantly, the FTC is the main arbiter of children’s online privacy through its enforcement of the Children’s Online Privacy Protection Act (“COPPA”), but that law only applies to children under 13 (i.e., 12 and under).  With this new directive to focus on children under 18, we can certainly expect the FTC to address consumer privacy issues, broader than COPPA, for children ages 13 to 17 as well.

Algorithmic and Biometric Bias

The FTC already has enforcement capabilities to regulate the development and use of artificial intelligence (“AI”) and its associated algorithms. These include Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices”; the Fair Credit Reporting Act, which rears its head when algorithms impact lenders’ decisions to provide credit; and the Equal Credit Opportunity Act, which prohibits the use of biased algorithms that discriminate on the basis of race, color, sex, age, and so on when making credit determinations. In using these tools, the FTC aims to clarify how algorithms are used and how the data that feeds them contributes to algorithmic output, and to bring to light issues that arise when algorithms don’t work or feed on improper biases.

Bias and discrimination arising from use of biometrics will also now be a focus of the FTC. Interestingly, much recent research and criticism has pointed out that algorithms and biometric systems are biased against faces of color. This has arisen in many contexts, from the iPhone’s FaceID feature to the 2020 remotely-administered bar exam that threatened to fail applicants of color because their webcams could not detect their faces. These are just some of the issues that arise when companies turn to algorithms to try to create heuristics in making business decisions. The FTC has not let these concerns go by the wayside, and after preliminarily addressing them in an April 2021 blog post, has now reestablished that algorithmic and biometric bias is a new focus for the upcoming years.

Notably, AI and other automated decision-making, particularly that which results in legal and/or discriminatory effects, will also become regulated under omnibus privacy legislation in California, Virginia, and Colorado, forthcoming in 2023.

Deceptive and Manipulative Conduct on the Internet (Including “Dark Patterns”)

The sinisterly nicknamed practice of “dark patterns” happens constantly to online consumers, albeit in ways that tend to seem benign. For example, shoppers contemplating items in their cart may be pressured to complete the sale by a notification like, “Hurry, three other people have this in their cart!” More annoyingly, online consumers who wish to unsubscribe from newsletters or email blasts may find themselves clicking through multiple pages just to free their inboxes, rather than using an easily identifiable and quickly accessible “unsubscribe” button. “Dark patterns” is the term coined for these sorts of techniques, which impair consumers’ autonomy and create traps for online shoppers.

Earlier this year, the FTC hosted a workshop called “Bringing Dark Patterns to Light,” and sought comments from experts and the public to evaluate how these dark patterns impact customers. The FTC was particularly concerned with harms caused by these dark patterns, and how dark patterns may take advantage of certain groups of vulnerable consumers. The FTC is not alone in its attention to this issue; in March, California’s Attorney General announced regulations that banned dark patterns and required disclosure to consumers of the right to opt out of the sale of personal information collected through online cookies. These regulations also prohibit companies from requiring consumers who wish to opt out to click through a maze of screens before achieving their goals. On the opposite coast, the weight-loss app Noom now faces a class action alleging deceptive acts through Noom’s cancellation policy, automatic renewal schemes, and marketing to consumers.

With both public and private entities turning their eyes toward dark patterns, the FTC has now declared the agency will put its full weight behind seeking out and investigating “unfair, deceptive, anticompetitive, collusive, coercive, predatory, exploitative, or exclusionary acts or practices…including, but not limited to, dark patterns…” Keeping an eye on this work will be important—just as important as keeping an eye on which cookies you accept, and which are best to just let go stale.

In addition to being in the crosshairs of the FTC, dark patterns are also a focus of regulators across the globe, including in Europe, and will be regulated under California’s forthcoming California Privacy Rights Act.

Anticipated Litigation Trends

With the FTC declaring its intent to vigorously investigate these three areas, we now turn to what the agency’s new enforcement priorities mean for civil litigation. As practitioners in this field already know, the new priorities are unlikely, on their own, to result in an influx of new litigation. The FTC’s enforcement authority exists pursuant to Section 5(a) of the FTC Act, which outlaws “unfair or deceptive acts or practices in or affecting commerce” but does not contain a private right of action, so plaintiffs cannot bring suit directly to enforce the Commission’s priorities.

However, these areas of focus could influence broader trends in civil litigation, even if, on their own, they do not create any new liability. Successful enforcement actions by the FTC could bring about new industry standards with respect to algorithmic bias, dark patterns, and other areas of focus. These standards, in turn, could be cited in consumer privacy class action complaints. New civil actions could also stem from enforcement actions by the FTC and the information revealed in settlements resulting from such actions. For example, the FTC announced a settlement with fertility-tracking app Flo Health Inc. in January; this month, a consolidated class action complaint was filed against Flo Health, stemming from seven proposed class actions filed against it this year, alleging that the app unlawfully shared users’ health information with third parties.

Although the FTC’s new enforcement priorities seem ambitious, recent developments may impede its capability to bring enforcement actions in these areas. The agency was dealt a blow in April of this year, when the Supreme Court ruled in AMG Capital Mgmt., LLC v. FTC that the agency lacks power to seek monetary recovery under Section 13(b) of the FTC Act. Legislation to restore this power to the agency passed the House, but is awaiting a Senate vote. More recently, the Senate voted to advance the nomination of Rohit Chopra, currently a Democratic Commissioner, to lead the Consumer Financial Protection Bureau. The White House announced that President Biden will nominate Alvaro Bedoya, a privacy scholar with expertise in surveillance and data security, to fill Commissioner Chopra’s seat. As Commissioner, likely priorities for Bedoya include the FTC’s enforcement of various privacy laws, including the Fair Credit Reporting Act and the Gramm-Leach-Bliley Act, which could further impact litigations brought under those statutes.

Unlike the European Union and many countries, the US does not have a holistic, comprehensive federal law generally regulating privacy and the collection, processing, disclosure and security of “personal information” (typically defined as information that identifies, relates to, describes, or is reasonably capable of being linked to, a particular individual). Rather, a patchwork of sectoral federal and state laws applies.

Join Elliot Golding, a data privacy partner at SPB along with Joanne Charles (Microsoft) and Trinity Car (eHealth) for a must-attend webinar sponsored by the ABA next Thursday, September 9, at 1 pm EST.

In this era of digital health, the panelists will look “beyond HIPAA” and highlight other federal and state laws governing health information, along with risk management strategies. Their session will focus on:

  • Research and data analytics, covering federal laws (the Common Rule) and state privacy laws (CCPA and biometrics);
  • Vulnerable populations, including children (subject to COPPA, FERPA, etc.) and sensitive health information (mental health and substance use disorders); and
  • Advertising and communication issues subject to federal laws (CAN-SPAM and TCPA), state laws (CCPA), and industry (DAA) standards.

Registration is available here.