A recent case in the District of Minnesota[1] helpfully confirmed that although consumer reporting agencies (“CRAs”) are required by the Fair Credit Reporting Act (“FCRA”) to ensure that consumer reports are accurate, they are not obligated to include information on all (or any) credit accounts (or “tradelines”) relating to a consumer.

Plaintiff obtained a mortgage loan in 2006. In 2019, he obtained copies of his credit reports from all three credit bureaus and noted that his payment history for his mortgage was not included on any of them. Plaintiff submitted disputes to each credit bureau, asking them to conduct a reasonable reinvestigation and remedy inaccuracies concerning the mortgage account. Experian responded by stating that it was unable to honor his request to place credit information on his credit report. When Plaintiff informed his mortgage lender that his tradeline was missing despite making timely payments, he learned that the lender had stopped submitting information on the account to the three bureaus.

Plaintiff then sued all three credit bureaus, alleging a violation of the FCRA’s requirement that CRAs follow “reasonable procedures to assure maximum possible accuracy” of information reported, as well as a violation of the FCRA requirement to conduct a reasonable reinvestigation of disputed information.

Experian then moved for judgment on the pleadings.

In granting Experian’s motion, the court reiterated that to plead a viable claim that a CRA violated the FCRA’s accuracy standard, a plaintiff must plausibly allege that the CRA: (1) reported inaccurate credit information about the plaintiff; and (2) failed to follow reasonable procedures to assure the accuracy of such information. Experian argued that the complaint failed to allege either of these elements because the FCRA does not require CRAs to report every account or tradeline of a consumer; rather, the FCRA requires only that the information contained in a credit report be accurate.

Interestingly, Plaintiff did not dispute Experian’s assertion that CRAs generally are not required to add all credit data to a credit report. Instead, he argued that the “missing” data made the credit report misleading and thus inaccurate. This is a clever argument, as courts have held that technically accurate data can still be misleading in such a way and to such an extent that it can be expected to adversely affect credit decisions. In such cases, many courts have found the inclusion of such misleading data to be a violation of the FCRA’s accuracy standard.

The court here noted that the Eighth Circuit had not ruled definitively on whether accurate but misleading data violates the FCRA’s accuracy standard, but held that even assuming it does, Plaintiff’s claim would still fail. The court found that a credit report is not “inaccurate” or “materially misleading” simply because it does not mention a particular tradeline. Furthermore, the complaint failed to allege any facts supporting a plausible inference that the information Experian did report was materially misleading, and therefore inaccurate.

This is a helpful reminder that although information reported regarding a tradeline must be complete and accurate, omitting a given tradeline altogether is not a violation of the FCRA.  It is especially helpful for CRAs dealing with a furnisher that may not have sufficient controls in place to ensure adequate accuracy.  Think twice: there is no FCRA requirement that information from all furnishers be accepted!

[1] Krosch v. Equifax Info. Servs., et al., 2020 U.S. Dist. LEXIS 99150 (D. Minn. 2020).

Part I:  First, Check Your Rear-View Mirror

“[M]odern enterprise and invention have, through invasions upon . . . privacy, subjected [people] to mental pain and distress, far greater than could be inflicted by mere bodily injury.”[1]

Legal commentators today commonly characterize data privacy and cybersecurity litigation as a “tidal wave”[2] approaching the proportion of a “tsunami.”[3]  But consumer privacy litigation is nothing new, and recent legislation has so far not meaningfully altered much other than the zeitgeist.  Privacy litigation in the United States was born of the Supreme Court’s recognition that certain privacy rights are fundamental.  Indeed, as far back as 1890, Samuel Warren and Louis Brandeis predicted that modern advancement and technology would necessarily invade basic privacy interests in ways that would demand legal intervention.[4]

Protection of an individual’s expectation of privacy is a bedrock principle upon which the United States was founded.  Americans have always valued and sought to shield their privacy in a variety of contexts.  This encompasses the varied privacy rights protected by the Bill of Rights, including the First Amendment (protecting every citizen’s privacy of beliefs), the Third Amendment (protecting privacy of the home), the Fourth Amendment (protecting the privacy of the person from unreasonable searches and seizures) and the Fifth Amendment (with the bar on self-incrimination protecting the privacy of personal information).

The Supreme Court has upheld and reinforced these protections with the recognition of privacy as a fundamental constitutional right, most notably during the Warren Court.  In Griswold v. Connecticut, the Court observed a “zone of privacy created by several fundamental constitutional guarantees,” including those referenced in the Bill of Rights.  381 U.S. 479, 485 (1965).  The Court reiterated a similar sentiment in Stanley v. Georgia, declaring that “also fundamental is the right to be free, except in very limited circumstances, from unwanted governmental intrusions into one’s privacy.”  394 U.S. 557, 564 (1969) (citing Olmstead v. United States, 277 U.S. 438, 478 (1928) (Brandeis, J., dissenting) (“The makers of our Constitution undertook to secure conditions favorable to the pursuit of happiness . . . They conferred . . . the right to be let alone—the most comprehensive of rights and the right most valued by civilized man.”)).

The Court further embraced and entrenched the right to privacy in the seminal case, Roe v. Wade, announcing that “a right of personal privacy, or a guarantee of certain areas or zones of privacy, does exist under the Constitution.”  410 U.S. 113, 152 (1973).  Of course, the Supreme Court has continued to develop and refine notions of privacy well into the modern era, including as to informational privacy, through the Roberts Court.  See, e.g., Carpenter v. United States, 138 S. Ct. 2206 (2018) (holding that the government’s collection of extensive, historical cell phone location information is a search requiring a warrant).

Numerous federal and state statutes followed the Supreme Court’s early recognition of individual privacy rights, aiming to prevent such things as the mishandling and misuse of certain types of personal information and to prohibit surreptitious intrusion into private communications.  This is in addition to many states (including California) that specifically recognize the right to privacy in their constitutions.  The California Consumer Privacy Act (“CCPA”) is just the most recent example.  Although some herald the CCPA as marking a new era, many privacy statutes long predate the CCPA and current notions of so-called “data” privacy rights.  An example of this earlier lawmaking is the Fair Credit Reporting Act (“FCRA”), which became effective five decades ago, in 1970.

The FCRA was the first federal law enacted to regulate the privacy of personal information compiled by businesses, specifically credit reporting agencies.  Other statutes followed and expanded this statutory protection to ensure the privacy of various additional categories of information ordinarily held in confidence by individuals.  Among other things, laws were passed to protect private information in the context of healthcare (the Health Insurance Portability and Accountability Act (“HIPAA”)).  In the present context of discussing data privacy rights, it is also especially noteworthy that more than two decades ago Congress passed the Children’s Online Privacy Protection Act (“COPPA”) to protect data collected online from children under the age of 13.

States have also enacted special protections of privacy rights, with some predating even early federal action, such as the California Confidentiality of Medical Information Act (“CMIA”) and the California Invasion of Privacy Act (“CIPA”), the latter passed in 1967.  Like the CCPA garnering so much attention today because of its purported novelty, these laws enacted more than 50 years ago were passed to protect against the “invasion of privacy resulting from the continual and increasing use of [modern] devices and techniques” that were perceived to “create[] a serious threat to the free exercise of personal liberties and cannot be tolerated in a free and civilized society.”  Cal. Pen. Code § 630.

The CCPA may be a new statute targeting data privacy, but its future enforcement and the anticipated strategies for handling it in litigation can be relatively easily divined by those who are knowledgeable about and fluent in the decades-old privacy laws that have already been litigated and shaped by the courts.

History teaches us that the enactment of privacy laws precedes the birth of novel consumer privacy litigation.  In this context, the identification of the CCPA as preordaining a tidal wave of data privacy litigation is unsurprising and no different from how other privacy laws greeted the courts.  For example, after the passage of the FCRA in 1970, individual plaintiffs promptly invoked the statute to seek relief in court.[5]  FCRA litigation has since grown from single-plaintiff actions into a class-action machine, with nearly 5,000 FCRA cases filed in federal court in 2019 alone—an 8.9 percent increase in filings over 2018.[6]  Although many ambiguities and legal questions lurk in corners of the CCPA, a good student of history enjoys superior tools to predict, prepare and pivot in the face of litigation premised upon this new statute.

In Parts 2 and 3, we will discuss the evolution of some of these defense theories in consumer privacy litigation—both successful and unsuccessful—and what types of defense theories can be expected in the years to follow this evolving “data” privacy litigation battle.  We will also discuss novel prosecution theories that could surface in the coming years, especially from competitors.

[1] Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193, 196 (1890) (advocating for tort relief for individuals whose private affairs were disclosed without consent).

[2] https://www.law.com/thelegalintelligencer/2020/01/27/a-tidal-wave-of-change-plaintiffs-firms-are-tapping-data-analytics-but-some-are-still-reluctant/

[3] https://www.jdsupra.com/legalnews/the-coming-wave-of-california-consumer-13751/

[4] See supra note 1.

[5] See, e.g., Rasor v. Retail Credit Co., 87 Wash. 2d 516, 520 (1976) (confronting allegations by a small business owner against a consumer reporting company that purportedly failed to follow reasonable procedures under FCRA, and prompting the court to note that “[t]his important federal program for the protection of consumers was a Congressional response to documented abuses in the previously self-regulated credit reporting industry”).

[6] See https://webrecon.com/webrecon-stats-for-dec-2019-and-year-in-review-how-did-your-favorite-statutes-fare/

Maintaining a positive and productive work environment helps retain valued employees and aids in recruiting new talent, ultimately saving costs and providing an advantage over competitors. To monitor employee satisfaction, organizations are increasingly turning to workplace surveys.

On June 16, 2020 at 4:00 p.m. CEST, Annette Demmel of our Data Privacy & Cybersecurity team will discuss what companies should consider when implementing and conducting employee surveys in order to be in line with applicable data protection laws, in particular the GDPR.

She will explain the different legal bases for acquiring employee feedback; which information must be given to employees prior to or during a survey; what needs to be taken into account when survey results are evaluated; and how to avoid unnecessary risks in this context.

Additional information and registration are available here.

1.0 hour CLE available for CA, NJ and NY

1.0 hour CPE (IAPP).

The California Attorney General has submitted comments on the final proposed CCPA regulations.  Our sister blog, Security & Privacy Bytes, has published a summary of the key guidance that can be garnered from these materials, including expectations regarding “user-enabled privacy controls” (and Do Not Track signals), rules governing service provider use of personal information, jurisdictional triggers, and guidance regarding “financial incentives.”  The post may be accessed here.

On Friday, after reviewing Plaintiff’s credit reports, which were ordered by the Court last month, the Honorable Laurie J. Michelson dismissed Plaintiff’s claims against Michigan First, concluding there was no inaccurate or misleading reporting. (See Rider v. Equifax Info. Servs. LLC, No. 2:19-cv-13660, 2020 U.S. Dist. LEXIS 99265, at *2 (E.D. Mich. June 5, 2020).) Plaintiff had alleged that Michigan First negligently failed to conduct a proper investigation of her dispute as required by 15 U.S.C. § 1681s-2(b), and that it had “willfully failed to conduct a proper reinvestigation of [her] dispute.” However, both of Plaintiff’s disputed credit reports indicated that her Michigan First account had a “zero balance” and included the closing date for the account. Taken as a whole, neither of the tradelines would be considered misleading, the Court ruled. The Court went on to state: “Michigan First is not the consumer reporting agency—it reports to the consumer reporting agency. So Michigan First does not control how the data it provides is presented.” A tradeline reflecting a closed account on a credit report (within the statutorily permitted reporting period) does not make the credit report inaccurate; it is the information contained within the tradeline that determines accuracy. After reviewing that very information in the disputed tradeline, the Court dismissed Plaintiff’s allegations as insufficient.

Biometric tech is everywhere. Think facial recognition, voice ID, fingerprints, retina scans (a la Minority Report or a million other sci-fi movies, except real and in the present). It’s probably how you unlock your phone every day, and how social media platforms ask if that photo some friend of a friend tagged is really you. And it’s likely to become even more pervasive in light of COVID-19 as some companies seek to embrace longer-term remote work policies without sacrificing security.

Because of an Illinois statute—the Biometric Information Privacy Act (BIPA)—this could pose a big, costly problem for companies if they don’t get consent, and that problem could play out in the form of a big class action in federal court. Take Vimeo, for example: a federal judge just rejected its attempt to arbitrate a class action brought under BIPA, even though the user had agreed to an arbitration clause. This outcome turned on a specific exception in the terms of service, but the court reiterated the broad privacy rights afforded by BIPA and demonstrated a willingness to disfavor arbitration if the circumstances are right (or wrong, depending on your view).

If you’re reading this, you probably know that BIPA is a sweeping state law that restricts how companies can collect, store, use, and destroy this information. Most notably, it creates a private right of action that has the potential to cost companies millions or even billions—$1,000 per negligent violation or $5,000 per reckless violation (or actual damages if they’re more than that), plus attorneys’ fees, costs, and injunctive relief.

Companies have faced hundreds of BIPA lawsuits since the Illinois Supreme Court opened the floodgates early last year by holding that individuals could sue based on a statutory violation alone, even if the violation didn’t harm them in any way. Two federal courts of appeals—the Ninth and, as we reported last month, the Seventh Circuit—subsequently held the same thing under the federal standard. (Though before all this, the Second Circuit seemed to hold otherwise, albeit in a non-precedential decision, and the deepened split might get the Supreme Court to take up this issue.)

It’s not all bad news. Companies facing BIPA lawsuits have several lines of attack, including personal jurisdiction, the statute of limitations, the constitutionality of the statute itself, preemption by other state and federal laws, and various statutory defenses. And some companies have been able to avoid class actions by invoking arbitration clauses. Just last month, for example, an Illinois federal court set aside claims that Southwest Airlines violated BIPA by requiring employees to clock in and out by scanning their fingerprints, holding that employees had to pursue their claims as individuals in arbitration, not as a class in federal court. (See this post for more details.)

Earlier this week, though, a judge in the same federal court ruled in Acaley v. Vimeo, Inc., No. 19-cv-7164, 2020 U.S. Dist. LEXIS 95208 (June 1, 2020) that users of a video app could pursue their claims in federal court. This case involves the Magisto video creation and editing app, which Vimeo owns and operates. A guy named Bradley Acaley, a user of the Magisto app, brought a class action alleging that Vimeo violated BIPA by using facial recognition technology to scan pictures and videos uploaded to the app to create unique face templates without consent.

Vimeo asked the court to stay the lawsuit and compel Acaley to arbitrate his claims individually, claiming that Acaley had agreed to arbitrate BIPA claims by accepting Magisto’s terms of service, which contain a mandatory arbitration clause. The court ruled against Vimeo, holding that even though there was a valid and binding arbitration clause in the terms of service, that clause did not apply under an exception for claims relating to “invasion of privacy.”

The court first addressed whether the agreement to arbitrate was valid. It was, the court held, because Magisto provided “reasonable notice” that use of the app meant a user agreed to the terms of service. Whether a company has provided “reasonable notice” is a fact-specific test under Illinois law. On some of its signup pages, Magisto included a statement that use is subject to terms, along with a hyperlink to those terms. That was enough for the court.

The court agreed with Acaley, though, that the arbitration clause did not cover BIPA claims based on an “invasion of privacy” exception in the terms of service. The exception stated that any claims “related to, or arising from . . . invasion of privacy” were not covered by the arbitration clause. That included BIPA claims, according to the court, since BIPA created “a legal right to privacy.”

Courts generally favor arbitration, and where there is a valid arbitration clause, they usually require parties to arbitrate unless they can say “with positive assurance” that the agreement cannot reasonably be interpreted to cover the dispute. Vimeo argued that the clause could be interpreted to cover only claims brought by Vimeo against users, not the other way around, and that it applied only to claims of common-law invasion of privacy, not statutory claims involving privacy. The court rejected these narrower interpretations, holding that “any Claim related to, or arising from, allegations of . . . invasion of privacy” plainly covered BIPA claims.

So, the court let this class action continue in federal court. Enforcement of arbitration clauses can be fact-specific, though, so it seems unlikely that this will become a trend since it was dependent on a specific exception in the terms of service, and this judge arguably seemed less inclined to enforce the arbitration agreement than is typical. For example, as we reported here just last week, a different judge in the same court compelled arbitration in a similar BIPA class action against Shutterfly. The court in that case enforced an arbitration clause, even though it didn’t exist when the plaintiff signed up, since the original terms had a change in terms provision, and the plaintiff continued to use the site after Shutterfly added the clause. That plaintiff will have to bring her claims as an individual in arbitration against Shutterfly.

On the other hand, the Vimeo case will continue as a class action in federal court due to the court’s interpretation of that “invasion of privacy” exception. The different outcomes illustrate just how important it is for companies to carefully craft (and update) their terms of service based on evolving case law. And take an extra careful look at any arbitration exceptions in your contracts.


After months of waiting, on June 1, 2020, the California Office of the Attorney General (“AG”) unveiled the final proposed California Consumer Privacy Act (“CCPA”) regulations, which are unchanged from the last version circulated in early March 2020 (summarized here).  The AG also published extensive materials, including more than 500 pages of responses to public comments, that provide a wealth of (non-binding) guidance on tricky issues.  Finally, the AG requested the Office of Administrative Law to expedite its review to make the regulations effective July 1, 2020, but it is unclear whether that will occur.

Neither Consumer Privacy World nor the court really knows what happened in this BIPA class action because the Plaintiff’s complaint was so factually bare.  In Kloss v. Acuant, 2020 U.S. Dist. LEXIS 89411 (N.D. Ill. May 21, 2020), Ms. Kloss alleged that Acuant captured, collected, and stored her facial geometry without her consent and then subsequently disseminated it to someone – nobody really knows who – also without her consent.  To top it off, Acuant allegedly failed to post a publicly accessible retention policy in violation of BIPA.  A case that started in state court and was removed to federal court now finds itself partly dismissed and partly remanded back to state court.

So what happened?

BIPA 15(a) allegation – the public retention policy, back to state court it goes.

Relying on the Seventh Circuit’s decision in Bryant v. Compass Grp. USA, Inc., No. 20-1443, 2020 U.S. App. LEXIS 14256 (7th Cir. May 5, 2020), the district court in Kloss remanded the 15(a) claim to state court for lack of subject matter jurisdiction. In Bryant, the Seventh Circuit considered whether a plaintiff alleges an injury in fact by pleading a bare violation of BIPA section 15(a), which requires a publicly available retention schedule, and assessed whether such a violation was “sufficiently substantive” to support standing under Article III. The Seventh Circuit concluded that the duty imposed by section 15(a) runs to the public, not to any individual; a violation of it therefore does not invade “personal privacy rights in a concrete manner” and does not inflict a concrete and particularized harm. Bryant at 16. Applying this analysis, the Kloss court determined that it lacked subject matter jurisdiction over the claim under Article III and remanded it to state court.  Kloss at *4.

BIPA 15(b) and (d) allegations – if light on facts, away you go.

The court then turned to the allegations regarding a lack of proper consent under BIPA sections 15(b) (requiring that a private entity obtain consent before collecting biometric information) and 15(d) (requiring that a private entity obtain consent before disclosing biometric information).  While Bryant concluded that non-compliance with section 15(b) of BIPA “leads to an invasion of personal rights that is both concrete and particularized,” Bryant at 2, the court in Kloss never reached this analysis.  Instead, the court looked to the facts as pled and held that the plaintiff did not provide sufficient facts to support her claims of violations of 15(b) and 15(d).  It pointed to the lack of a specified time of her use of Acuant’s technology and the lack of details demonstrating a direct relationship between Kloss and Acuant. (I mean, was there even a relationship?)  Kloss at *7.  In doing so, the court concluded that “[s]uch barebone factual support and recitation of statutory language is not enough to put Acuant on notice of Kloss’s claims in order to properly investigate or prepare a defense.”  Id.

Bryant held the door open for the pursuit of at least some BIPA claims at a federal level, but Kloss reminds us that there are still critical facts that must be included when making such claims.  Defendants should take care to examine the factual support for claims presented by plaintiffs regarding notice and consent under BIPA.  Weak facts get cases dismissed.

In considering methods to relax the COVID-19 lockdown measures and revive the economy, while at the same time containing the spread of the virus, the EU and national EU governments have been actively pursuing the development and use of contact tracing apps.

To be effective, any contact tracing app would require the majority of the population to use it. Of course, there are reservations about the overall benefit of such an app as a means of responding to the COVID-19 crisis (among others because it may lead to false positives or negatives, the technology may be unable to distinguish between people in crowded places, as well as because of the possible abuse of the data).

Standing to bring suit is an issue that is never waived and never goes away, regardless of the parties’ arguments.  Recently, the Ninth Circuit reviewed an appeal of an FCRA claim that had gone through discovery, summary judgment, and a fully briefed appeal.  It determined that the parties and the lower court had focused on the wrong issue—and spent time, effort, and money on litigation that was ultimately undone—because the plaintiff didn’t have standing and shouldn’t have been able to bring suit in the first place.

In Hogue v. Silver State Schs. Credit Union, 2020 U.S. App. LEXIS 15963, the Ninth Circuit affirmed a district court’s grant of summary judgment on an FCRA claim in favor of a credit furnisher…but not on the grounds on which the district court granted it.  Instead, the Ninth Circuit determined—independently of the parties and the district court—that the consumer never had standing to bring suit in the first place.

The plaintiff sued the defendant furnisher, claiming that it erroneously reported an auto loan as past due to a credit reporting agency, then failed to investigate the dispute once the plaintiff raised it.  The district court granted summary judgment to the furnisher after discovery, holding that the furnisher met its obligations under the FCRA.  On appeal, the Ninth Circuit reviewed the allegations the plaintiff made and determined that he had never alleged a concrete injury sufficient to give him constitutional standing to sue.

Not only had no third party ever made an adverse credit decision based on the disputed information, but the disputed information had also fallen off of the plaintiff’s account by the time he sued, meaning it would never be reported at all.  In fact, the plaintiff couldn’t even show a risk that the disputed information would be disseminated.  The district court’s judgment was affirmed, but purely on standing grounds, without the Ninth Circuit ever reaching the issues the district court addressed.

The lesson for FCRA defendants: always keep an eye out for standing issues.  A successful standing challenge could save you a lot of time and energy spent on unnecessary discovery.