We’ve been hearing a lot lately about the FTC’s rulemaking procedures under Section 18 of the FTC Act (also known as “Mag-Moss” rulemaking). Long decried as too burdensome and difficult to use on a regular basis, this tool is now being celebrated for its enormous, untapped potential to establish industry-wide standards and enable the FTC to get monetary relief in its cases, post-AMG. (AMG didn’t affect the FTC’s authority to obtain monetary relief when it’s enforcing a rule.)

Is the old view or the new one correct? Is Mag-Moss rulemaking really so cumbersome, as many FTC staff and observers have long claimed? Have those burdens been overstated, warranting the enthusiasm we’re now seeing among FTC Commissioners, consumer groups, and Congress? Did the FTC’s changes to its internal rules last July (see below) really “streamline” the process as the FTC claimed?

As suggested by the title of this blog post, I have an opinion: Mag-Moss is still an uphill climb. However, to enable readers to decide for themselves, I detail below the Mag-Moss process as laid out in the law. Although the FTC’s July changes stripped away some extra steps it had previously imposed under its rules, the hurdles in the law remain formidable.  Continue Reading The FTC’s Magnuson-Moss Rulemaking Process – Still an Uphill Climb

On December 13, the New Mexico Attorney General announced a settlement with Google to resolve claims regarding children’s privacy, including in the burgeoning EdTech space. The federal lawsuits Balderas v. Tiny Lab Productions, et al. and Balderas v. Google LLC alleged COPPA and privacy violations related to collection of children’s information on game developer Tiny Lab’s apps and on Google’s G Suite for Education products, respectively. Many features of this settlement are worth discussing further, either as potential future trends or as novel provisions.

Privacy Compliance Provisions

New Mexico’s injunction related to the Tiny Lab case includes changes to Google Play which will take effect after 120 days. Some of the specific measures include:

  • revising Google Play Families policies and including additional help pages to assist app developers in compliance;
  • requiring all developers to complete a form to indicate the targeted age group of apps;
  • using a rubric to evaluate app submissions to help determine whether an app appeals to kids and to check for consistency with the age group form;
  • requiring Families apps to certify they will comply with COPPA;
  • requiring all apps to only use SDKs that certify compliance with Google’s policies including COPPA;
  • requiring developers of Families apps to disclose collection of any children’s data including through third parties;
  • requiring a link to the app’s privacy policy on the Google Play store page; and
  • communicating to AdMob whether an app is Child Directed, so that AdMob will then follow COPPA with respect to that app’s data.

The help pages the injunction requires do not just contain answers to frequently asked questions.  They prescribe certain decisions by and limitations on third parties using the Google Play store.  For example, Exhibit 3 to the injunction provides that “if you serve ads in your app and your target audience only includes children, then you must use Google Play certified SDKs.”

In addition to these injunctive provisions, Google agreed to a set of voluntary enhancements to the Google Education platform intended to promote safety for students.  New Mexico’s enforcement of these provisions is limited to its ability to confirm that Google has made the changes, or inquire as to the status of changes not made.

These injunctions demonstrate continued state Attorney General scrutiny regarding children’s information.  And they come at a time when the Federal Trade Commission, which is responsible for issuing the COPPA Rule, is redoubling its COPPA efforts.  The FTC’s ongoing COPPA Rule Review includes a number of questions regarding the intersection of COPPA and education technology.  The FTC’s Statement of Regulatory Priorities, which we wrote about here, identifies COPPA as a top priority. And just this week, the FTC released its first COPPA settlement in almost 18 months.

Additional Settlement Terms Depart from Historical State Settlements

Not to be ignored, several other provisions of the settlement have unique and noteworthy aspects.  Google has agreed to pay New Mexico $5.5 million – with $1.65 million of that going to outside counsel for the state.  The remaining payment will be used to fund the “Google New Mexico Kids Initiative” – a program jointly run by Google and New Mexico to award grants to schools, educational institutions, charitable organizations, or governmental entities.  This unusual allocation of the payment could draw the scrutiny that other state Attorney General settlements have met in the past when they attempted to designate funds for specific third-party recipients.  Some state legislatures may see it as an effort to appropriate funds without their involvement.

While New Mexico reserves its rights under the agreement regarding public statements, it has agreed to provide Google 24 hours’ notice before making any written public statement.  Moreover, New Mexico agrees to consider in good faith any suggestions or input Google has, and any statement will reference the parties’ shared commitment to innovation and education. States routinely resist efforts to negotiate press in this manner, and it is unclear how enforceable such a provision really is.  That said, it certainly reflects the cooperative nature of the agreement, and it is fair to assume the State would issue press reflecting that cooperation in any event.

Google and New Mexico have also agreed to an ADR provision, requiring the state to pursue any disputes relating to the agreement in mediation prior to pursuing relief.  This again is unusual for a State AG settlement, as is the overall form of the document (a “Settlement Agreement and Release”) – normally states will only settle matters through a consent judgment or a statutorily authorized Assurance of Compliance or Discontinuance.  But like some of the other unique provisions, agreeing to ADR may be more a reflection of the cooperative nature of the agreement, and it certainly presents an opportunity for a more streamlined enforcement mechanism in the future.

It remains to be seen if these provisions will serve as a template for future state agreements with other companies, but given that state Attorneys General continue to pursue Google on a variety of fronts[1], New Mexico’s settlement will certainly be relevant in any future settlement efforts.

[1] Google Search Manipulation, Google Ad Tech, Google DOJ Search Monopoly, State of Arizona v. Google LLC geolocation privacy

In case you missed it, last week (on November 30), the National Telecommunications and Information Administration (NTIA) announced that it would convene a series of virtual listening sessions on privacy, equity, and civil rights. According to NTIA, the sessions (scheduled for December 14, 15, and 16) will provide data for a report on “the ways in which commercial data flows of personal information can lead to disparate impact and outcomes for marginalized or disadvantaged communities.”

NTIA cites the following examples to illustrate how data collection, “even for legitimate purposes,” leads to disparate impacts:

  • Digital advertising offers content and opportunities based on proxy indicators of race, gender, disability, and other characteristics, perpetuating historical patterns of discrimination.
  • Insurance companies use information such as neighborhood, safety, bankruptcy, and gun ownership to infer who will need expensive health care, warranting higher premiums.
  • Universities predict which students will struggle academically based on factors that include race.

Why is this News?   

As our readers may have noticed, NTIA is hardly the first agency or constituency to draw the link between data collection and discrimination. In 2013, Harvard Professor Latanya Sweeney published a groundbreaking study showing racial discrimination and stereotyping in online search and ad delivery. In 2014, the FTC hosted a workshop, followed by a report (Big Data: A Tool for Inclusion or Exclusion?) detailing the problem and making recommendations for companies and researchers. In recent years, scores of studies and conferences have examined the discriminatory assumptions embedded in algorithms and artificial intelligence (AI). And civil rights groups have raised concerns for years and, in 2019, obtained a historic settlement with Facebook to stop discrimination on its online advertising platform.

NTIA’s announcement is nevertheless significant for two reasons. First, by its own description, NTIA is the President’s principal advisor on information policy issues, responsible for evaluating the impact of technology on privacy and the sufficiency of existing privacy laws. Further, its announcement states that the listening sessions are designed to “build the factual record for further policy development in this area.” For these reasons, the notice has been heralded as the Administration’s “first move” on privacy and a possible attempt to revive stalled efforts in Congress to enact a federal privacy law.

Second, in case there was any doubt, NTIA’s announcement affirms that the link between privacy and civil rights is now a widely accepted policy position, and will remain front-and-center in any debate about whether to enact a comprehensive federal privacy law. Whereas once there were questions about whether civil rights provisions should be “added” to a privacy law, now they’re essential building blocks.

This is true not only among Democrats, but among Republicans too. For example, provisions related to discrimination and/or algorithmic decision-making appear in recent privacy legislative proposals from, not just Representative Eshoo and Senator Cantwell, but also Senator Wicker and the Republican members of the House Energy and Commerce (E&C) Committee. The Republican E&C bill is especially notable for how much it leans into the issue – prohibiting data practices that “discriminate against or make an economic opportunity unavailable on the basis of race, color, religion, national origin, sex, age, political ideology, or disability or class of persons.”

But What Does this Mean for Companies Today?

You may be wondering – what does this mean for companies now, with Congress still (endlessly) debating whether to pass federal privacy legislation? It means that:

  • Data discrimination is on everyone’s radar, regardless of whether Congress finally decides to pass a federal privacy law.
  • Companies should expect more enforcement – even now, under existing laws – challenging data practices that lead to discriminatory outcomes. Such laws include the FTC Act (recently used to challenge racial profiling by an auto dealer), state UDAP laws, the Fair Credit Reporting Act, the Equal Credit Opportunity Act, and (of course) the civil rights laws.
  • To steer clear of discrimination (and any allegations of discrimination), companies should test their data systems and use of algorithms and AI for accuracy and fairness before using them in the real world.

We will continue to monitor developments on this issue and post updates as they occur.

The Supreme Court in AMG foreclosed the FTC’s ability to pursue monetary remedies under Section 13(b) of the FTC Act. AMG has not, however, stopped the FTC from pursuing monetary relief directly in court while attempting to bypass the statutory prerequisite of an administrative proceeding. The FTC is continuing to use Section 13(b) of the Act to attempt to obtain preliminary and permanent injunctive relief. At the same time, the Commission is coupling its 13(b) requests for injunctive relief with other (sometimes creative) statutory requests for money.

Given the Commission’s newfound interest in exploring non-13(b) statutory avenues to obtain monetary remedies, we have expanded our Post-AMG chart to include a wider swath of ongoing cases in which the FTC is attempting to collect money absent the use of 13(b). The latest version of our expanded chart follows.

Continue Reading Post-AMG Scorecard: The FTC Pivots to Other Statutory Bases for Monetary Relief

Since Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998, the regulatory wall between kids and teens has been a remarkably durable one. During all this time, COPPA, the primary U.S. law protecting kids’ privacy, has protected children under 13 but hasn’t provided any protections for teens. While California’s privacy law grants some rights to teens under 16, these protections are narrow (opt-in for data sharing) and only apply within that state. This means that teens are generally treated like adults for purposes of privacy in the U.S.

It’s not exactly clear why COPPA’s age 13 cut-off was chosen in the first place. First year of teen-hood? Bar Mitzvah age? The age when children become too independent and tech-savvy to let their parents control their media? (Ahem – that happened at age six in my house.) Whatever the reasons for the original choice, age 13 has stuck, even as concerns about teens’ privacy and use of social media have grown, and Senator Markey and others have repeatedly proposed extending privacy protections to teens.

However, we might finally be seeing some cracks in the kid-teen privacy wall – cracks that could lead to a federal law protecting teens in the not-too-distant future.

These cracks are due to a confluence of events. Notably, in September 2020, the U.K. passed a law (the Age Appropriate Design Code or AADC) that requires all online commercial services “likely to be accessed by” kids and teens (including apps, programs, websites, games, community environments, and connected toys or devices) to meet 15 standards to ensure that their content is age appropriate. The law, which became fully effective in September 2021, starts with the principle that any service be designed with the “best interest of the child” as a primary consideration. It then details more specific requirements, including that defaults be set at the most protective level (e.g., location tracking and profiling are set to “off”), that data not be shared with third parties without a “compelling reason,” and that “nudge” techniques not be used to encourage minors to provide data or reduce their protections.

In response to the law, U.S. companies operating in the U.K. (notably, some of the large tech platforms) recently announced new protections for teens – a significant development in the long-running kid-teen debate, but one that has received relatively little attention. For example, Facebook/Instagram now says that for kids under 16, it will default them into private accounts; make it harder for “suspicious” accountholders to find them; and limit the data advertisers can get about them. Meanwhile, Google/YouTube has pledged similar protections for kids under 18, including private accounts by default; allowing minors to remove their images; applying restrictive default settings; turning off location history permanently; and limiting the data collected for ad targeting.

Following these announcements, Senator Markey and two House members sent a letter to the FTC urging it to ensure that these companies keep their promises, using its authority to stop deceptive practices under the FTC Act.

And there’s more. Last week, in developments widely covered in the media, a former Facebook employee detailed what she viewed as manipulation of teens using algorithms that kept them on the platform and exposed them to harmful content. Also, with broad-based privacy legislation perennially stalled, there’s been talk that Congress might prefer to tackle privacy issues that are more manageable and bipartisan (like kids’ and teen privacy) – talk that has only grown louder since the developments regarding Facebook.

Adding to the momentum, Senator Markey recently introduced a bipartisan bill (with Republican Senator Cassidy) that would provide privacy protections specific to teens, and Representative Castor has introduced a similar bill in the House. Further, the FTC has expressed a strong interest in protecting kids’ privacy, and in undertaking enforcement and rulemakings to extend U.S. privacy protections beyond the status quo.

In short, the kid-teen privacy wall is under pressure, and we could soon see a federal law, FTC enforcement, and/or (a harder climb) an FTC rulemaking using the agency’s Magnuson-Moss authority. For companies that collect teen data in connection with marketing or providing commercial products or services, this means double-checking your data practices to ensure that they’re age-appropriate and don’t expose teens to harms that can be avoided. (While the U.K.’s AADC principles are very ambitious, and do not apply to U.S.-only companies, they’re a valuable reference point.) It also means being prepared to explain and defend your data practices with respect to teens if regulators come knocking.

We will continue to monitor developments on this issue and provide updates as they occur.

If the summer slide and the start of school kept you too busy to follow what’s going on in the food scene, we hear you!  Catch up on key developments below in this issue of our Food Industry Litigation and Regulatory Highlights.

The Courts Were Kind to the Food Industry This Summer

This summer brought a series of class action victories to the food industry, including a trio of decisions from the Second and Ninth Circuits, both long-time hotbeds for false advertising class actions, as well as four dismissals from the Southern District of New York.

At the appellate level, the Second Circuit affirmed the dismissal of a putative class action challenging Starbucks’ claim that its drinks are the “best coffee for you” and that its coffee is “watched over … from the farm to you,” despite the use of pesticides to kill roaches at certain retail locations.  The Court ruled that the challenged claims were not specific enough to misrepresent a quality or characteristic of Starbucks’ coffee, and that no reasonable consumer would interpret them to suggest anything about the use of pesticides in Starbucks’ stores.

The Ninth Circuit decertified a class of consumers claiming that Coca-Cola falsely labels its drinks as having no artificial flavors when they contain phosphoric acid, ruling that consumers lacked standing to pursue injunctive relief.  According to the Court, the plaintiffs’ claims that they “would consider purchasing” Coke in the future if certain disclosures were included or if the product’s labels were truthful were insufficient to show an actual or imminent threat of future harm. Continue Reading Food Industry Litigation and Regulatory Highlights, July – September 2021

The dietary supplement and personal care product space continued to see enforcement on false CBD, COVID, and fertility claims as well as related litigation involving “germ-killing” claims on hand sanitizers and wipes.  Messy stuff…Let’s take a look…

Personal Care Products

In a blow to the trending “pink tax” theory of liability in consumer class actions, in May, the Eighth Circuit ruled that various personal care product manufacturers and retailers did not violate Missouri’s anti-discrimination laws by charging more for products marketed towards women as compared to allegedly identical products that were either marketed towards men or utilized gender-neutral marketing.  The Court found that the plaintiff “mistakes gender-based marketing for gender discrimination” and, in the process, ignores numerous differences between the products that account for the higher price tag.  There has been a handful of similar “pink tax” cases filed over the last year or two, but this is the first appellate court to rule on the issue. Continue Reading Dietary Supplement and Personal Care Products Regulatory and Litigation Highlights – May and June 2021

For our June review, the action stays largely in the litigation arena with vanilla getting thrown out and sustainability as well as settlements getting called into question.  Meanwhile, environmental and health stakeholders are pushing FDA to ban PFAS from food contact uses as many in industry move away from PFAS-containing packaging.  How to digest all of it?  Consider some yogurt.  FDA updated the standard of identity, making it more delicious than ever.  Let’s take a look….


Two More Vanilla Cases Get Thrown Out of the Food Court

In Robie v. Trader Joe’s Co., the Northern District of California dismissed claims that Trader Joe’s Almond Clusters cereal should have been labeled as “artificially flavored.”  The court held that, because the vanilla flavor came from both the vanilla plant and vanillin derived from tree bark, it was properly labeled as “Vanilla Flavored With Other Natural Flavors” under applicable FDA regulations and the plaintiff’s claims suggesting otherwise were preempted.  The court also found that the plaintiff had failed to allege facts suggesting that reasonable consumers would interpret “vanilla” on the product label to mean that the product’s flavor is derived exclusively from the vanilla plant, especially given that the challenged label did not contain any other words or pictures suggesting that the flavor was derived exclusively from the vanilla bean. Continue Reading Food Industry Regulatory and Litigation Highlights – June 2021

Over the last few months, a wave of consumers have filed putative class action complaints against a long list of consumer-facing website owners/operators and their software providers alleging invasion of privacy rights under statutes focused on wiretapping and eavesdropping.

Our team has represented both website and software defendants in these cases.  However, this post is not intended to reflect on any specific claim, website, or software.  Rather, our goal is to provide an introduction to the general nature of the consumer claims and current landscape of these litigations.

This post summarizes (1) the “session replay” technology at issue in these claims; (2) arguments presented by the Complaints; (3) an overview of common defenses; and (4) where things stand.  With that context, we then provide our list of practical considerations for the use of session replay software.      

What is “Session Replay” Software? 

A significant branch of the Software-as-a-Service (SaaS) industry has arisen to support website owners/operators in effectively maintaining and leveraging their consumer-facing websites.  These software products are generally scripts placed in the JavaScript of a given website to capture specific information related to a consumer’s interactions with a given page.  The software can capture consumers’ keystrokes and mouse movements to provide information on everything from broken links and error messages (to support IT teams), to heat maps showing website usage, to consumer information for validating consent to be contacted or agreement to receive products and services.

Despite how these products are often described, the software does not actually record the consumer’s session the way a security camera in a brick-and-mortar store would capture a consumer’s movements. Rather, it captures the consumer’s interactions with the website at regular intervals and allows those movements and data points to be laid over an existing image of the website so that owners/operators can review a recreation (or dramatization) of an individual consumer’s experience.  Continue Reading Privacy Litigation Trend: The Latest on Session Replay Lawsuits, and Practical Considerations for Risk Mitigation
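To make the interval-based sampling described above concrete, here is a minimal, purely illustrative sketch (not any vendor’s actual API; all names are hypothetical) of how a session-replay recorder might buffer discrete interaction events and group them into fixed time intervals for later playback over a static snapshot of the page:

```javascript
// Hypothetical sketch of a session-replay event buffer. A real product
// would hook DOM listeners (mousemove, keydown, etc.); here we model
// only the sampling/bucketing logic described in the text.
function createRecorder(intervalMs) {
  const events = [];
  return {
    // Store each interaction with a timestamp so it can later be
    // replayed in order over an image of the page.
    record(type, payload, timestamp) {
      events.push({ type, payload, t: timestamp });
    },
    // Group events into fixed intervals — sampling interactions at
    // regular intervals rather than recording continuous video.
    frames() {
      const buckets = new Map();
      for (const e of events) {
        const key = Math.floor(e.t / intervalMs);
        if (!buckets.has(key)) buckets.set(key, []);
        buckets.get(key).push(e);
      }
      return buckets;
    },
  };
}

// Example: one mouse movement and one keystroke land in different
// 100ms intervals of the reconstructed session.
const rec = createRecorder(100);
rec.record("mousemove", { x: 10, y: 20 }, 5);
rec.record("keydown", { key: "a" }, 150);
const frames = rec.frames();
```

The point of the sketch is the legal distinction the courts are weighing: what is stored is a series of discrete, timestamped data points that can be replayed as a recreation, not a literal video recording of the user.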

It has been a full year since the California Consumer Privacy Act (“CCPA”) took effect at the top of 2020. In the cases filed in the second half of the year, the complaints more frequently assert a violation of the CCPA as a standalone cause of action, though it remains common for a CCPA violation to be asserted as a predicate to support a separate cause of action, such as a violation of California’s Unfair Competition Law (“UCL”).

In this post, we include our round-up of representative cases filed in the third and fourth quarters of the year. Our prior summaries of CCPA-related litigation filed last year can be found in our Q1 2020 CCPA Litigation Round-Up and CCPA Litigation Round-Up: Q2 2020. We have separately analyzed trends emerging from the 2020 CCPA litigation landscape. Going forward into 2021, we will continue to report on relevant developments in CCPA consumer litigation, and also provide updates in our CCPA Litigation Tracker chart.

  1. Cases Filed in Q3/Q4 2020 Alleging Direct Violation of CCPA

Shadi Hayden v. The Retail Equation, Inc. et al., No. 8:20-cv-01203 (C.D. Cal.)

On August 3, a class action amended complaint was filed by thirteen named plaintiffs against The Retail Equation, Inc. (“TRE”) and a variety of retailers: Sephora USA, Inc., Advance Auto Body Parts, Inc., Bed Bath & Beyond, Inc., Best Buy Co., Inc., Buy Buy Baby, Inc., Caleres, Inc., CVS Health Corporation, Dick’s Sporting Goods, Inc., L Brands, Inc., Stein Mart, Inc., The Gap, Inc., The Home Depot, Inc., and The TJX Companies, Inc. (the “Defendant Retailers”) in the District Court for the Central District of California.  Plaintiffs’ CCPA claim alleges that the Defendant Retailers, without their customers’ knowledge or consent, collect large amounts of data about their retail customers, including: (1) “Consumer Commercial Activity Data,” which includes “the unique purchase, return, and/or exchange histories of individual consumers”; and (2) “Consumer ID Data,” which includes “the unique identification information contained on or within a consumer’s driver’s license, government-issued ID card, and/or passport” such as “the consumer’s name, date of birth, race, sex, photograph, complete street address, and zip code.” Plaintiffs allege that this data is shared with TRE as non-anonymized, individual data sets, which TRE processes to create consumer reports and a risk score for each customer. The risk score is allegedly used to advise the retailer about whether a customer’s attempted return or exchange is fraudulent or abusive.  The amended complaint alleges that “Defendants’ policies and practices failed to hold plaintiffs’ and Class members’ personal information secure by, for example, [the Retailer Defendants’ sharing of] the personal information . . . in an unsecured, unrestricted manner with TRE to create consumer reports and generate a ‘risk score’ that TRE then shared with other Defendant Retailers alongside other personal information.”

McCoy v. Alphabet, Inc. et al., 5:20-cv-05427 (N.D. Cal.)

On August 5, 2020, plaintiff Robert McCoy filed a class action complaint against defendants Alphabet Inc. and Google LLC for monitoring and collecting the sensitive personal data of Android Smartphone users when they interact with non-Google applications on their smartphones, without obtaining consent. This personal data includes the duration of time spent on non-Google apps and how frequently those apps are opened.  Plaintiff’s CCPA cause of action alleges that defendants failed to disclose that they collect the class members’ personal data and the true purpose for collecting the data, which plaintiff alleges is to gain a competitive edge over rival companies. Plaintiff’s proposed class definition includes “All Android Smartphone users from at least as early as January 1, 2014 through the present.”

On September 30, 2020, Google filed a Motion to Dismiss, including arguments that the CCPA claim fails because (1) plaintiff fails to allege his information was subject to a data breach; and (2) relief is only available to a consumer, which is defined as a “California resident,” and plaintiff is a New York resident.

Guzman v. RLI Corp. et al., No. 2:20-cv-08318 (C.D. Cal.)

On September 10, 2020, plaintiff Jose Guzman filed a class action complaint against defendants RLI Corp. and RLI Insurance Company alleging that defendants, through the Pacer filing service, disclosed the login credentials to computer systems containing personal and confidential information of class members. Plaintiff alleges that as a surety, defendants requested access to the records of Libre by Nexus, which secures bonds for detained undocumented immigrants. Plaintiff alleges that, in a separate suit, defendants disclosed Libre’s login credentials by filing them publicly, giving anyone with a Pacer login access to class members’ personal and confidential information including dates of birth, names of minor children, home address, Social Security Numbers, and taxpayer identification numbers and financial account information.

On October 22, 2020, defendants filed a Motion to Dismiss, including arguments that the CCPA claim fails because: (1) defendants’ access was court-authorized and therefore not unauthorized; (2) plaintiff failed to establish that there was a “violation of the duty to implement and maintain reasonable security procedures and practices”; and (3) plaintiff did not comply with the mandatory 30-day notice and cure provision. On November 6, 2020, the action was voluntarily dismissed without prejudice.

Gardiner v. Walmart Inc. et al., 4:20-cv-04618 (N.D. Cal.)

On July 10, 2020, plaintiff Lavarious Gardiner filed a class action complaint against retailer Walmart alleging that vulnerabilities on Walmart’s website led to breaches of Walmart’s systems, allowing hackers to steal customers’ personally identifiable information (including full names, addresses, financial account information, and credit card information), and allowed hackers to attack Walmart’s customers’ computers directly as well. The CCPA cause of action alleges that Walmart violated its duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the personal information. On October 29, 2020, the Parties stipulated to a briefing schedule on defendant’s Motion to Dismiss which is scheduled to be completed by February 3, 2021.

Flores-Mendez et al v. Zoosk, Inc. et al., 3:20-cv-04929 (N.D. Cal.)

On July 22, 2020, plaintiffs Juan Flores-Mendez and Amber Collins filed a class action complaint against Zoosk, Inc., an online dating site, and its parent company, Spark Networks SE, alleging that cybercriminals hacked and obtained 30 million of Zoosk’s users’ records, containing their name, email, date of birth, and password, due to Zoosk’s failure to maintain reasonable security controls and systems.  Plaintiffs only sought injunctive and equitable relief but alleged that if Zoosk could not cure the breach within 30 days of its July 14 notice letter, they intended to amend to seek actual and statutory damages. On October 30, 2020, plaintiffs filed an Amended Complaint.

Warshawsky et al v. cbdMD, Inc et al., No. 3:20-cv-00562 (W.D.N.C.)

On October 9, 2020, plaintiffs Michael Warshawsky and Michael Steinhauser filed a class action complaint against cbdMD Inc., and CBD Industries, LLC. Plaintiffs allege that due to two data breaches, hackers accessed consumers’ names, credit card numbers, CVV security codes, credit card expiration dates, addresses, email addresses, and bank account numbers. Plaintiffs’ CCPA cause of action alleges that defendants’ computer systems and data security practices were inadequate to safeguard its customers’ personal information.

Diczhazy et al v. Dickeys Barbecue Restaurants Inc. et al., No. 3:20-cv-2189 (C.D. Cal.)

On November 9, 2020, plaintiffs Ross Diczhazy and Wesley Etheridge II filed a class action complaint against Dickey’s Barbecue Restaurants Inc. and Dickey’s Capital Group, Inc. for their alleged failure to secure and safeguard the names, payment card numbers, and security codes of proposed class members in a data breach in violation of the CCPA. The complaint proposes two classes: (a) All California residents who made a purchase from Dickey’s using a payment card, or otherwise disclosed payment card information to Dickey’s, since January 1, 2020, and whose personal information was compromised including as part of the Joker’s Stash BlazingSun data set; and (b) All persons who made a purchase from Dickey’s using a payment card, or otherwise disclosed payment card information to Dickey’s, since January 1, 2018, and whose personal information was compromised including as part of the Joker’s Stash BlazingSun data set.

Marquez v. Dickey’s Barbecue Restaurants, Inc. et al., No. 3:20-cv-2251 (S.D. Cal.)

On November 18, 2020, plaintiff Jose Luis Marquez also filed a class action complaint against Dickey's Barbecue Restaurants Inc. and Dickey's Capital Group, Inc. for their alleged failure to secure and safeguard their customers' personal identifying information. As in Diczhazy (above), the complaint alleges a nationwide class as well as a California subclass: (a) all persons residing in the United States who made a credit or debit card purchase at any affected Dickey's Barbecue Pit restaurant during the period of the Data Breach; and (b) all persons residing in the State of California who made a credit or debit card purchase at any affected Dickey's Barbecue Pit restaurant during the period of the Data Breach.

Gitner v. U.S. Bank National Association et al., No. 0:20-cv-02101 (D. Minn.)

On November 20, 2020, plaintiff Barry Gitner filed a first amended class action complaint in the District of Minnesota against U.S. Bank National Association and U.S. Bancorp for their alleged failure to secure and safeguard the confidential, personally identifiable information of thousands of consumers, including names, account numbers, Social Security numbers, driver's license numbers, and dates of birth. Specifically, plaintiff alleges that a computer server containing consumer information was stolen from defendants' corporate offices. Under the CCPA cause of action, plaintiff seeks injunctive or other equitable relief but reserves his rights to amend the complaint to seek actual and statutory damages if the breach is not cured within 30 days. On January 13, 2021, after defendants' Motion to Compel Arbitration went unopposed, the Court stayed the action pending arbitration of plaintiff's individual claims.

Schaubach v. Hotels.Com, LP et al., No. 8:20-cv-2370 (C.D. Cal.)

On December 17, 2020, plaintiff Lauren Schaubach filed a class action complaint against defendants Hotels.com, L.P. (“HLP”), Expedia Group, Inc. (“Expedia”), and Amazon Web Services, Inc. (“AWS”) after a Cloud Hospitality server hosted by AWS, containing information for customers of HLP and Expedia, was hacked and tens of millions of data records were exposed, including full names, email addresses, ID numbers, phone numbers, credit card numbers, security codes, and expiration dates. Plaintiff seeks to represent a class of “all consumers in California whose personally identifiable information was compromised in the Breach.” On December 17, 2020, the action was voluntarily dismissed without prejudice.

  1. Cases Filed in Q3/Q4 2020 Alleging CCPA Violations As a Predicate For UCL Causes of Action

Pygin v. Bombas, LLC et al., No. 4:20-cv-04412 (N.D. Cal.)

On July 1, 2020, plaintiff Alex Pygin filed a class action complaint against defendants Bombas, LLC, Shopify (USA) Inc., and Shopify, Inc., alleging that sock and apparel retailer Bombas uses an e-commerce platform supplied by Shopify to collect customers' personal and payment information (including names, billing, shipping, and email addresses, along with credit card numbers, expiration dates, and security codes), and that this information was compromised in a data breach due to defendants' negligent and/or careless acts and omissions and failure to protect the data.

While plaintiff brings no claim directly under the CCPA, he alleges that class members have suffered injury including “deprivation of rights they possess under . . . the California Consumer Privacy Act” through defendants’ “failing to maintain reasonable security procedures and practices appropriate to the nature of the personally identifiable information.” As part of his causes of action for negligence and violation of the UCL, plaintiff alleges that defendants: (i) had a duty to take reasonable steps and employ reasonable methods of safeguarding the personally identifiable information of class members, as required under the CCPA; (ii) failed to maintain those reasonable security procedures and practices by storing the information in an unsecure electronic environment; and (iii) failed to disclose the data breach to class members in a timely and accurate manner, as required by the CCPA.

Currently pending before the Court is Shopify’s Motion to Dismiss for (1) lack of personal jurisdiction, (2) violation of FRCP 8 for failing to distinguish among defendants and adequately allege that Shopify caused harm, and (3) failure to state a claim, based partially on the argument that the CCPA does not “create any private right of action under any other law.”

Calixte et al. v. Dave, Inc., 2:20-cv-07704 (C.D. Cal.)

On August 24, 2020, five plaintiffs filed a class action complaint against defendant Dave Inc. alleging that its users' names, email addresses, dates of birth, physical addresses, phone numbers, and Social Security numbers were compromised as a result of a cyberattack against a former third-party service provider of Dave Inc. The complaint alleges that the hackers' ability to pivot from the third-party vendor's system to defendant's systems without detection demonstrates the lack of controls and cybersecurity measures in use at Dave Inc. to prevent such unauthorized access.

Plaintiffs allege violations of the CCPA only as a predicate to their UCL cause of action, based on Dave Inc.'s alleged failure to implement and maintain reasonable security measures. The proposed nationwide class is defined as “All persons whose PII was compromised as a result of the Data Breach announced by Dave Inc. in July and August of 2020.” While the parties were briefing defendant's Motion to Compel Arbitration, the action was voluntarily dismissed without prejudice on November 9, 2020.

Wesch v. Yodlee, Inc. et al., No. 3:20-cv-05991 (N.D. Cal.)

On August 25, 2020, plaintiff Deborah Wesch filed a class action complaint against defendants Yodlee, Inc. and Envestnet, Inc. (which acquired Yodlee), alleging that Yodlee sells highly sensitive financial data, such as bank balances and credit card transaction histories, collected from software products that it markets and sells to financial institutions. Plaintiff alleges that when individuals connect their bank accounts to PayPal, they upload their banking credentials using Yodlee's system. Yodlee then allegedly stores a copy of the credentials on its own system and exploits them, contrary to the disclosed use of the information.

Plaintiff’s UCL cause of action is predicated upon alleged violations of the CCPA, including its requirements that defendants: (i) disclose, at or before the point of collection, the categories of information to be collected and how they will be used; and (ii) refrain from collecting additional information for additional purposes without providing notice.

Plaintiff filed an Amended Complaint on October 21, 2020, and the parties have stipulated to a briefing schedule on defendants' anticipated Motion to Dismiss.

Conditi v. Instagram, LLC et al., No. 3:20-cv-06534 (N.D. Cal.)

On September 17, 2020, plaintiff Brittany Conditi brought a class action complaint against defendants Instagram, LLC and Facebook, Inc., alleging that Instagram constantly accesses users' smartphone cameras and monitors users without permission even when they are not interacting with the camera feature, going beyond the services it promises to provide. Plaintiff alleges that Instagram does this to collect valuable personal data and increase its advertising revenue.

Plaintiff’s UCL cause of action is based upon allegations that defendants violated the CCPA by failing to disclose that they monitor users through their smartphone cameras, while not in use, to collect personal information. Plaintiff proposes the following class definition: “All Instagram users whose smartphone cameras were accessed by Instagram without their consent from 2010 through the present (the ‘Class Period’).”
You can follow developments in CCPA-related cases by referring to our new CCPA Litigation Tracker. If you have any questions about defending and/or preparing for a potential privacy consumer class action, please reach out to our team.