The FTC took unprecedented action yesterday when it moved to impose what it describes as a “blanket prohibition” preventing Meta from monetizing young people’s data.  The FTC contends that this prohibition, set out in its proposed modified order (the “Proposed Order”), is warranted by Meta’s repeated violations of its 2020 consent order.

In taking this action, the FTC is relying on its administrative authority to “reopen and modify” orders to address alleged order violations, rather than pressing its compliance case in federal court under the FTC Act.  In doing so, the FTC seeks to significantly expand the scope and duration of the existing order to cover new conduct.  Even against recent examples of aggressive FTC action (see examples here, here, and here), this one markedly stands out.  And, in the face of mounting agency losses in challenges to its enforcement authority in Axon and AMG and their aftermath, the Proposed Order is extraordinary.

The Commission voted 3-0 to issue the Proposed Order and accompanying Order to Show Cause.  Commissioner Bedoya issued a statement expressing reservations about the “monetization” restrictions described below, specifically questioning whether the provision related to minors’ data is sufficiently related to either the 2012 or 2020 violations or orders.  Meta has 30 days to answer the FTC’s proposal.

Continue Reading FTC Attempts End Run to Ban Meta from “Monetizing” Minors’ Data

Just in time for the holidays, the FTC has released two companion settlements resolving allegations that Epic Games (maker of the popular video game Fortnite) violated the Children’s Online Privacy Protection Act (COPPA) and the FTC Act, with Epic to pay $520 million in penalties and consumer redress. The cases build on existing FTC law and precedent but add new dimensions that should interest a wide array of companies subject to FTC jurisdiction.

Notably, the first case alleges COPPA violations (compromising the privacy and safety of users under 13) but adds allegations that Epic violated teens’ privacy and safety, too. And the second case alleges unauthorized in-app purchases – not just by kids, which was the focus of earlier FTC cases, but by users of all ages. Both cases rely on unfairness theories in extending their reach. Both incorporate the (now ever-present) concept of dark patterns (generally defined as practices that subvert or impair user choice). And both got a 4-0 Commission vote, with a strong concurrence from Republican Commissioner Wilson explaining her support for the FTC’s use of unfairness here. Neither case names any individuals.  

The privacy case

The FTC’s privacy case alleges that, for over two years following Fortnite’s launch in 2017, Epic allowed kids to register with no parental involvement, and allowed kids and teens to play the game with features enabling them to communicate in real time with anyone on the platform. According to the FTC, these practices subjected kids and teens to bullying, harassment, threats, and “toxic” content, including “predators blackmailing, extorting, or coercing children and teens…into sharing explicit images or meeting offline for sexual activity.” Further, says the FTC, Epic knew about these problems, resisted fixing them and, when it finally took action, added controls that were hard to find and use, and failed to cure the violations.

Continue Reading Two Epic Cases from the FTC: Spotlight on COPPA, Unfairness, Teens, Dark Patterns, In-App Purchases, Cancellations, and More

As we recently blogged here, the FTC’s review of the COPPA rule has been pending for over three years, prompting one group of Senators, in early October, to ask the agency to “Please Update the COPPA Rule Now.” The FTC has not yet responded to that request (at least not publicly) or made any official moves towards resuming its COPPA review. However, the agency is focusing on children’s privacy and safety in other ways, including by hosting a virtual event on October 19 on “Protecting Kids from Stealth Advertising in Digital Media.”

The FTC’s day-long event examined how advertising that is “blurred” with other content online (“stealth advertising”) affects children. Among other things, the event addressed concerns that some advertising in the digital space – such as the use of influencers on social media, product placement in the metaverse, or “advergames” – can be deceptive or unfair because children don’t know that the content is an ad and/or can’t recognize the ad’s impact.

The event focused in particular on: (1) children’s capacity at different ages to recognize advertising content and distinguish it from other content; (2) harms resulting from the inability of children to recognize advertising; (3) what measures can be taken to protect children from blurred advertising content; and (4) the need for, and efficacy of, disclosures as a solution for children of different ages, including the format, timing, placement, wording, and frequency of disclosures. The FTC has also sought public comment on these topics (until November 18).

The event dove deeply into these issues, with help from a range of legal, policy, behavioral, and communications experts. (See here for the agenda and list of panelists.) The discussion was interesting and substantive, and built on actions already undertaken in Europe and California to develop Age-Appropriate Codes governing child-directed content. However, the event left open the question of whether and how the FTC intends to address the issues discussed. Will it proceed via guidance or rulemaking?  If rulemaking, does it plan to use COPPA, the pending Mag-Moss rulemaking on “commercial surveillance,” or some other regulatory vehicle?

All of these options present challenges: COPPA gives parents the tools to control the content that their children see, but generally doesn’t regulate the content itself. Mag-Moss is a long process, which the FTC has made especially complex with its sprawling ANPR. Finally, any rulemaking restricting kids’ advertising could run into the specific Mag-Moss provision (discussed here) limiting the FTC’s regulatory authority in this area. (On the other hand, protecting kids’ privacy and safety tends to be a bipartisan issue, which will assist the agency as it seeks to address these issues.)

Here’s more detail on what happened at the workshop:
Continue Reading Blurred Lines: A Rundown on the FTC Workshop “Protecting Kids from Stealth Advertising in Digital Media”

Amidst all of the recent news and developments about the privacy of kids and teens (including multiple Congressional hearings; Frances Haugen’s testimony; enactment of the UK’s and California’s Age Appropriate Design Codes; the Irish DPC’s GDPR decision against Instagram; numerous bills in Congress; and the FTC’s ongoing focus on kids’ privacy in policy statements, workshops, and its “commercial surveillance” rulemaking), the FTC still has a powerful tool that seems to be sitting on the back-burner: the Children’s Online Privacy Protection Act (COPPA) and its implementing rule.

But some members of Congress just wrote a letter to the FTC, asking it to make COPPA a priority.

Background on COPPA 

As most of our readers know, COPPA protects the privacy of kids under 13, mostly by requiring kid-directed websites or apps, or sites/apps that have actual knowledge they’re dealing with kids, to get parental permission before collecting, using, or sharing kids’ data.  Enacted in 1998, COPPA is now nearly 25 years old, a dinosaur in today’s fast-moving world of privacy.  However, using the APA rulemaking authority granted in COPPA, the FTC has amended its COPPA rule to ensure that it keeps pace with developments – for example, extending the rule to ad networks and plug-ins; adding geolocation, persistent identifiers, photos, and videos to the definition of “personal information”; and strengthening the rule’s requirements governing data security, retention, and deletion.

However, those updates to COPPA became final in 2013 – almost ten years ago – and the FTC hasn’t amended the rule since then.  Although the FTC initiated a rule review in July 2019, that review is still pending more than three years later. According to Regulations.gov, the Commission received over 176,000 public comments in the rule review.  That’s a lot of comments, but it surely can’t explain such a lengthy delay.
Continue Reading Congress to FTC: “Please Update the COPPA Rule Now”

The replay for our May 19, 2022 Teen Privacy Law Update webinar is available here.

Protecting the privacy and safety of kids and teens online is receiving enormous attention lately from Congress, the States, the FTC, and even the White House.  Further, just last month, BBB National Programs unveiled a Teenage Privacy Program Roadmap.

Lina Khan’s Privacy Priorities – Time for a Recap

Rumors suggest that Senator Schumer is maneuvering to confirm Alvaro Bedoya as FTC Commissioner sooner rather than later, which would give FTC Chair Khan the majority she needs to move forward on multiple fronts. One of those fronts is consumer privacy, for which Khan has announced ambitious plans (discussed here and here) that have stalled for lack of Commissioner votes. With Bedoya potentially on deck, now seems like a good time to recap those plans, as they might provide clues about what’s in the pipeline awaiting Bedoya’s vote. We focus here on three priorities Khan has emphasized in statements and interviews since becoming Chair.
Continue Reading Lina Khan’s Privacy Priorities – Time for a Recap

New Federal Bill to Protect Kids’ Privacy: Will This One Break Through?

Last October, we blogged that bipartisan momentum was building in Congress to enact stronger privacy protections for children, even if (and especially if) Congress remains stalled on broader federal privacy legislation. Of particular significance, we noted a strong push to protect, not just kids under 13 (the cutoff under COPPA), but also teens.

Since

Where to Find More Info on the FTC’s Top Rules for 2022

Last week, Jessica Rich wrote about the FTC’s rulemaking plans for 2022. Make sure you read that post for a detailed analysis of what the Commission is planning. As we looked at which of those topics have generated the most interest on Ad Law Access recently, we wanted to point you to

On December 13, the New Mexico Attorney General announced a settlement with Google to resolve claims regarding children’s privacy, including in the burgeoning EdTech space. The federal lawsuits Balderas v. Tiny Lab Productions, et al. and Balderas v. Google LLC, respectively, alleged COPPA and privacy violations related to collection of children’s information on