Privacy and Information Security

The Federal Communications Commission (“FCC” or “Commission”) is seeking comments on a Notice of Proposed Rulemaking (NPRM) to refresh its customer proprietary network information (“CPNI”) data breach reporting requirements (the “Rule”).  Adopted earlier this month by a unanimous 4-0 vote of the Commission, the NPRM solicits comments on rule revisions that would expand the scope of notification obligations and accelerate the timeframe to notify customers after a data breach involving telephone call detail records and other CPNI.  The FCC cites “an increasing number of security breaches of customer information” in the telecommunications industry in recent years and the need to “keep pace with today’s challenges” and best practices that have emerged under other federal and state notification standards as reasons to update the Rule.

Under the current Rule, a “breach” occurs when a person, “without authorization or exceeding authorization, has intentionally gained access to, used, or disclosed CPNI.”  As summarized in the NPRM, CPNI includes “phone numbers called by a consumer, the frequency, duration, and timing of such calls, the location of a mobile device when it is in active mode (i.e., able to signal its location to nearby network facilities), and any services purchased by the consumer, such as call waiting.”  (The NPRM does not propose any changes to the definition of CPNI.)

Continue Reading FCC Seeks Comments on Updates to CPNI Breach Reporting Rule

While State Attorneys General have been clear that social media companies are generally on their radar for a variety of consumer protection concerns, TikTok has been the latest to make headlines in recent weeks. For example, multiple states have banned TikTok from government phones, and a federal government ban may soon follow, because of concerns about the Chinese government’s control over the platform. Higher education institutions are also banning the app, with Georgia’s public colleges and the University of Oklahoma among the most recent to do so.

Earlier this month, Indiana Attorney General Todd Rokita filed two complaints against TikTok based on two very different theories, one of which includes a foreign component. Both cases allege violations of the state’s Deceptive Consumer Sales Act.

Representations About Audience-Appropriateness

In its first complaint, the state alleges that TikTok makes a variety of misleading representations and omissions to claim a “12+” rating on the Apple App Store and a “T” for “Teen” rating in the Google Play and Microsoft Stores. Instead, the state asserts, TikTok should self-report ratings of “17+” (in the App Store) and “M” for “Mature” (in the Google Play and Microsoft Stores). To support these allegations, the state details the frequency and severity of alcohol, tobacco, and drug content, sexual content, nudity, mature/suggestive themes, and profanity on the TikTok platform, which it claims are much more frequent and severe than what TikTok self-reports. In its filings, the state also includes an affidavit by outside counsel detailing the mature content she recorded posing as a teen user and the number of views related to specific videos.

In addition, the complaint takes aim at TikTok’s “Restricted Mode,” a feature which can be used by parents to limit inappropriate content. The state alleges that much of the mature content described in its complaint was also accessible in Restricted Mode. Moreover, the complaint alleges that TikTok actually suggests mature content through the functionality of its “Autocomplete” search feature and by including content on a user’s personalized “For You” page.

Continue Reading It May Be Time for TikTok to Change its Ways if State AGs Have Any Say

Just in time for the holidays, the FTC has released two companion settlements resolving allegations that Epic Games (maker of the popular video game Fortnite) violated the Children’s Online Privacy Protection Act (COPPA) and the FTC Act, with Epic to pay $520 million in penalties and consumer redress. The cases build on existing FTC law and precedent but add new dimensions that should interest a wide array of companies subject to FTC jurisdiction.

Notably, the first case alleges COPPA violations (compromising the privacy and safety of users under 13) but adds allegations that Epic violated teens’ privacy and safety, too. And the second case alleges unauthorized in-app purchases – not just by kids, which was the focus of earlier FTC cases, but by users of all ages. Both cases rely on unfairness theories in extending their reach. Both incorporate the (now ever-present) concept of dark patterns (generally defined as practices that subvert or impair user choice). And both got a 4-0 Commission vote, with a strong concurrence from Republican Commissioner Wilson explaining her support for the FTC’s use of unfairness here. Neither case names any individuals.  

The privacy case

The FTC’s privacy case alleges that, for over two years following Fortnite’s launch in 2017, Epic allowed kids to register with no parental involvement, and allowed kids and teens to play the game with features enabling them to communicate in real time with anyone on the platform. According to the FTC, these practices subjected kids and teens to bullying, harassment, threats, and “toxic” content, including “predators blackmailing, extorting, or coercing children and teens…into sharing explicit images or meeting offline for sexual activity.” Further, says the FTC, Epic knew about these problems, resisted fixing them and, when it finally took action, added controls that were hard to find and use, and failed to cure the violations.

Continue Reading Two Epic Cases from the FTC: Spotlight on COPPA, Unfairness, Teens, Dark Patterns, In-App Purchases, Cancellations, and More

2022 was a remarkable year for privacy. Utah and Connecticut enacted new privacy laws. California and Colorado launched detailed (and continuing) privacy rulemakings. Congress proposed a landmark bipartisan, bicameral federal privacy bill (the American Data Privacy and Protection Act, or ADPPA). And the FTC initiated a sweeping privacy rulemaking under its Section 18 (Mag-Moss) rulemaking authority.

As if that weren’t enough, the US and EU announced a new Transatlantic Data Transfer Framework. We saw aggressive enforcement of UDAP and privacy laws at the federal and state levels. California passed an Age Appropriate Design Code (similar to the UK’s), while Congress proposed multiple kids’ privacy bills. And, amidst all of this, “dark patterns” and “surveillance” shot to the top of the privacy lexicon.   

2023 promises to be just as active, with further twists and turns on all of the above. Notably, the five new state privacy laws we’ve all been awaiting and planning for will take effect at various points in 2023. Further, other states may join the fray, enacting their own laws. If 2022 was the year that regulators and companies spent positioning themselves on the field, 2023 will be the year the balls start flying.  

We’ll be blogging on all of this in 2023 but, for now, we want to highlight some issues we’re watching with particular interest.   

Continue Reading What privacy issues are on deck for 2023?  Here are some of the most interesting ones

On Thursday, November 10th, the Colorado Attorney General’s Office held the first of three stakeholder meetings on its Colorado Privacy Act draft rules. The initial meeting covered Universal Opt Out Mechanisms (UOOMs) and consumer rights. Pre-registered participants were given three minutes to present on each topic. AG staff then posed a variety of

Early this week, a coalition of 40 attorneys general obtained two multistate settlements with Experian concerning data breaches it experienced in 2012 and 2015 that compromised the personal information of millions of consumers nationwide. The 2012 breach investigation was co-led by the Massachusetts and Illinois AG offices, and the 2015 investigation was co-led by the AGs of Connecticut, DC, Illinois, and Maryland. An additional settlement was reached with T-Mobile in connection with the 2015 Experian breach, which impacted more than 15 million individuals who submitted credit applications with T-Mobile.

In an effort to change corporate behavior, both settlements require Experian and T-Mobile to enhance their data security practices and to pay a combined amount of more than $16 million. Experian has agreed to bolster its due diligence and data security practices by adhering to the following:
Continue Reading AG Settlements Call for Stronger Data Security

Just two months before the effective date (January 1, 2023) of the California Privacy Rights Act (“CPRA”), the California Privacy Protection Agency (“CPPA”) Board met on October 28 and 29 to discuss revisions to the agency’s initial draft CPRA regulations.  Board members discussed a range of proposed changes that could significantly impact businesses but also reserved discussion on important topics, such as employee and business-to-business data, for future proceedings.

This post provides further details about the rulemaking process, as well as takeaways from the Board’s discussion of key substantive topics, such as restrictions on the collection of personal information and opt-out preference signals.  The Board directed CPPA staff to consider and include specific modifications, as discussed below; and on November 3, the CPPA released a further revision of its proposed rules for a 15-day public comment period (the “November 3 Draft Regulations”).  The deadline to submit comments is 8:00 am on Monday, November 21.
Continue Reading CPRA Rule Revisions Unlikely to be Finalized in 2022

As we recently blogged here, the FTC’s review of the COPPA rule has been pending for over three years, prompting one group of Senators, in early October, to ask the agency to “Please Update the COPPA Rule Now.” The FTC has not yet responded to that request (at least not publicly) or made any official moves towards resuming its COPPA review. However, the agency is focusing on children’s privacy and safety in other ways, including by hosting a virtual event on October 19 on “Protecting Kids from Stealth Advertising in Digital Media.”

The FTC’s day-long event examined how advertising that is “blurred” with other content online (“stealth advertising”) affects children. Among other things, the event addressed concerns that some advertising in the digital space – such as the use of influencers on social media, product placement in the metaverse, or “advergames” – can be deceptive or unfair because children don’t know that the content is an ad and/or can’t recognize the ad’s impact.

The event focused in particular on: (1) children’s capacity at different ages to recognize advertising content and distinguish it from other content; (2) harms resulting from the inability of children to recognize advertising; (3) what measures can be taken to protect children from blurred advertising content; and (4) the need for, and efficacy of, disclosures as a solution for children of different ages, including the format, timing, placement, wording, and frequency of disclosures. The FTC has also sought public comment on these topics (until November 18).

The event dove deeply into these issues, with help from a range of legal, policy, behavioral, and communications experts. (See here for the agenda and list of panelists.) The discussion was interesting and substantive, and built on actions already undertaken in the UK and California to develop Age-Appropriate Codes governing child-directed content. However, the event left open the question of whether and how the FTC intends to address the issues discussed. Will it proceed via guidance or rulemaking?  If rulemaking, does it plan to use COPPA, the pending Mag-Moss rulemaking on “commercial surveillance,” or some other regulatory vehicle?

All of these options present challenges: COPPA gives parents the tools to control the content that their children see, but generally doesn’t regulate the content itself. Mag-Moss is a long process, which the FTC has made especially complex with its sprawling ANPR. Finally, any rulemaking restricting kids’ advertising could run into the specific Mag-Moss provision (discussed here) limiting the FTC’s regulatory authority in this area. (On the other hand, protecting kids’ privacy and safety tends to be a bipartisan issue, which will assist the agency as it seeks to address these issues.)

Here’s more detail on what happened at the workshop:
Continue Reading Blurred Lines: A Rundown on the FTC Workshop “Protecting Kids from Stealth Advertising in Digital Media”

Amidst all of the recent news and developments about the privacy of kids and teens (including multiple Congressional hearings; Frances Haugen’s testimony; enactment of the UK’s and California’s Age Appropriate Design Codes; the Irish DPC’s GDPR decision against Instagram; numerous bills in Congress; and the FTC’s ongoing focus on kids’ privacy in policy statements, workshops, and its “commercial surveillance” rulemaking), the FTC still has a powerful tool that seems to be sitting on the back-burner: the Children’s Online Privacy Protection Act (COPPA) and its implementing rule.

But some members of Congress just wrote a letter to the FTC, asking it to make COPPA a priority.

Background on COPPA 

As most of our readers know, COPPA protects the privacy of kids under 13, mostly by requiring kid-directed websites or apps, or sites/apps that have actual knowledge they’re dealing with kids, to get parental permission before collecting, using, or sharing kids’ data.  Enacted in 1998, COPPA is now nearly 25 years old, a dinosaur in today’s fast-moving world of privacy.  However, using the APA rulemaking authority granted in COPPA, the FTC has amended its COPPA rule to ensure that it keeps pace with developments – for example, extending the rule to ad networks and plug-ins; adding geolocation, persistent identifiers, photos, and videos to the definition of “personal information”; and strengthening the rule’s requirements governing data security, retention, and deletion.

However, those updates to COPPA became final in 2013 – almost ten years ago – and the FTC hasn’t amended the rule since then.  Although the FTC initiated a rule review in July 2019, that review is still pending more than three years later. According to Regulations.gov, the Commission received over 176,000 public comments in the rule review.  That’s a lot of comments, but it surely can’t explain such a lengthy delay.
Continue Reading Congress to FTC: “Please Update the COPPA Rule Now”

Join us on Thursday for a webinar discussing how to operationalize adtech privacy compliance, and learn about other ways you can stay informed.

Operationalizing Adtech Privacy Compliance: Understanding the IAB Multi-State Privacy Agreement

State privacy laws that go into effect in 2023 will significantly change the digital advertising landscape.  These privacy laws require companies to