Jessica Rich

On June 27, 2023, “online marketplaces” (i.e., online selling platforms like Amazon and eBay) will have some brand-new obligations. So will many of the third-party sellers that operate on these platforms.

That’s because, tucked away on pages 2800-2819 of last year’s 4,000-plus-page Omnibus Appropriations Bill (between provisions addressing furniture tip-overs and Tribal swimming pools), is legislation requiring the marketplaces to collect and verify certain information from “high-volume third party sellers,” suspend sellers that fail to comply, and disclose the sellers’ contact information to purchasers.

The new law (the Integrity, Notification, and Fairness in Online Retail Marketplaces for Consumers Act, or the INFORM Consumers Act) charges the Federal Trade Commission (FTC), the state Attorneys General (AGs), and “other state officials” with enforcement; gives the FTC rulemaking authority; and authorizes substantial civil penalties for violations. The law was the result of a bipartisan effort led by Senators Durbin and Cassidy, as well as Representatives Schakowsky and Bilirakis, who remain invested in its success. (Note that Durbin and Schakowsky both mentioned INFORM at recent Congressional hearings.)     

Continue Reading New Law Governing Online Platforms And Sellers Takes Effect In June – Are You Ready?

Last week, in its most high-profile effort yet to focus attention on data privacy and security, the House Committee on Energy & Commerce held a hearing with TikTok’s CEO Shou Zi Chew.   The full-Committee hearing was high drama, with sharp statements and accusations about TikTok’s connections to the Chinese government, wide attendance by Committee members, and extensive press coverage during the hearing and afterwards. Some members (notably Chairwoman Cathy McMorris Rodgers) called for TikTok to be banned from the U.S., while others asked pointed questions without committing to support a ban. Members also used the opportunity to push for federal privacy legislation (and specifically the bipartisan ADPPA), which they said would help to address the dangers posed by Big Tech companies like TikTok.

Overall, the hearing did a far better job of illuminating members’ concerns than of gathering information. Many questions were too broad, complex, or accusatory to be answered in a “yes” or “no” fashion (as Committee members frequently requested). And at times, Chew was simply evasive. Nevertheless, the hearing highlighted, once again, bipartisan concerns surrounding TikTok, national security, children’s safety, and privacy.

Continue Reading Is Time Really Up for TikTok? – Details from the House Committee Hearing with TikTok CEO Shou Zi Chew

For anyone planning to attend the ABA Antitrust Spring Meeting in Washington, DC this week (March 29-31), please look for your friends from Kelley Drye Ad Law on multiple panels on Wednesday and Thursday:

ABBY STEMPSON (Special Counsel in the Ad Law and State AG practices) will be speaking on a panel entitled Fundamentals –

As we’ve described here, the Senate made major strides last year on legislation to protect children’s privacy and safety online. Indeed, two bipartisan bills sailed through a Commerce Committee markup, though they didn’t ultimately make it to the floor for a Senate vote. This year, kids’ privacy is once again getting attention, beginning with a February 14 Senate Judiciary Committee hearing on the issue. Members used the hearing to tout last year’s bills and mention some new ones, too. They also touched on other top-of-mind issues involving the tech industry, such as Section 230 reform and encryption.   

Of note, Senators Blumenthal and Blackburn discussed the Kids Online Safety Act (KOSA) (their bill from last year, just re-introduced), which would impose a “duty of care” on tech companies and shield young people from harmful content. Senator Hawley, in turn, talked up his Making Age-Verification Technology Uniform, Robust, and Effective Act (MATURE Act), which would enforce a minimum age requirement of 16 for users of social media platforms. (As noted below, panelists were quite skeptical that this would work.) 

The event highlighted, once again, the bipartisan interest in tackling the harms that minors face online. Here’s more detail on what happened:  

First up, opening remarks from Chairman Durbin (D-Ill.), Ranking Member Graham (R-S.C.), and Senators Blumenthal (D-Conn.) and Blackburn (R-Tenn.)

Chairman Durbin kicked off the hearing by explaining that the internet and social media have become a threat to young people.  He noted that while the Internet offers tremendous benefits, cyberbullies can hurt kids online via platforms like Facebook and Snapchat. Durbin stated that “we don’t have to take” the lucrative business that the platforms (who were not in attendance) have created to keep kids’ eyes glued to the screens. He said that the addictive nature of the platforms has created a mental health crisis – causing anxiety, stress, and body image issues, for example – which can lead to tragic results.

Continue Reading Senate Judiciary Hearing on Kids’ Privacy – Sunny with a Chance of Section 230 Reform

For the 26+ years I served at the FTC, the agency always described itself as a “law enforcement agency,” not a “regulator.” That’s because the FTC spent most of its resources on enforcing the FTC Act and other laws passed by Congress, not creating new regulations on its own. While it would be an exaggeration to say that the FTC has become a regulator in the mold of the federal banking agencies or CFPB, Chair Khan is certainly pushing the FTC in that direction. Indeed, the agency’s rulemaking activity has dramatically increased under her tenure.    

From “Whack-a-Mole” to “Rule-a-Palooza”

What explains the change? For one thing, the FTC majority believes that the FTC’s former way of operating (which it often describes as “case-by-case enforcement” or even “whack-a-mole”) hasn’t adequately protected consumers and competition, warranting the creation of stricter, broader rules for the entire marketplace. For another, in the wake of the Supreme Court’s decision in AMG (holding that the FTC can’t obtain monetary relief under Section 13(b)), the FTC is increasingly relying on other legal tools to get money – notably, alleging rule violations wherever possible, which enables the FTC to seek civil penalties and/or consumer redress. Hence the desire for more rulemaking, or what Commissioner Wilson has described (in strongly worded dissents) as a “Rule-a-Palooza.”   

Continue Reading Is the FTC a “Regulator”?  It Sure Seems to be Moving in that Direction  

By now, most of our readers have likely heard about the FTC’s proposed rule to ban noncompete clauses in employment contracts, including from Kelley Drye’s other posts on the topic discussing the sheer breadth of the proposal and the potential implications for employers.  In this post, we zero in on an issue that merits a lot more attention than it’s getting – namely, the serious legal and practical questions that the FTC’s proposal raises.  

Brief recap of how we got here and what the rule would require

This is the first of many rulemakings that the FTC has said it will launch based on its supposed authority to issue rules banning “unfair methods of competition” (“UMCs”) under the FTC Act. Notably, starting with a statement of regulatory priorities submitted to OMB in December 2021, the FTC has said repeatedly that it may launch multiple competition rulemakings based on this authority (as well as multiple consumer protection rulemakings based on its Magnuson-Moss authority, which it has done). More recently, the FTC issued a policy statement taking an expansive view of what constitutes a UMC, so the scope of the FTC’s intended reach here could be very broad indeed.

Continue Reading The FTC’s Proposal to Ban Noncompetes is on Shaky Legal Ground

Just in time for the holidays, the FTC has released two companion settlements resolving allegations that Epic Games (maker of the popular video game Fortnite) violated the Children’s Online Privacy Protection Act (COPPA) and the FTC Act, with Epic to pay $520 million in penalties and consumer redress. The cases build on existing FTC law and precedent but add new dimensions that should interest a wide array of companies subject to FTC jurisdiction.

Notably, the first case alleges COPPA violations (compromising the privacy and safety of users under 13) but adds allegations that Epic violated teens’ privacy and safety, too. And the second case alleges unauthorized in-app purchases – not just by kids, which was the focus of earlier FTC cases, but by users of all ages. Both cases rely on unfairness theories in extending their reach. Both incorporate the (now ever-present) concept of dark patterns (generally defined as practices that subvert or impair user choice). And both got a 4-0 Commission vote, with a strong concurrence from Republican Commissioner Wilson explaining her support for the FTC’s use of unfairness here. Neither case names any individuals.  

The privacy case

The FTC’s privacy case alleges that, for over two years following Fortnite’s launch in 2017, Epic allowed kids to register with no parental involvement, and allowed kids and teens to play the game with features enabling them to communicate in real time with anyone on the platform. According to the FTC, these practices subjected kids and teens to bullying, harassment, threats, and “toxic” content, including “predators blackmailing, extorting, or coercing children and teens…into sharing explicit images or meeting offline for sexual activity.” Further, says the FTC, Epic knew about these problems, resisted fixing them and, when it finally took action, added controls that were hard to find and use, and failed to cure the violations.

Continue Reading Two Epic Cases from the FTC: Spotlight on COPPA, Unfairness, Teens, Dark Patterns, In-App Purchases, Cancellations, and More

Since Lina Khan took the reins of the FTC, the agency has launched five new rulemakings under its Section 18 (“Mag-Moss”) authority – specifically, rules to combat government and business impersonation scams, deceptive earnings claims, “commercial surveillance,” deceptive endorsements, and “junk fees.” (I’m excluding here revisions to existing Mag-Moss rules, as well

In a case that will likely resonate with many readers, the FTC’s recent settlement with Vonage describes in excruciating detail the obstacles and costs that Vonage allegedly imposed on consumers when they tried to cancel their phone service.  In many ways, it’s a typical FTC case involving deception, unauthorized charges, and misuse of a “negative option” that makes it simple to sign up and almost impossible to cancel.  However, the FTC’s characterization of the practices as “dark patterns,” coupled with some other features, make this case stand out.  Indeed, any company with a “customer retention strategy” (which is apparently what this was) would be wise to pay attention.

The FTC’s Complaint  

According to the FTC’s complaint, Vonage provides internet-based phone service (known as Voice over Internet Protocol, or VoIP) to consumers and small businesses. Monthly charges range from $5 to $50 for individual customers and can reach thousands of dollars for small businesses. In many cases, Vonage signs up consumers using a negative option plan that requires them to cancel by a certain date to avoid being charged.

The complaint alleges that, between 2017 and 2022, Vonage provided several ways to sign up for its plans (including online and via a toll-free number) but made cancellation much more difficult through numerous hurdles. It also alleges that, in some cases, monthly fees continued after cancellation; consumers were charged (or threatened with) undisclosed early termination fees (ETFs); and Vonage provided only partial refunds or no refunds at all. The complaint says that this was all part of a “customer retention strategy” that Vonage pursued despite hundreds of consumer complaints, awareness of the problems among its employees, and an earlier settlement with 32 states over similar allegations.

According to the complaint, these practices violated the Restore Online Shoppers’ Confidence Act (ROSCA) (by failing to disclose material terms, obtain informed consent before imposing charges, and provide a simple mechanism to stop recurring charges) and Section 5 of the FTC Act (by charging consumers without their express informed consent).

Continue Reading The FTC’s case against Vonage – Customer Service Nightmare as “Dark Patterns”