
Businesses often include mandatory arbitration clauses in their pre-dispute dealings with customers to prevent costly consumer class actions in favor of streamlined (often individual) arbitration.  The Federal Arbitration Act (“FAA”) makes such arbitration agreements “valid, irrevocable, and enforceable, save upon such grounds as exist at law or in equity for the revocation of any contract.”  Relying on the FAA, the Supreme Court has defended business enforcement of such clauses against state- and judge-made exceptions.  For example, the Supreme Court has held that the FAA preempts state laws that pose obstacles to its enforcement, prevents courts from invalidating an arbitration agreement on the basis of the cost to arbitrate exceeding the potential recovery, and requires courts to enforce contractual provisions that delegate to an arbitrator the determination of whether an arbitration agreement applies to a dispute.  As a result, the existence and enforcement of mandatory, individual arbitration agreements have become more commonplace in consumer-facing industries.

Democratic senators are seeking to change this by introducing a bill that would narrow the FAA by prospectively barring pre-dispute arbitration agreements and class-action waivers in consumer, employment, antitrust, and civil rights disputes.  In these four areas, the proposed legislation, entitled The Forced Arbitration Injustice Repeal Act of 2019 (the “FAIR Act”), S. 635, H.R. 1423, would also override agreements to have arbitrators determine arbitrability or the FAIR Act’s applicability to the dispute, opting instead for courts to determine these issues under federal law.

The FAIR Act likely faces the same Republican opposition that defeated similar proposals, including the versions of the “Arbitration Fairness Act” that were rejected from 2007 through 2018.  Although the FAIR Act may garner attention in the current political climate, in large part due to its employment-related provisions, it likely faces an uphill battle given the Republican-controlled Senate and White House. But it’s a bill worth keeping an eye on.  We’ll continue to post updates on any key developments.

The FTC recently announced a $5.7 million settlement with app developer for COPPA violations associated with its app (now known as TikTok)—the agency’s largest COPPA penalty since the statute’s enactment. The agency charged the company, whose app allows users to create and share videos of themselves lip-syncing to music, with unlawfully collecting personal information from children.

To create a TikTok profile, users must provide contact information, a short bio, and a profile picture. According to the FTC, between December 2015 and October 2016, the company also collected geolocation information from app users. In 2017, the app started requiring users to provide their age, although it did not require current users to update their accounts with their age. By default, accounts were “public,” allowing users to see each other’s bios (which included their grade or age). It also allowed users to see a list of other users within a 50-mile radius, and gave users the ability to direct message other users. Many of the songs available on the app were popular with children under 13.

The FTC further alleged that received thousands of complaints from parents asserting that their child had created an app account without their knowledge (including one two-week period in which the company received more than 300 such complaints). The agency also noted that while the company closed the children’s accounts in response, it did not delete the users’ videos or profile information from its servers.

 
The FTC’s complaint focused on practices spanning 2014 through 2017. was acquired by ByteDance Ltd. in December 2017, and the app was merged into TikTok in August 2018.

COPPA identifies specific requirements for operators who collect personal information from children under 13, including obtaining consent from parents prior to collection and providing information about collection practices for children’s data. Online services subject to the rule generally fall into two categories: (1) sites that are directed to children and collect personal information from them; and (2) general audience sites that have actual knowledge that they are collecting personal information from children. Civil penalties for violations of COPPA can be up to $41,484 per violation.

According to the FTC,’s app fell into both categories:

  1. The company included music and other content appealing to children on the app. For example, many of the songs included on the app were popular with children under 13, and the app used “colorful and bright emoji characters” that could appeal to children.
  2. Once the company began collecting users’ ages, it had actual knowledge that some of its users were under the age of 13. In spite of this, the company did not obtain consent from those users’ parents or comply with other COPPA requirements.

FTC Commissioners Chopra and Slaughter issued a joint statement on the settlement, pointing out that FTC staff had uncovered disturbing practices of a company willing to pursue growth at the expense of endangering children. They also noted that previously, FTC investigations typically focused on individual accountability in limited circumstances, rather than pursuing broader enforcement against company leaders for widespread company practices. The Commissioners further indicated that as the FTC continues to pursue legal violations going forward, it is time to “prioritize uncovering the role of corporate officers and directors” and to “hold accountable everyone who broke the law.”

This settlement indicates that the FTC continues to prioritize privacy enforcement—particularly where vulnerable audiences, such as children, are involved. Future FTC enforcement actions could signal an expanded approach to individual liability, including with respect to larger companies.

The case is also a good reminder of the value in performing robust privacy due diligence when considering acquiring an entity, and meaningfully assessing the risk of a company’s data practices before adding them to the portfolio. A widely popular business with significant data assets may not look as attractive once civil penalties and injunctive terms are added to the mix.

The Federal Trade Commission (FTC) announced this week that it is seeking comments on proposed amendments to the Privacy Rule and Safeguards Rule under the Gramm-Leach-Bliley Act (GLBA).  These two rules outline obligations for financial institutions to protect the privacy and security of customer data in their control.  While the proposed changes to the Privacy Rule are modest, the expansive list of specific cyber controls proposed for the Safeguards Rule is material and could impose a new de facto minimum security standard that implicates many businesses, including those outside the coverage of the Rule.

Privacy Rule

The Privacy Rule, which went into effect in 2000, requires a financial institution to inform customers about its information-sharing practices and allow customers to opt out of having their information shared with certain third parties. Changes to the Dodd-Frank Act in 2010 transferred the majority of the FTC’s rulemaking authority for the Privacy Rule to the Consumer Financial Protection Bureau.  Only certain motor vehicle dealers are still subject to FTC rulemaking under the Privacy Rule.  To address these changes, the proposed amendments would remove from the Rule examples of financial institutions that are no longer subject to FTC rulemaking authority, and provide clarification to motor vehicle dealers regarding the annual privacy notices.

Safeguards Rule

The Safeguards Rule, which went into effect in 2003, requires financial institutions to develop, implement, and maintain comprehensive information security programs to protect their customers’ personal information. Currently, the Safeguards Rule emphasizes a process-based approach that is flexible in how the program is implemented so long as it meaningfully addresses core components, and where the safeguards address foreseeable internal and external cyber risks to customer information.

The proposed amendments to the Safeguards Rule would still follow a process-based approach but add significantly more specific requirements that must be addressed as part of the company’s information security program. These include, for example:

  • Appointing a Chief Information Security Officer (CISO), i.e., a qualified individual responsible for overseeing, implementing, and enforcing the information security program. The CISO can be an employee, an affiliate, or a service provider, but if the latter, additional requirements apply;
  • More specificity in what the required information security program’s risk assessments involve;
  • More specificity in what is required as part of a company’s access controls for their information systems;
  • Updating risk assessments and resulting safeguards concerning a company’s data and system identification and mapping;
  • Encrypting all customer information both in storage and in transit over external networks, or implementing alternative compensating controls that are reviewed and approved by the company’s CISO;
  • Adopting secure development practices for in-house developed applications that handle customer information;
  • Implementing multi-factor authentication for any individual with access to customer information or internal networks that contain customer information (unless the CISO approves a compensating control);
  • Including audit trails that detect and respond to security events;
  • Implementing change management procedures;
  • Implementing safeguards that both monitor authorized activity and detect unauthorized activity involving customer information;
  • Regular testing of the effectiveness of the information security program’s key controls, systems, and procedures, including continuous monitoring or annual penetration testing and biannual vulnerability assessments;
  • Establishing a written incident response plan that addresses goals, outlines the internal processes for incident response, defines clear roles, responsibilities and levels of decision-making authority, identifies external and internal communications and information sharing, identifies requirements for the remediation of identified weaknesses in information systems and controls, addresses the documentation and reporting of security events and related incident response activities, and the evaluation and revision of the program, as needed post-incident;
  • Requiring the CISO to at least annually report to the board of directors or equivalent governing body on the status of the information security program, the company’s compliance with the Safeguards Rule, and material matters related to the information security program.
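For technical readers, the multi-factor authentication bullet above can be made concrete with a minimal sketch of a time-based one-time password (TOTP) generator, the mechanism behind most authenticator apps used as a second factor. This is purely illustrative and is not prescribed by the proposed Rule; it follows RFC 6238 (which builds on RFC 4226) using only the Python standard library, and the function names are our own.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret, counter, digits=6):
    """RFC 4226 HOTP: HMAC-SHA1 over a big-endian 8-byte counter,
    dynamically truncated to the requested number of decimal digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # low nibble of the last byte picks a 4-byte window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HOTP computed over the current 30-second time step."""
    now = time.time() if for_time is None else for_time
    return hotp(secret, int(now // step), digits)

# RFC 6238 test vector: SHA-1, shared secret "12345678901234567890", time = 59
print(totp(b"12345678901234567890", for_time=59, digits=8))  # 94287082
```

In a real deployment, the shared secret would be provisioned per user and codes verified server-side with a small tolerance window; off-the-shelf MFA products handle this and more, and the proposed Rule does not mandate any particular mechanism.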

The proposed modifications would exempt small businesses (financial institutions that maintain customer information concerning fewer than five thousand consumers) from some of the Safeguard Rule’s requirements.

In addition, the proposed modifications would expand the definition of “financial institution” to include entities engaged in activities that the Federal Reserve Board determines to be incidental to financial activities (e.g., “finders” that bring together buyers and sellers of a product or service), and incorporate the definition of this term directly in the Safeguards Rule, instead of by reference based on the Privacy Rule.

Two Republican-appointed Commissioners, Noah Phillips and Christine Wilson, dissented from the proposed amendments, arguing that mandating such prescriptive standards for all market participants may not be appropriate. They maintained that issuing guidance for companies would be a better approach than one-size-fits-all amendments that every covered company would have to follow. The Commissioners also noted that the proposed amendments are modeled on the New York State Department of Financial Services cyber regulations, which are too new for the FTC to evaluate for impact or efficacy.  They further expressed concern about the rigidity these new requirements would impose on what is now a flexible approach, and about whether the amendments would substitute the Commission’s judgment for a company’s own governance decisions on board engagement, hiring and training, and accountability design, among other controls.


While the proposed amendments are limited to financial institutions subject to the GLBA Privacy Rule and Safeguards Rule, if adopted, the specificity of the proposed cyber controls is likely to factor into contract terms that financial institutions impose on their partners and service providers, as well as serve as a potential model for other industries. If adopted, these would be the most explicit cyber regulations in the United States to date.  At the same time, it is notable that the agency declined to adopt a safe harbor based on a showing of compliance with an industry standard, such as NIST or PCI DSS.  In other words, the proposed changes suggest a potential new minimum standard for enterprise security programs that warrants close consideration.  Given the influential role the Safeguards Rule has played in shaping information security programs outside the financial sector, these new proposed requirements may well become the de facto industry standard if history is a guide.

The deadline to submit written comments will be 60 days after the notice is published in the Federal Register. We will continue to monitor these developments.


The National Institute of Standards and Technology (NIST) released a preview of its plans for a standard Privacy Framework this past week.  The purpose of the Framework is to help organizations better manage privacy risks.

The Privacy Framework would break down privacy functions into five categories: identify the context of processing, protect private data, control data through data management, inform individuals about data processing, and respond to adverse breach events.

Also, organizations would be able to reference the Privacy Framework when deciding how to tailor compliance to the organization’s risk tolerance, privacy objectives, and financial resources.

NIST enters the privacy policy-making arena in a crowded field.  The NTIA has solicited comments on developing an approach to consumer privacy, Congress is considering competing legislative options for federal privacy legislation, and California is gearing up this year for the 2020 implementation of the CCPA.

But as NIST explains on its website, the framework is intended to complement statutory and regulatory rules, not replace them: “the NIST framework is envisioned as an enterprise-level privacy risk management tool that can be compatible with and support organizations’ ability to operate under applicable domestic and international legal or regulatory regimes.”

Throughout the process of developing the Privacy Framework, NIST has emphasized that it will leverage its 2014 Cybersecurity Framework – both as a template and as an example of the value of standards documents.  In February, the agency celebrated the Cybersecurity Framework’s fifth anniversary, touting the fact that the framework has been downloaded more than half a million times.

Kelley Drye will continue to track developments at NIST on the development of a Privacy Framework.  If you have questions about the Privacy Framework or are interested in submitting comments, please contact Alysa Hutnik or Alex Schneider at Kelley Drye.

The current and future definition of what qualifies as an automatic telephone dialing system (ATDS or autodialer) remains a hotly debated issue for every company placing calls and texts or designing dialer technology, as well as for the litigants and jurists already mired in litigation under the Telephone Consumer Protection Act (TCPA).  Last year, the D.C. Circuit struck down the FCC’s ATDS definition in ACA International v. FCC, Case No. 15-1211 (D.C. Cir. 2018).  Courts have since diverged in their approaches to interpreting the ATDS term.  See, e.g., prior discussions of Marks and Dominguez.  All eyes thus remain fixed on the FCC for clarification.

In this post, we revisit the relevant details of the D.C. Circuit’s decision in ACA International, as well as prior statements by FCC Chairman Ajit Pai concerning the ATDS definition, to assess how history may guide the FCC’s approach to this issue.

Continue Reading Taking Stock of the TCPA in 2019: What is an “Autodialer”?

Last week, the California Assembly’s Standing Committee on Privacy and Consumer Protection held a hearing to discuss the California Consumer Privacy Act. While many panelists from the private sector pointed out problems with the law, a few panelists defended the law, and some suggested that it didn’t go far enough. For example, Stacey Schesser, the Supervising Deputy Attorney General for the Privacy Unit in the Consumer Law Section of the Office of the California Attorney General, stated that the current law presents “unworkable obligations and operational challenges” for the AG’s office and suggested several significant changes. This week, California AG Becerra and state Senator Hannah-Beth Jackson announced a bill that would seek to codify the changes Ms. Schesser described.

The bill includes two proposals that could materially affect potential exposure for businesses under the CCPA:

  • Private Right of Action:  The current law allows any consumer whose unencrypted or unredacted personal information is breached “as a result of a violation of the duty to implement and maintain reasonable security procedures and practices” to recover statutory damages of up to $750 per incident. The private right of action is likely to be used in litigation, particularly over what constitutes “reasonable” practices, but at least it is limited to breaches. The new bill, however, would expand the private right of action to cover violations of any other section of the law, as well.
  • Right to Cure:  The current law requires the AG to give businesses notice and 30 days to cure alleged violations before the AG can seek an injunction and civil penalties. This 30-day cure period can provide a warning to businesses that are trying to comply with a confusing law, if their efforts fall short. The proposed bill, however, would remove the right to cure, leaving businesses immediately exposed for any violations.

In addition to these changes, the bill proposes to remove a provision that would allow businesses to seek guidance from the AG on how to comply with the law.

If the bill is enacted into law, these changes would be a boon to plaintiffs’ attorneys and privacy litigators. However, to use Ms. Schesser’s words, the changes would result in even more “unworkable obligations and operational challenges” for businesses. We will continue to closely track these developments, and keep you posted.

The Federal Trade Commission (FTC) announced this week that it would not update its anti-spam rule, completing the agency’s first 10-year review of the regulation.

The FTC last updated the rule, known as the CAN-SPAM Rule, in 2008. The rule requires, among other things, that commercial e-mail messages have a mechanism for allowing the recipient to opt out of future messages.

As part of the FTC’s review process, the FTC sought comments on whether the agency should update the definition of “transaction or relationship messages,” shorten the time period for honoring opt-out requests, or add to the statutory list of aggravated violations.

Ultimately, the Commission chose to keep the 2008 rule. Despite the advent of social media, increasingly sophisticated processes for identifying spam and managing opt-outs, and never-ending threats to a clean inbox, the FTC repeatedly declined to take up commenters’ suggestions for changing the CAN-SPAM Rule, citing unclear cost-benefit analysis outcomes, lack of evidence, and limited Congressional authority.  Here are some examples:

  • On shortening the time period for opt-out requests: “[N]one of these comments provided the Commission with evidence showing how or to what extent the current ten business-day time-period has negatively affected consumers, nor did they address the concerns noted by other commenters that such a change may pose substantial burdens on small businesses.”
  • On commenter suggestions to modify opt-out requirements: “[N]one of the comments provides the Commission with information about the costs and benefits of these proposed rule changes.”
  • On comments asking the FTC to require consumer permission before transferring or selling a consumer’s email address to a third party, and to block all unsolicited spam from servers outside the U.S.: “The Commission also declines to consider the remaining proposed modifications because each would be inconsistent with the Commission’s circumscribed authority under the Act.”

The FTC voted unanimously to confirm the CAN-SPAM Rule. If you have any questions about your obligations pursuant to the CAN-SPAM Rule, please contact Alysa Hutnik or Alex Schneider at Kelley Drye.

Last week, five advertising and marketing trade associations jointly filed comments with the California Attorney General seeking clarification on provisions within the California Consumer Privacy Act (CCPA).

While expressing “strong support” for the CCPA’s intent, and noting the online ad industry’s longstanding consumer privacy efforts like the DAA’s YourAdChoices Program, the group proposed the following three clarifications relating to CCPA provisions that, unless modified, the group believes could reduce consumer choice and privacy:

  • Notice relating to a sale of consumer data: A company’s written assurance of CCPA compliance should satisfy the requirement to provide a consumer with “explicit notice” (under 1798.115(d)) when a company sells a consumer’s personal data that the company did not receive directly from such consumer;
  • Partial opt-out from the sale of consumer data: When responding to a consumer’s request to opt out of the sale of personal data, companies can present consumers with choices on the types of “sales” from which to opt-out, the types of data to be deleted, or whether to opt out completely, rather than simply offering an all or nothing opt-out.
  • No individualized privacy policies: Businesses should not be required to create individualized privacy policies for each consumer to satisfy the requirement that a privacy policy disclose to consumers the specific pieces of personal data the business has collected about them.

The associations signing on to the comments include the Association of National Advertisers, American Advertising Federation, Interactive Advertising Bureau, American Association of Advertising Agencies, and the Network Advertising Initiative. The comments represent an “initial” submission intended to raise the proposals above and, more broadly, highlight to the California AG the importance of the online-ad supported ecosystem and its impact on the economy.  The associations plan to submit more detailed comments in the coming weeks.

The comments coincide with a series of public forums that the California AG is hosting to provide interested parties with an initial opportunity to comment on CCPA requirements and the corresponding regulations that the Attorney General must adopt on or before July 1, 2020.


In the Data Business? You May Be Obligated to Register in Vermont by Thursday

Data brokers have until this Thursday to register with the Vermont Secretary of State as part of a new data broker oversight law that became effective January 1st.

Approved unanimously by the Vermont Senate last May, the Vermont Data Broker Regulation, Act 171 of 2018, requires data brokers to register annually, pay an annual filing fee of $100, and maintain minimum data security standards, but the law does not prevent data brokers from collecting or selling consumer data.

What Qualifies as a “Data Broker”?

The law only applies to “data broker[s],” defined as a “business, or unit or units of a business, separately or together, that knowingly collects and sells or licenses to third parties the brokered personal information of a consumer with whom the business does not have a direct relationship.”

As we noted previously, the California Attorney General is holding a series of public forums on the California Consumer Privacy Act (CCPA) to provide the public with an initial opportunity to comment on CCPA requirements and the corresponding regulations that the Attorney General must adopt on or before July 1, 2020.  On Friday, January 25, 2019, the Attorney General’s Office held its fourth of six hearings before a full auditorium in Los Angeles.  This blog post summarizes the main themes discussed at the hearing.

Timing/Scope:  For businesses hoping for CCPA clarity and guidance soon, that seems unlikely. California Deputy Attorney General Lisa Kim opened the hearing by emphasizing that the Attorney General’s Office was at the beginning of its rulemaking process and noting that she did not anticipate the formal rulemaking process beginning until Fall 2019.  For now, the Attorney General’s Office encouraged interested parties to submit comments by the end of February, focusing on subjects within the scope of the Attorney General’s rulemaking responsibilities as set forth in the CCPA, including:

  • Categories of Personal Information
  • Definition of Unique Identifiers
  • CCPA Exemptions
  • Submitting and Complying with Consumer Requests
  • Uniform Opt-Out Logo/Button
  • Notices and Information to Consumers, including Financial Incentive Offerings
  • Certification of Consumers’ Requests

During the hearing, the Attorney General’s Office displayed a PowerPoint deck summarizing the CCPA regulatory process.

Main Themes

Continue Reading California Privacy Update: What We Heard at Friday’s CCPA Hearing