FTC Chairman Joe Simons recently acknowledged the Commission’s plan to use its authority under Section 6(b) of the FTC Act to examine the data practices of large technology companies.  In written responses to questions from members of the U.S. Senate Commerce Committee following in-person testimony in November 2018, Chairman Simons confirmed that plans were underway to gather information from tech companies, though the specific targets or areas of focus remained under consideration.

As described by the FTC, Section 6(b) of the FTC Act “plays a critical role in protecting consumers,” and broadly authorizes the Commission to obtain information – or “special reports” – about certain aspects of a company’s business or industry sector.  Companies that are the focus of an FTC study pursuant to Section 6(b) must respond to a formal order issued by the Commission that, similar to a civil investigative demand, can include a series of information and document requests.  The information obtained through the order may then be the basis for FTC studies and subsequent industry guidance or rulemaking.

The revelation of the pending 6(b) orders comes amid concerns from federal and state lawmakers and regulators about transparency relating to “Big Data” practices and online data collection, and the use of artificial intelligence and machine-learning algorithms in decision-making.  In remarks this week to attendees of an Association of National Advertisers conference, Chairman Simons noted a potential lack of transparency in the online behavioral advertising context and “the fact that many of the companies at the heart of this ecosystem operate behind the scenes and without much consumer awareness.”


In February 2018, I reported on a 20-state objection brief, filed with the U.S. Supreme Court, asking the Court to reverse the approval of the class action settlement in Gaos v. Google.  That deal would have distributed a few million dollars to nonprofit groups, while the AGs wanted money paid to real people, even if that meant holding a lottery to do it.  Today, although the Supreme Court vacated the settlement approval, it did so on standing grounds and did not address whether a class action can be settled solely through “cy pres” payments to nonprofits.

The Supreme Court cited its recent Spokeo v. Robins decision in which it held that plaintiffs must allege concrete harm, and not just a bare statutory violation, in order to have Article III standing to sue in federal court.  Spokeo was not the Court’s most edifying decision and lower courts have split wildly on what it means in practice.  The Court’s decision today didn’t address that split; it just told the lower courts to analyze the Gaos plaintiffs’ standing in light of Spokeo without opining on the issue one way or the other.

Justice Thomas dissented alone.  He expressed his disagreement with Spokeo, believing that if Congress made conduct illegal, violating that statute suffices to confer standing.  He then said he would have reversed the settlement.  In Justice Thomas’s view, if a settlement provides no benefit to class members, and looks to be solely a means to extinguish a claim, courts should not approve it.

Perhaps the biggest takeaway from today’s decision, therefore, is that eight of the nine Justices think differently from Justice Thomas on this issue.  How differently, only time will tell.

Last week, in Cline v. Touchtunes Music Corp., No. 18-1756,  the Second Circuit Court of Appeals upheld a Manhattan district judge’s decision to approve a low-cost class action settlement in what the judge termed a “nuisance” case, while basically zeroing out the $100,000 fee requested by the plaintiffs’ class counsel.

Defendants who have faced silly but not quite dismissible class actions can momentarily enjoy the schadenfreude of watching a plaintiffs’ law firm come away with nothing for its efforts.  The problem with decisions like Cline, however, is that they may make plaintiffs’ counsel more hesitant to settle a cheap case on cheap terms.  For the class action defense bar, raising potential settlement costs is nothing to celebrate.

The facts of Cline are almost too silly to merit repeating.  The defendant provided a digital jukebox application to (for example) bars and restaurants.  Patrons could pay money to play songs, and the app’s terms told the patrons clearly that their songs weren’t guaranteed to play and that no refunds would be provided under any circumstances.  What the terms didn’t disclose, however, was that the venue’s manager had the ability to manually skip songs.  The plaintiff, ostensibly on behalf of a class of people whose songs were skipped, sued for the lost value—about 40 cents each—of not hearing their songs played.

The defendant, after two tries, couldn’t quite get the whole suit dismissed.  A highly experienced district judge left alive a false advertising claim for the non-disclosure.  At that point, early last year, the parties agreed to a settlement.  About 166,000 patrons whose songs weren’t played, and for whom the defendant had contact information, received a code by email good for one free song play on any of the defendant’s jukeboxes.  Other people could file claims for codes, and 2,200 people did so.

The plaintiffs’ counsel sought a $100,000 fee for themselves and a $2,000 incentive fee for the plaintiff.  The judge approved the settlement but not these fees.  He rejected the incentive award and, in place of the $100,000 fee, which plaintiffs’ counsel contended represented their “lodestar” of hours worked, granted a fee of 20 cents per song code actually redeemed by class members within the one-year expiration period.  That fee is likely to be less than $1,000.  The plaintiffs’ counsel appealed that reduction, but the Second Circuit upheld it in a summary order, finding that the district judge acted within his discretion, especially in a “coupon”-type settlement.

What should not be lost in any analysis of Cline is this key statement in the Second Circuit’s opinion:  “[C]lass counsel’s lodestar fee application was not supported by contemporaneous billing records, and…no substantial explanation had been provided for a $10,000 ‘consulting fee’ for which reimbursement was sought.”  The Second Circuit thus reinforced that plaintiffs’ counsel absolutely can still seek lodestar-based fees even when settling for coupons or in-kind goods, provided that they support those fees with appropriate billing detail.

If plaintiffs’ counsel try to tell you that they don’t want to enter into a coupon or in-kind settlement because Cline makes them fearful of receiving no fee in the case, therefore, remind them that the problem in Cline wasn’t the settlement structure; it was the law firm’s failure to document its fees.  Don’t make that mistake, and Cline shouldn’t be an issue.  Low-value cases like Cline still should be able to settle on low-value terms.


Businesses often include mandatory arbitration clauses in their pre-dispute dealings with customers to prevent costly consumer class actions in favor of streamlined (often individual) arbitration.  The Federal Arbitration Act (“FAA”) makes such arbitration agreements “valid, irrevocable, and enforceable, save upon such grounds as exist at law or in equity for the revocation of any contract.”  Relying on the FAA, the Supreme Court has defended business enforcement of such clauses against state- and judge-made exceptions.  For example, the Supreme Court has held that the FAA preempts state laws that pose obstacles to its enforcement, prevents courts from invalidating an arbitration agreement on the basis of the cost to arbitrate exceeding the potential recovery, and requires courts to enforce contractual provisions that delegate to an arbitrator the determination of whether an arbitration agreement applies to a dispute.  As a result, the existence and enforcement of mandatory, individual arbitration agreements have become more commonplace in consumer-facing industries.

Democratic senators are seeking to change this by introducing a bill that would narrow the FAA by prospectively barring pre-dispute arbitration agreements and class-action waivers in consumer, employment, antitrust, and civil rights disputes.  In these four areas, the proposed legislation, entitled The Forced Arbitration Injustice Repeal Act of 2019 (the “FAIR Act”), S. 635, H.R. 1423, would also override agreements to have arbitrators determine arbitrability or the FAIR Act’s applicability to the dispute, opting instead for courts to determine these issues under federal law.

The FAIR Act likely faces the same Republican opposition that defeated similar proposals, including the versions of the “Arbitration Fairness Act” rejected from 2007 through 2018.  Although the FAIR Act may garner attention in the current political climate, in large part due to its employment-related provisions, it likely faces an uphill battle in the current Republican-controlled Senate and White House. But it’s a bill that’s worth keeping an eye on.  We’ll continue to post updates on any key developments.

The FTC recently announced a $5.7 million settlement with app developer Musical.ly for COPPA violations associated with its app (now known as TikTok)—the agency’s largest COPPA penalty since the statute’s enactment. The agency charged the app company, which allows users to create and share videos of themselves lip-syncing to music, with unlawfully collecting personal information from children.

To create a TikTok profile, users must provide contact information, a short bio, and a profile picture. According to the FTC, between December 2015 and October 2016, the company also collected geolocation information from app users. In 2017, the app started requiring users to provide their age, although it did not require current users to update their accounts with their age. By default, accounts were “public,” allowing users to see each other’s bios (which included their grade or age). It also allowed users to see a list of other users within a 50-mile radius, and gave users the ability to direct message other users. Many of the songs available on the app were popular with children under 13.

The FTC further alleged that Musical.ly received thousands of complaints from parents asserting that their child had created the app account without their knowledge (and noted an example of a two-week period where the company received more than 300 such complaints). The agency also noted that while the company closed the children’s accounts in response, it did not delete the users’ videos or profile information from its servers.

The FTC’s Complaint focused on practices spanning from 2014 through 2017. Musical.ly was acquired by ByteDance Ltd. in December 2017, and merged with the TikTok app in August 2018.

COPPA identifies specific requirements for operators who collect personal information from children under 13, including obtaining consent from parents prior to collection and providing information about collection practices for children’s data. Online services subject to the rule generally fall into two categories: (1) sites that are directed to children and collect personal information from them; and (2) general audience sites that have actual knowledge that they are collecting personal information from children. Civil penalties for violations of COPPA can be up to $41,484 per violation.

According to the FTC, Musical.ly’s app fell into both categories:

  1. The company included music and other content appealing to children on the app. For example, many of the songs included on the app were popular with children under 13, and the app used “colorful and bright emoji characters” that could appeal to children.
  2. Once the company began collecting the ages of its users, Musical.ly had actual knowledge that some of its users were under the age of 13. In spite of this, the company did not obtain consent from the parents of users under the age of 13, or comply with other COPPA requirements.

FTC Commissioners Chopra and Slaughter issued a joint statement on the settlement, pointing out that FTC staff had uncovered disturbing practices of a company willing to pursue growth at the expense of endangering children. They also noted that previously, FTC investigations typically focused on individual accountability in limited circumstances, rather than pursuing broader enforcement against company leaders for widespread company practices. The Commissioners further indicated that as the FTC continues to pursue legal violations going forward, it is time to “prioritize uncovering the role of corporate officers and directors” and to “hold accountable everyone who broke the law.”

This settlement indicates that the FTC continues to prioritize privacy enforcement—particularly where vulnerable audiences, such as children, are involved. Future FTC enforcement actions could signal an expanded approach to individual liability, including with respect to larger companies.

The case is also a good reminder of the value in performing robust privacy due diligence when considering acquiring an entity, and meaningfully assessing the risk of a company’s data practices before adding them to the portfolio. A widely popular business with significant data assets may not look as attractive once civil penalties and injunctive terms are added to the mix.

The Federal Trade Commission (FTC) announced this week that it is seeking comments on proposed amendments to the Privacy Rule and Safeguards Rule under the Gramm-Leach-Bliley Act (GLBA).  These two rules outline obligations for financial institutions to protect the privacy and security of customer data in their control.  While the proposed changes to the Privacy Rule are modest, the expansive list of specific cyber controls proposed for the Safeguards Rule is material and could impose a new de facto minimum security standard that implicates many businesses, including those outside the coverage of the Rule.

Privacy Rule

The Privacy Rule, which went into effect in 2000, requires a financial institution to inform customers about its information-sharing practices and allow customers to opt out of having their information shared with certain third parties. Changes to the Dodd-Frank Act in 2010 transferred the majority of the FTC’s rulemaking authority for the Privacy Rule to the Consumer Financial Protection Bureau.  Only certain motor vehicle dealers are still subject to FTC rulemaking under the Privacy Rule.  To address these changes, the proposed amendments would remove from the Rule examples of financial institutions that are no longer subject to FTC rulemaking authority, and provide clarification to motor vehicle dealers regarding the annual privacy notices.

Safeguards Rule

The Safeguards Rule, which went into effect in 2003, requires financial institutions to develop, implement, and maintain comprehensive information security programs to protect their customers’ personal information. Currently, the Safeguards Rule emphasizes a process-based approach that is flexible in how the program is implemented so long as it meaningfully addresses core components, and where the safeguards address foreseeable internal and external cyber risks to customer information.

The proposed amendments to the Safeguards Rule would still follow a process-based approach but add significantly more specific requirements that must be addressed as part of the company’s information security program. These include, for example:

  • Appointing a Chief Information Security Officer (CISO) (i.e., a qualified individual responsible for overseeing, implementing, and enforcing the information security program). The CISO can be an employee, an affiliate, or a service provider, but if the latter, additional requirements apply;
  • More specificity in what the required information security program’s risk assessments involve;
  • More specificity in what is required as part of a company’s access controls for their information systems;
  • Updating risk assessments and resulting safeguards concerning a company’s data and system identification and mapping;
  • Employing encryption of all customer information stored or transmitted over external networks, or implementing alternative compensating controls that are reviewed and approved by the company’s CISO;
  • Adopting secure development practices for in-house developed applications that handle customer information;
  • Implementing multi-factor authentication for any individual with access to customer information or internal networks that contain customer information (unless the CISO approves a compensating control);
  • Including audit trails that detect and respond to security events;
  • Implementing change management procedures;
  • Implementing safeguards that both monitor authorized activity and detect unauthorized activity involving customer information;
  • Regularly testing the effectiveness of the information security program’s key controls, systems, and procedures, including through continuous monitoring or annual penetration testing and biannual vulnerability assessments;
  • Establishing a written incident response plan that addresses goals, outlines the internal processes for incident response, defines clear roles, responsibilities, and levels of decision-making authority, identifies external and internal communications and information sharing, identifies requirements for the remediation of identified weaknesses in information systems and controls, addresses the documentation and reporting of security events and related incident response activities, and provides for the evaluation and revision of the plan, as needed, post-incident;
  • Requiring the CISO to at least annually report to the board of directors or equivalent governing body on the status of the information security program, the company’s compliance with the Safeguards Rule, and material matters related to the information security program.

The proposed modifications would exempt small businesses (financial institutions that maintain customer information concerning fewer than five thousand consumers) from some of the Safeguards Rule’s requirements.

In addition, the proposed modifications would expand the definition of “financial institution” to include entities engaged in activities that the Federal Reserve Board determines to be incidental to financial activities (e.g., “finders” that bring together buyers and sellers of a product or service), and incorporate the definition of this term directly in the Safeguards Rule, instead of by reference based on the Privacy Rule.

Two Republican-appointed Commissioners, Noah Phillips and Christine Wilson, dissented from the proposed amendments, arguing that it may not be appropriate to mandate such prescriptive standards for all market participants. They maintained that producing guidance for companies would be a better approach than one-size-fits-all amendments that all companies would have to follow. The Commissioners also argued that the proposed amendments are based on the New York State Department of Financial Services cyber regulations, which are too new for the FTC to evaluate for impact or efficacy.  They further expressed concerns about the rigidity that these new requirements would place on what is now a flexible approach, and about whether the amendments would place the Commission in the stead of a company’s governance in deciding the level of board engagement, hiring and training, and accountability design, among other controls.

***

While the proposed amendments are limited to financial institutions subject to the GLBA Privacy Rule and Safeguards Rule, if adopted, the specific cyber controls proposed are likely to factor into contract terms that financial institutions impose on their partners and service providers, and to serve as a potential model for other industries. These would be the most explicit cyber regulations in the United States to date.  At the same time, it is notable that the agency declined to adopt a safe harbor based on a showing of compliance with an industry standard, such as NIST or PCI DSS.  In other words, the proposed changes suggest a potential new minimum standard for enterprise security programs that warrants close consideration.  Given the influential role that the Safeguards Rule played in developing information security programs outside of the financial sector, these new proposed requirements may well become the de facto industry standard if history is a guide.

The deadline to submit written comments will be 60 days after the notice is published in the Federal Register. We will continue to monitor these developments.


The National Institute of Standards and Technology (NIST) released a preview of its plans for a standard Privacy Framework this past week.  The purpose of the Framework is to help organizations better manage privacy risks.

The Privacy Framework would break down privacy functions into five categories: identifying the context of processing, protecting private data, controlling data through data management, informing individuals about data processing, and responding to adverse breach events.

Organizations would also be able to reference the Privacy Framework when deciding how to tailor compliance to their risk tolerance, privacy objectives, and financial resources.

NIST enters the privacy policy-making arena in a crowded field.  The NTIA has solicited comments on developing an approach to consumer privacy, Congress is considering competing legislative options for federal privacy legislation, and California is gearing up this year for the 2020 implementation of the CCPA.

But as NIST explains on its website, the NIST framework is intended to complement statutory and regulatory rules, not replace them: “the NIST framework is envisioned as an enterprise-level privacy risk management tool that can be compatible with and support organizations’ ability to operate under applicable domestic and international legal or regulatory regimes.”

Throughout the process of developing the Privacy Framework, NIST has emphasized that it will leverage its 2014 Cybersecurity Framework – both as a template and as an example of the value of standards documents.  The agency celebrated the five-year anniversary of the Cybersecurity Framework in February, touting the fact that the Framework has been downloaded more than half a million times.

Kelley Drye will continue to track developments at NIST on the development of a Privacy Framework.  If you have questions about the Privacy Framework or are interested in submitting comments, please contact Alysa Hutnik or Alex Schneider at Kelley Drye.

Asserting the authority to oversee the Consumer Product Safety Commission, Frank Pallone, Jr. (D-NJ), Chairman of the Committee on Energy and Commerce, and Jan Schakowsky (D-IL), Chair of the Subcommittee on Consumer Protection and Commerce, have requested information from the Commission concerning the CPSC’s workload and its dealings with the public with regard to consumer complaints and FOIA requests. In a letter to Acting Chairman Ann Marie Buerkle, the Committee has requested information such as:

  • A list of rulemakings, petitions, applications, complaints, requests, and other items pending before the CPSC, including the length of time the matter has been pending and associated staff;
  • The total number of reports of unsafe products received through saferproducts.gov from FY 2016-2019;
  • Information pertaining to the number of investigations opened and closed by the Office of Compliance & Field Operations from FY 2016-2019;
  • Details about involvement in voluntary standards development;
  • A list of all FOIA requests from FY 2016-2019;
  • A list of civil penalties, including lists of internal “referrals” for civil penalties; and
  • A list of all matters from which CPSC leadership or staff has been recused from FY 2016-2019 and the reason for each recusal.

The Committee has requested a complete written response to these questions by March 22, 2019. We expect that an oversight or similar hearing will likely follow the CPSC’s response, and we will continue to monitor developments.

What qualifies as an automatic telephone dialing system (ATDS or autodialer), now and in the future, remains a hotly debated issue for every company placing calls and texts or designing dialer technology, as well as for the litigants and jurists already mired in litigation under the Telephone Consumer Protection Act (TCPA).  Last year, the D.C. Circuit struck down the FCC’s ATDS definition in ACA International v. FCC, Case No. 15-1211 (D.C. Cir. 2018).  Courts since have diverged in their approaches to interpreting the ATDS term.  See, e.g., prior discussions of Marks and Dominguez.  All eyes thus remain fixed on the FCC for clarification.

In this post, we revisit the relevant details of the D.C. Circuit’s decision in ACA International, and prior statements of FCC Chairman Ajit Pai concerning the ATDS definition, to assess how history may be a guide to how the FCC approaches this issue.


The draft National E-Commerce Policy (“Draft Policy”), released by the Government of India on February 23, 2019 for stakeholder comments, has left the e-commerce sector jittery. For global market players, the protectionist construct of the Draft Policy seems to suggest a shift in India’s focus from ‘Ease of Doing Business in India’ to ‘Make in India’. If implemented in its present form, the Draft Policy could have a serious impact, demanding drastic changes in internal strategies, policies, and cost allocations for foreign companies with an e-commerce presence in India. The Draft Policy is open for stakeholder comments through March 9, 2019.

The Draft Policy focuses on: (i) restriction on cross-border flow of data; (ii) local presence and taxability of foreign entities having significant economic presence in India; (iii) creating a robust digital infrastructure for e-commerce, from online custom clearance to online resolution of consumer complaints; (iv) promoting exports from India with a boost to start-ups and small firms; and (v) regulatory changes to augment economic growth in e-commerce.

The key highlights of the Draft Policy are discussed in the full post, Doing Business in India? Keep an Eye on This….