As we mark Data Privacy Day, today is a good time to take stock of where U.S. privacy legislation stands in relation to the developments of the past few years.  In less than two years, the GDPR and the CCPA became the most comprehensive privacy laws in effect, granting individuals extensive rights over their information, creating numerous accountability requirements, and giving authorities the power to impose potentially massive fines. (For more information on the GDPR, see our blog posts, including those here and here.)

The CCPA ignited a debate about whether it rejects or maintains the “American approach” to privacy.  Some observers panned the CCPA for departing from the “American approach” of “largely permissionless innovation with a post hoc regulatory response to concrete [privacy] harms.”  Others criticized the CCPA for generally allowing personal data collection and use unless prohibited by a “specific legal rule.”

This debate is unlikely to be resolved soon or conclusively, but it is clear that at the federal and state levels, U.S. data privacy laws are likely to expand. While some states – including Washington and Virginia – are considering GDPR-influenced comprehensive bills, states also continue to consider and add laws that address specific data practices, which could cause further fragmentation in U.S. privacy laws and additional compliance challenges for companies.

How should companies manage the uncertain path that privacy legislation is following in the U.S.?  Taking a comprehensive, holistic look at an organization’s data practices is often key to complying with current requirements (such as the CCPA) and also is likely to be an effective way to manage disparate state laws as they develop.

Toward Comprehensive State Privacy Laws

The GDPR has had an important impact on global privacy laws. Argentina, Brazil, Malaysia, and Uruguay, among others, have adopted privacy laws modeled after the GDPR. The CCPA includes several GDPR elements, such as the rights to access and to deletion, though significant differences remain. (For a more granular look at how the GDPR and CCPA compare, see our comparison chart here.) In addition, Washington state legislators are currently pushing for a Washington Privacy Act (SB 6281), a bill governing data privacy and facial recognition. The bill explicitly references the GDPR, stating, “The European Union recently updated its privacy law through the passage and implementation of the general data protection regulation, affording its residents the strongest privacy protections in the world. Washington residents deserve to enjoy the same level of robust privacy safeguards” (emphasis added). Virginia similarly is considering privacy legislation that would give consumers the right to access their data and determine if it has been sold to a data broker (HB 473). Virginia’s bill would generally track the GDPR consumer rights, including the rights to access, correction, and erasure, and the right to opt out of further processing.

The Differences: An Example

Other aspects of the “American approach” to privacy are holding fast against the movement toward comprehensive laws. Biometric privacy exemplifies the differing approaches of the EU and U.S. In the EU, biometric data falls under the GDPR as a “special category of personal data,” and companies must not process this data unless they obtain explicit consent or the processing meets other stringent grounds for lawful processing that apply across all EU member states. Also under the GDPR, any unauthorized access to or acquisition of biometric data that constitutes a data breach must be reported to the relevant authority within 72 hours.

In the United States, biometric privacy is a state law issue (for now), complemented by a handful of enforcement orders that address biometric data. Only three states have relevant laws on point – Illinois, Washington, and Texas – and the scope of and requirements under these laws vary considerably. For example, biometric information may trigger data breach notification obligations if it is compromised, but whether the obligations are triggered will vary from state to state. An important distinction also lies in enforcement capabilities – Illinois’s biometric law has a private right of action, whereas the Texas and Washington laws do not. Additionally, laws such as HIPAA and Title VII may provide additional protections in some situations.

Outside of the U.S. and EU, countries are following Europe’s lead. Very few countries have laws specific to biometric data; most instead cover this data under a comprehensive national privacy law, which often contains informed consent requirements and data subject rights. While questions remain about how protective this approach is where biometric data is concerned, very few countries are addressing those questions through sector-specific or territory-specific laws.

The Implications: A Conversation

While the EU approach to privacy seems to be winning globally, U.S. policymakers are not ignoring more targeted requirements that address specific data practices.  However, this piecemeal approach could also cause confusion, complexity, and expense. For example, the CCPA’s “Do Not Sell My Personal Information” requirement could quickly become impractical if states were to adopt different definitions of “selling” personal information.

Members of Congress who are considering a federal privacy bill have the chance to decide how much of the U.S. approach and how much of the EU approach to put into any comprehensive federal law protecting personal information, as well as whether to include preemption and a private right of action. We will be watching closely to see how they decide.

At bottom, this is just the beginning for data privacy laws. Consumer data rights, governance and accountability requirements, and regulatory structures will surely evolve and likely expand. For companies attempting to build some future-proofing into their privacy programs, taking the time to understand their data practices (what types of personal information they collect and maintain, where it is, why and for how long they need it, and whether it is sufficiently protected against compromise) will enable more options for business strategies and allow them to manage enterprise risk more efficiently in response to the changing legal landscape.


The CFPB today announced a policy statement outlining three new principles that it intends to apply when evaluating whether practices are “abusive” under the Dodd-Frank Act.  The Dodd-Frank Act marked the first time that a federal or state regulator was granted broad authority to regulate “abusive” acts and practices.  While Dodd-Frank provided a general standard that must be met for a practice to be considered “abusive,” many stakeholders have argued that the standard does not provide sufficient guidance regarding when a business practice (subject to CFPB jurisdiction) will be considered abusive.

The Policy Statement identifies three new principles that the CFPB intends to apply when evaluating if business practices are abusive:

  • Consideration of consumer harm and countervailing benefits.  The Bureau emphasized that its overarching mission is to prevent consumer harm.  This principle parallels the second prong of the “unfairness” standard codified in the FTC Act in 1994, after the FTC’s 1980 Unfairness Policy Statement sought to rein in the Commission’s use of its unfairness authority.  The similarity in substance and procedure is notable.
  • Avoiding “add-on” abusive allegations.  The Bureau noted that it plans to avoid adding an abusiveness allegation that relies on all or nearly all of the same facts as an unfairness or deception allegation.  Conversely, where the Bureau alleges a standalone abusiveness count, “it intends to plead such claims in a manner designed to demonstrate clearly the nexus between the cited facts and the Bureau’s legal analysis of the claims.”
  • No civil penalties or disgorgement for abusiveness allegations where the entity acts in good faith.  The Bureau indicated that it does not plan to seek civil penalties or disgorgement when it makes a standalone abusiveness allegation if the covered person made a good faith effort to comply with the law based on a reasonable interpretation of the abusiveness standard.  It may, however, still pursue monetary relief in the form of restitution.  At the same time, the Bureau also emphasized “that it is committed to aggressively pursuing the full range of monetary remedies against bad actors who were not acting in good faith in violating the abusiveness standard.”

The Bureau expressly noted that it was leaving open the possibility of engaging in a future rulemaking to further define the abusiveness standard.  Notably, under administrative law principles, a rulemaking would be harder to overturn down the road in the event that a new CFPB elected to chart a different course and adopt a more expansive definition of abusiveness.

Kelley Drye & Warren LLP today announced the launch of a microsite dedicated to the legal issues regarding advertising, privacy and data security, and consumer product safety. The Advertising and Privacy Law Resource Center, available via www.KelleyDrye.com, provides practical, relevant information to help in-house counsel answer the questions and solve the problems that they face on a daily basis.

“The Resource Center is an online repository of our thought leadership and resources on subjects that affect our clients day-to-day,” said Christie Thompson, chair of the Advertising and Marketing practice. “Like the Ad Law Access Podcast we launched last year, our goal is to provide high-level, insightful analysis of the major issues in consumer protection law as they develop and deliver them in an easily consumable format for our clients.”

The site is organized around three key legal topics: Advertising and Marketing Standards; Privacy and Data Security; and Consumer Product Safety. Each section includes curated content on specific areas within each topic.

In conjunction with the launch, Kelley Drye is holding a webinar on January 28 covering the basics of advertising law. Anyone who is new to these areas or in need of a refresher should join us for this online-only event. The webinar will have something for everyone, including attorneys, paralegals, compliance personnel, marketers, researchers, sales representatives, and executives. Information is available here.

In addition, Kelley Drye’s nationally recognized Advertising and Privacy practice groups offer a variety of online products including:

  • Ad Law Access: A regularly updated blog providing visitors with updates on advertising and privacy law trends, issues and developments.
  • Ad Law Access Podcast: Discussions with our team of advertising and privacy lawyers on the latest developments in the world of advertising law, privacy law and consumer protection.
  • Ad Law News and Views:  A newsletter delivered to inboxes every two weeks to help readers stay current on ad law and privacy matters.

On January 1, 2020, the Artificial Intelligence Video Interview Act went into effect in Illinois.  It is the first state law regulating the use of AI in job interviews.

Illinois’ law reflects increasing scrutiny in the United States and globally of biometrics practices. The law is consistent with U.S. policymakers’ focus on addressing significant concrete harm (e.g., employment decisions) in connection with the use of AI and data analytics.

Here is what you need to know if you use AI to interview applicants for positions based in Illinois:

  • The law applies to videos recorded by employers of interviews with applicants.
  • It requires employers to:
    • Notify applicants in writing before the interview that AI may be used to analyze the applicants’ facial expressions and consider their fitness for a position;
    • Provide applicants with information explaining how the AI works and what general types of characteristics it uses to evaluate them;
    • Obtain applicants’ consent to use the AI program, as described in the notice, before the interview.
  • The law is silent on enforcement, remedies, and penalties for violations.

The law does not define “artificial intelligence,” nor does it provide specific guidance about what the employer’s explanation of AI used in connection with video interviews should contain.

The law also provides applicants with certain rights in connection with video recordings of their interviews:

  • The law prohibits employers from sharing the videos except with recipients whose expertise or technology is necessary to evaluate an applicant’s fitness for a position.
  • Applicants may request destruction of their video interviews.  Employers must delete videos within 30 days of such requests and instruct others who received the videos to delete them.

The use of AI in connection with hiring and employment has been under broader scrutiny.  For example, in November 2019, the Electronic Privacy Information Center (“EPIC”) submitted a complaint to the Federal Trade Commission (“FTC”) against HireVue (a company that provides AI-based interviewing assessment technology to companies).  EPIC alleged that HireVue violated the FTC Act by denying that it uses facial recognition technology.  EPIC further alleged that HireVue’s use of facial recognition technology, biometric data, and AI systems is unfair under the FTC Act, unethical, and violates OECD principles on AI.  The FTC has not yet taken action on the complaint.

Even if you are not hiring in Illinois, if you use AI to make hiring decisions, take a comprehensive and practical view of the technology and how you use it.

  • State biometric data laws and anti-discrimination laws likely apply to the use of this technology during the hiring process.
  • Be informed about the data these AI technologies use to make hiring decisions on your behalf.
  • Establish policies to address compliance with privacy and other relevant laws.

Illinois is the first state, but may not be the last, to enact laws regulating AI in the employment context.  We will be monitoring and updating you with any new developments.

When Casper launched in 2014, it set out to disrupt the mattress industry. Not only did the company change the way mattresses were sold, it changed the way mattresses were advertised. Among other things, Casper hired celebrities like Kylie Jenner to post pictures of its mattresses on Instagram and other influencers to post “unboxing” videos in which they pulled mattresses out of their delivery boxes and tried them out.

Despite Casper’s success using influencers, it apparently sees risks in its marketing strategy. When the company filed for an IPO last week, it warned investors that the “[u]se of social media and influencers may materially and adversely affect our reputation. Influencers with whom we maintain relationships could also engage in behaviors or use their platforms to communicate directly with our customers in a manner that reflects poorly on our brand and may be attributed to us or otherwise adversely affect us.”

We’ve written extensively about some of these risks in previous posts. So far, most of the formal legal actions have involved influencers that failed to disclose their connections to the companies whose products they promote, but that’s only one of the risks companies need to think about. Companies can also be held responsible if influencers make misleading claims about their products. And they can suffer reputational harm if the influencers engage in inappropriate behavior.

If you use influencers to promote your products, you should take steps to guard against these risks. Among other things, you should vet potential influencers before you hire them and ensure you have an agreement that is tailored to the campaign. You should also ensure that you monitor your influencers and that you have systems in place to detect and address problems before they get worse. If you want someone to take a fresh look at how you manage your campaigns, give us a call.

IN FASHION 2020: Kelley Drye’s 6th Annual Fashion and Retail Law Summit
On January 16, 2020, Kelley Drye will host the sixth annual IN FASHION: Fashion and Retail Law Summit for executives and in-house counsel. Kelley Drye lawyers and thought leaders from some of the world’s top fashion and retail companies will convene for a full day of presentations on hot button issues that impact the business.

This complimentary event is by invitation only. This year’s seminars will feature sessions on advertising, customs and trade, employment, intellectual property, government relations, litigation, and privacy and data protection. If you or a colleague are interested in receiving an invitation, please contact infashion@kelleydrye.com.

10 Things You Need to Know to Protect Your Cannabis Brand Webinar
As cannabis prohibition comes to an end, marketers and manufacturers are grappling with how to create and protect their brands. Join partners Kristi Wolff and Mike Zinna on January 23 as they address 10 key brand protection elements relating to advertising, labeling, trademark and patent protection that are essential for building and maintaining a brand based on best practices. This presentation is appropriate for attorneys, marketers, regulatory, quality and executive-level personnel. Register here.

Advertising 101 Webinar
Please join us on January 28, 2020 for a webinar covering the basics of advertising law. Anyone who needs a refresher or is new to these areas should join us for this online-only event. The webinar will have something for everyone, including attorneys, paralegals, compliance personnel, marketers, researchers, sales representatives, and executives. Register for this webinar here.

Can’t Make Any of These Events?  The Ad Law Access blog and Ad Law Access podcast are available 24/7 and provide updates on consumer protection trends, issues, and developments from the advertising and marketing practice of Kelley Drye. Our Ad Law News and Views newsletter is delivered to inboxes every two weeks to help readers stay current on ad law and privacy matters.

For more in-depth information, our soon-to-launch Advertising and Privacy Law Resource Center is already available to friends of the firm. This “microsite” is dedicated to the legal issues regarding advertising, privacy and data security, and consumer product safety and provides practical, relevant information for in-house counsel.

On December 30, 2019, the United States Court of Appeals for the Ninth Circuit issued an opinion in Becerra v. Dr Pepper/Seven Up, Inc., No. 18-16721 (9th Cir.) that may be the final nail in the coffin of a series of cases filed against diet soda manufacturers in recent years.  The Ninth Circuit affirmed the Northern District of California’s dismissal of claims that Dr Pepper/Seven Up had misled consumers into believing that drinking their diet soda products would “assist in weight loss or healthy weight management.”

The claims in Becerra were based on several studies finding that ingesting zero-calorie sweeteners, such as those used in diet sodas, may lead to greater calorie consumption and weight gain.  False advertising claims based on these studies have also been recently rejected by the Second Circuit, see Geffner v. Coca-Cola Co. (2d Cir. 2019), Manuel v. Pepsi-Cola Co. (2d Cir. 2019), and Excevarria v. Dr Pepper Snapple Group, Inc. (2d Cir. 2019), and the Ninth Circuit followed suit.  Looking to dictionary definitions, the Court found that the term “diet,” when used as an adjective on the packaging for soda and other beverages, is commonly understood to mean “reduced in or free from calories” and that no reasonable consumer would understand it to mean anything else.  Because the operative complaint alleged, at most, that some consumers “may unreasonably interpret the term differently,” the Court held that the district court correctly dismissed the plaintiff’s false advertising claims as a matter of law.

Together, these decisions likely spell the end of the road for false advertising claims based on the use of the term “diet” to describe soft drinks.  They also may have broader implications for false advertising claims in general.  The question of whether a reasonable consumer would be deceived by a marketing claim is typically considered a question of fact appropriate for summary judgment or trial, but these decisions suggest that some of the most consumer-friendly courts in the country are willing to resolve this question at the pleadings stage when the plaintiff’s alleged interpretation of the claim is anything but reasonable.

The California Attorney General unveiled its data broker registry on Monday.  On or before January 31st, companies qualifying as a “data broker” based on the prior year’s activities are required to register their name and contact information with the Attorney General and may provide a statement concerning their data collection practices.  A list of “data brokers” will be published for public inspection.

California law defines a business as a “data broker” when it “knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship.”  That definition is considerably broader than the definition of a “data broker” under Vermont’s now year-old registration requirement.

After registering for an account, a user on the Attorney General’s website can submit a registration of a data broker.  The registration page includes the following fields:

  • Data broker name
  • Email address
  • Website URL
  • Country
  • Address, City, State & Zip Code
  • A description of how a consumer may opt out of sale or submit requests under the CCPA.
  • A description of how a protected individual can demand deletion of information posted online under Gov. Code 6208.1(b) or 6254.21(c)(1).  These code sections relate to legal protections for government officials and victims of domestic violence, sexual assault, and stalking who would like their personal contact information removed from being posted publicly on the internet.
  • Additional information about data collecting practices.

In an emergency regulation approved on December 18, 2019, the California Department of Justice set the initial registration fee at $360.  The fee is based on an assumption that 1,000 data brokers will register, splitting the $360,972 estimated cost of setting up the registration website almost evenly among them.  By comparison, only 165 data brokers are listed in the Vermont data broker registry.

If you have any questions about whether your business qualifies as a data broker under California law, please contact Alysa Hutnik or Alex Schneider at Kelley Drye.

At the end of 2019, Governor Andrew M. Cuomo released the 10th proposal of his 2020 State of the State Agenda, which aims to eliminate the so-called “pink tax,” a gender-based pricing phenomenon that allegedly results in higher prices for goods and services marketed towards women as compared to substantially similar alternatives marketed towards men.

The proposal was prompted by a study conducted by the New York City Department of Consumer Affairs, which analyzed prices of toys, clothing, personal care products, and home health products and concluded that products marketed towards women cost more 42 percent of the time and are, on average, 7 percent more expensive than those targeted towards men.  Personal care products fared even worse in the study results, which reflected a 13 percent price difference for products marketed towards women.

Governor Cuomo’s proposal is another step in a series of actions taken to reduce the gender wage gap.  In 2016, he signed legislation prohibiting a tax on menstrual products, making New York one of the first states to ban the “tampon tax,” and in 2019, he signed legislation mandating equal pay for substantially similar work.

New York is not the first (or the only) state to engage on this issue.  For example, California law prohibits businesses from gender-based pricing discrimination for services such as haircuts, alterations, and dry cleaning.  A bill that sought to extend that law to pricing of goods was withdrawn after opposition from retail and manufacturing companies.

There are, of course, various gender-neutral considerations that factor into pricing decisions, such as cost of materials, cost of manufacturing, packaging, tariffs, and advertising expenses.  Moreover, as the NYC study acknowledged, men’s and women’s products are rarely identical, making exact comparisons difficult and, in many instances, misleading.  But it remains to be seen how (and if) these differences will be factored into the scope of the bill.

The plaintiffs’ bar has already identified the “pink tax” as a theory of liability under state consumer protection laws.  For example, in Missouri, putative class actions have been filed against dry cleaning services and various consumer product manufacturers and retailers alleging gender-based pricing discrimination.  There has also been litigation in multiple jurisdictions claiming that the “tampon tax” violates the equal protection clauses of the U.S. and state constitutions.  With growing media coverage of the pink tax generally, as well as Governor Cuomo’s recent attention to the issue, we expect to see more class actions being filed in this space.

The January 1, 2020 effective date of the California Consumer Privacy Act (CCPA) has come and gone, but questions about how to comply with the law show no hint of disappearing.  As companies move past their efforts to comply with the law’s most visible requirement – providing notice at the point of collection and explaining data practices in a full privacy policy – the focus is sharpening on a broad array of operational and implementation questions.

While Attorney General Xavier Becerra has indicated his office will prioritize enforcement relating to the sale of minors’ personal information, will direct enforcement efforts at companies that are not showing a willingness to comply, and will not make major changes before finalizing the proposed regulations, the Attorney General has not fielded specific questions about how to implement the law.  This state of affairs has left companies scrambling to benchmark their compliance practices against competitors and the industry at large.

In this post, we provide some insights on common questions we are hearing about how to comply with the CCPA in the absence of clear guidance or precedent.  Of course, every company is different and companies should always consult with a privacy attorney before deciding on the best way to comply with the CCPA.

Why are so many companies posting a “Do Not Sell My Info” (DNSMI) button on their website if they do not sell personal information in exchange for money?

Companies that post a DNSMI button but do not sell personal information for money likely have determined that their provision of personal information to ad tech companies in connection with interest-based advertising is a “sale.”  Accordingly, they post the DNSMI button to enable consumers to opt out of these “sales.”

The question of whether, and under what circumstances, the use of third-party cookies, pixels, tags, etc. constitutes a “sale” and how to provide DNSMI choices is a flashpoint in the debate over how to interpret the CCPA (as discussed here, here, and here).  There is a growing consensus that only a lawsuit or a government enforcement action will resolve this matter.

For now, two ways of analyzing this question are emerging.  One position concludes that data collected via a third-party cookie, tag, or pixel may constitute a “sale” because the company adding that cookie, tag, or pixel to its website sends, makes available, or otherwise shares personal information with an ad tech provider in exchange for services, and, critically, that provider does not restrict its use or sharing of the personal information for its own or other entities’ commercial benefit (other than for a limited number of exempted purposes).

The other position is that the third party directly collects personal information via the cookie, tag, or pixel placed on a publisher’s website, and the publisher is not selling that personal information to the third party responsible for the tracker.

Each business, however, will need to evaluate, on a case-by-case basis, whether its interest-based advertising, analytics, and other forms of tracking may constitute a sale under the CCPA.  Often this analysis starts with categorizing the types of vendors and partners (e.g., ad tech, analytics, or other services); identifying each specific vendor or partner responsible for a tracker on the business’s site(s); and reviewing the vendor’s publicly posted terms, privacy policy, and contract with the business, if there is one, to determine whether the transfer of personal information to the vendor could reasonably qualify as a transfer for a business purpose to a service provider (or fit another exemption) or whether the transfer is likely a “sale.”

When can a business claim that its ad tech partner and purchased ad tech services are exempt from the “sale” provisions of the CCPA?

The CCPA provides an exemption from the definition of a “sale” when a business uses or shares with a “service provider” personal information of a consumer that is necessary and proportionate to perform a “business purpose.”  As a result, companies may want to determine (1) whether an ad tech vendor is a “service provider” and (2) whether that vendor performs its ad tech service for a “business purpose.”  Examining the specific arrangement with each advertising partner, for each of the relevant services the partner provides, is the best way to address this question.

Some of the major players in online advertising have laid down public markers that can be helpful in classifying interest-based advertising activities.