Data is helping governments, researchers, and companies across the world track the spread of the novel coronavirus, monitor cases and outcomes of COVID-19, and devise ways to halt the virus’s spread.  As part of these efforts, raw data, software tools, data visualizations, and other resources are providing the public and policymakers with insights into the growth of the pandemic.

Personal information — some of which may be highly sensitive — is key to many of these efforts.  Although some regulators in the U.S. and abroad have made it clear that privacy laws and the exercise of enforcement discretion provide leeway to process personal information in connection with COVID-19, they have also emphasized that privacy laws continue to apply.  Federal Trade Commission (FTC) Chairman Joe Simons advises that the FTC will take companies’ “good faith efforts” to provide needed goods and services into account in its enforcement decisions but will not tolerate “deceiving consumers, using tactics that violate well-established consumer protections, or taking unfair advantage of these uniquely challenging times.”  And, with many eyes on the California Attorney General’s Office in light of recent requests to delay enforcement of the California Consumer Privacy Act (CCPA), an advisor to Attorney General Xavier Becerra was quoted as stating: “We’re all mindful of the new reality created by COVID-19 and the heightened value of protecting consumers’ privacy online that comes with it. We encourage businesses to be particularly mindful of data security in this time of emergency.”

Devoting some thought to privacy issues at the front end of COVID-19 projects will help to provide appropriate protections for individuals and address complications that could arise further down the road.  This post identifies some of the key privacy considerations for contributors to and users of COVID-19 resources.

1. Is Personal Information Involved?

Definitions of “personal information” and “personal data” under privacy laws such as the CCPA and the EU’s General Data Protection Regulation (GDPR) are broad.  Under the CCPA, for example, any information that is “reasonably capable of being associated with, or could reasonably be linked” with an individual, device, or household is “personal information.”  This definition specifically includes “geolocation data.”  Although some data sources provide COVID-19-related information at coarse levels of granularity, e.g., the county, state, or national level, the breadth of “personal information” under the CCPA, GDPR, and other privacy laws makes it worth taking a close look at geographic and other types of information to determine whether the data at issue in fact qualifies as “personal information,” or whether it is sufficiently anonymized to meet statutory definitions of de-identified and/or aggregate data.  The CCPA, HIPAA, and other privacy laws offer examples of the safeguards expected before data may reasonably be treated as anonymized, and employing those standards can help well-intentioned efforts avoid unnecessary privacy mishaps.
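To make the de-identification point concrete, the sketch below shows one common safeguard: coarsening individual-level reports into county-level counts and suppressing small cells that could single out individuals.  It is a minimal illustration only; the field names and the suppression threshold are assumptions, not requirements drawn from the CCPA, HIPAA, or any other statute.

```python
from collections import Counter

# Hypothetical record structure; field names are illustrative.
reports = [
    {"county": "Kings", "state": "NY", "test_result": "positive"},
    {"county": "Kings", "state": "NY", "test_result": "positive"},
    {"county": "Essex", "state": "NJ", "test_result": "positive"},
]

# Real-world suppression thresholds are typically much larger; 2 is used
# here only so the toy data shows both outcomes.
SUPPRESSION_THRESHOLD = 2

def aggregate_by_county(records):
    """Coarsen individual reports into county-level counts, suppressing
    cells below the threshold (None marks a suppressed cell)."""
    counts = Counter((r["county"], r["state"]) for r in records)
    return {key: (n if n >= SUPPRESSION_THRESHOLD else None)
            for key, n in counts.items()}

print(aggregate_by_county(reports))
# {('Kings', 'NY'): 2, ('Essex', 'NJ'): None}
```

Aggregation alone does not guarantee that data falls outside a statute’s definition of personal information, but documenting safeguards like these makes the anonymization analysis easier to defend.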

2. What Level(s) of Transparency Are Appropriate About the Data Practices?

Although some COVID-19 tools may be exempt from statutory requirements to publish a privacy policy (e.g., the provider of the tool is not a “business” under the CCPA), there are still reasons for providers to explain what data they collect and how they plan to use and disclose the data:

  • Disclosures help individuals to reach informed decisions about whether they want to provide their data, e.g., by downloading an app and allowing it to collect their location and other information. If business practices and consumer expectations are not reasonably aligned around the data practices, the failure to provide an appropriate privacy notice could be deemed an unfair or deceptive practice, inviting the scrutiny of the FTC or State Attorneys General.
  • Developing a privacy policy (or other disclosure) can help provide internal clarity on what types of personal information an app or service actually needs and collects. A granular understanding of those practices can help providers identify and mitigate the associated privacy and data security risks.
  • Developing a disclosure about a provider’s data collection and usage can help clarify the decision-making structure among multiple stakeholders so that the group is better equipped to handle data governance decisions over the lifecycle of a project.

3. How to Address Government Requests/Demands for Personal Information?

Although much remains to be seen in how federal, state, and local governments will use personal information (if at all) to develop and implement strategies to slow the spread of coronavirus, it is not unreasonable to expect that government agencies will seek information from providers of COVID-19-related tools.  The extent to which a provider can voluntarily provide information to the government — as well as the procedures the government must follow to compel production (and to maintain the confidentiality of information in personally identifiable form) — depends on several factors, including what kind of information is at issue and how it was collected.  Becoming familiar, before a request arrives, with the rules that apply to voluntary and compelled disclosures, and with safeguards that help keep such data from becoming subject to broad freedom of information laws, can save valuable time down the road.  In many of these scenarios, for example, aggregate or pseudonymous data may be sufficient.
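Where pseudonymous data will suffice, a keyed hash is one common technique for replacing direct identifiers while preserving the ability to link records within a dataset.  The sketch below is illustrative, with hypothetical field names; keep in mind that pseudonymized data can still qualify as personal information under laws like the GDPR because it remains linkable to an individual.

```python
import hashlib
import hmac
import secrets

# The key must be generated once, stored separately from the shared
# dataset, and protected; anyone holding it can re-link pseudonyms.
PSEUDONYM_KEY = secrets.token_bytes(32)

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed hash (HMAC-SHA256)."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Hypothetical record prepared for a government request.
record = {
    "subject_id": pseudonymize("jane.doe@example.com"),
    "zip3": "070",               # coarsened geography
    "symptom_onset": "2020-03-28",
}
print(record)
```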

4. What Considerations Are There for Licensing COVID-19-Related Personal Information?

Finally, any licensing of personal information in connection with COVID-19 tools deserves careful consideration, particularly if the CCPA applies.  The CCPA imposes notice and opt-out requirements on entities that “sell” personal information. “Sell” is defined to include disseminating, disclosing, or otherwise “making available” personal information to for-profit third parties in exchange for “monetary or other valuable consideration.”  Several types of open source licenses require users to accept certain restrictions on their use and/or redistribution of licensed data or software.  For example, the Creative Commons Attribution-NonCommercial 4.0 International license requires licensees to agree (among other conditions) not to use licensed content for commercial purposes.  Obtaining this promise in exchange for personal information could constitute “valuable consideration” and give rise to a “sale” under the CCPA.  In addition, while not a “sale,” sharing personal information with a government authority would still qualify as a disclosure under the CCPA and would need to be accurately described in the privacy policy.

Neither the California Attorney General nor the courts have interpreted the CCPA in the context of open source licenses.  Until more authoritative guidance becomes available, it makes sense to think through the potential obligations and other consequences of applying and accepting specific license terms to COVID-19-related personal information.

Bottom line:  Personal information has a key role to play in shaping responses to the novel coronavirus.  Privacy laws remain applicable to this information.  Applying privacy considerations to COVID-19-related practices involving data collection, sharing, and analysis will help mitigate unnecessary harms to consumers beyond those presented by the virus itself.

For other helpful information during this pandemic, visit our COVID-19 Resource Center.

California is not the only state focused on privacy.  The New Jersey Attorney General’s Office recently emphasized that it is prioritizing privacy enforcement.  Over its first year, the newly created Data Privacy & Cybersecurity Section within the New Jersey Division of Law has initiated its own actions and joined several multi-state investigations.  Privacy also plays a prominent role in private actions and draft legislation in the Garden State.  Companies marketing or selling to New Jersey consumers or otherwise operating in the state should take steps to confirm their privacy compliance.

Reported Data Breaches

According to statistics released by the New Jersey Attorney General and Division of Consumer Affairs on October 31, 2019, there were 906 separate data breaches reported to the New Jersey State Police in 2018, compared to 958 breaches in 2017.  The number of individual residents impacted declined significantly from 2017 to 2018.  While over 4 million residents were impacted by 2017 breaches, that number fell to approximately 358,000 in 2018.  The 2018 total, however, is still nearly three times the 116,000 residents impacted in 2016.

State Enforcement Actions

In response to these breach figures, New Jersey has actively enforced against lax privacy practices.  Through the first three quarters of 2019, the Attorney General reported $6.4 million in recoveries.  Additionally, New Jersey has served a leading role in several large-scale, multi-state recoveries for consumers over the last nine months.  For example:

  • New Jersey was part of the Leadership Committee driving the investigation and resolution of claims arising from the 2017 data breach at credit reporting agency Equifax, which will result in payment of $575 to $700 million ($6.36 million to NJ) as part of a global resolution of claims by the FTC, 50 U.S. states and territories, and individual consumers.
  • New Jersey was also one of 30 states to resolve data breach and consumer privacy claims against health insurer Premera Blue Cross Blue Shield.  Premera’s network had exposed the Social Security numbers and sensitive health information of 10.4 million consumers, including approximately 40,000 NJ residents.  That settlement includes $10 million to the states (including $72,168 to NJ) as well as a $32 million fund for consumers and $42 million in required cybersecurity upgrades at Premera.
  • New Jersey was also part of the multi-state resolution of claims against retailer Neiman Marcus in response to a breach involving shoppers’ credit card numbers and other personal information.  NJ received $57,465 as part of a $1.5 million settlement; the breach impacted approximately 17,000 individuals with NJ addresses.

Private Consumer Actions

The millions of New Jersey residents impacted by data breaches and cybersecurity threats over the last several years have served as a large pool of potential private litigants.  The New Jersey courts remain an active destination for putative consumer class actions arising from data security and privacy issues.  In addition to recovery of losses, New Jersey’s Consumer Fraud Act includes provisions that can allow treble damages as well as awards of all costs and attorney fees.  Such provisions make privacy and data breach issues a ripe target for private consumer claims.

Similarly, the District of New Jersey has handled a number of complex privacy matters, including the recently formed Multi-District Litigation arising from a data breach at American Medical Collection Agency Inc. that implicates the data of approximately 20 million patients of Quest Diagnostics and LabCorp.

Legislative Focus on Privacy

Following the national trend, New Jersey’s lawmakers have shown a consistent interest in increased regulation of data privacy and cybersecurity.  There are at least 18 separate bills currently pending in the Legislature that address privacy and cybersecurity.  That includes both Senate and Assembly legislation that would require development and implementation of a “comprehensive information security program” by businesses that handle personal information.  In May, Governor Murphy signed a bill expanding the definition of personal information to include online account information as part of the State’s data breach notification law.

With the increased public awareness of comprehensive privacy and cyber legislation garnered by the EU’s GDPR and California’s CCPA, businesses should be prepared for other states to follow suit.  Given the state’s history as a leader on consumer-focused legislation, companies can expect New Jersey legislators to seriously consider additional privacy legislation.

New Jersey is only one example of how consumer privacy issues are being addressed at the state level.  Harmonizing business practices across state lines may prove challenging as these new laws regulating data practices are enacted.  For now, as a best practice, it’s helpful to:

  • Take steps to keep privacy and cybersecurity practices, policies, and procedures in line with each state where your customers reside;
  • Determine if your compliance program takes into account and reasonably addresses foreseeable risks to the personal information in your control, and whether this risk analysis is documented so you can point to it in a future lawsuit or government investigation;
  • Evaluate whether the business has invested adequately in privacy and cybersecurity, and in insurance coverage that takes into account how the business, applicable laws, and potential exposure are evolving; and
  • Consult with experienced practitioners in this area who can help guide and counsel your business on practical updates to your compliance program, mindful of the changing legal landscape.

A new bill introduced in the Senate Health, Education, Labor, and Pensions (HELP) Committee would impose federal regulatory obligations on health technology businesses that collect sensitive health information from their service users and customers.

The Protecting Personal Health Data Act, S.1842, introduced by Senators Amy Klobuchar (D-Minn.) and Lisa Murkowski (R-Alaska), seeks to close a growing divide between data covered by the Health Insurance Portability and Accountability Act (HIPAA) and non-covered, sensitive personal health data.

More specifically, the bill would regulate consumer devices, services, applications, and software marketed to consumers that collect or use personal health data. This would include genetic testing services, fitness trackers, and social media sites where consumers share health conditions and experiences. Often, these technologies and services operate independently of traditional HIPAA-covered healthcare operations involving hospitals, healthcare providers, and insurance companies.

The bill directs the U.S. Department of Health and Human Services (HHS) to promulgate rules that would strengthen the privacy and security of such personal health data. The bill contemplates that the new rule would:

  • Set appropriate uniform standards for consent related to handling of genetic data, biometric data, and personal health data;
  • Include exceptions for law enforcement, research, determining paternity, or emergency medical treatment;
  • Set minimum security standards appropriate to the sensitivity of personal health data;
  • Set limits on the use of the personal health data;
  • Provide consumers with greater control over use of personal health data for marketing purposes; and
  • Create rights to data portability, access, deletion, and opt-outs.

Inevitably, the success or failure of the legislation will be tied to federal baseline privacy legislation already pending in Congress. Those efforts are ongoing, but have lost momentum in recent months as focus turns to California’s new privacy law taking effect on January 1, 2020.

In June of this year, California passed the California Consumer Privacy Act (CCPA), giving California residents specific rights related to their online privacy, similar to those prescribed by the GDPR. The law was passed hastily to avoid a stricter ballot measure on the subject, but Governor Brown recently signed a bill amending the law.

Many of the amendments clarify some of the CCPA’s “technical” errors, such as confirming that the Act should not be enforced in a way that contradicts the California Constitution. The most significant change, however, deals with enforcement of the Act. Although Section 1798.198 makes the Act operative on January 1, 2020, the newly added Section 1798.185(c) prevents the Attorney General from bringing an enforcement action under the Act until July 1, 2020, or six months after the final regulations issued pursuant to the Act are published, whichever is sooner. Thus, although the Act takes effect in January 2020, the California Attorney General may be unable to bring enforcement actions until up to six months later, depending on when the office promulgates regulations. The amendments also extend the date by which the Attorney General must promulgate regulations from January 1, 2020 to July 1, 2020.
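As a toy illustration of the “whichever is sooner” mechanics, the sketch below computes the earliest possible enforcement date from a hypothetical publication date for the final regulations.  The date used is invented, and the month arithmetic is deliberately naive (it ignores month-end edge cases).

```python
from datetime import date

def earliest_enforcement(final_regs_published: date) -> date:
    """Sooner of July 1, 2020 or six months after final regulations
    are published (naive month arithmetic, for illustration only)."""
    month = final_regs_published.month + 6
    six_months_after = date(final_regs_published.year + (month - 1) // 12,
                            (month - 1) % 12 + 1,
                            final_regs_published.day)
    return min(date(2020, 7, 1), six_months_after)

# Hypothetical: final regulations published November 15, 2019.
print(earliest_enforcement(date(2019, 11, 15)))  # 2020-05-15
```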

Another point worth noting is that the amendments remove the requirement for a private plaintiff to inform the Attorney General of a claim he or she has brought to enforce his or her private cause of action under the Act. This eliminates the ability of the Attorney General to bring its own action in lieu of a private one.

Additional changes include specifying additional laws to which the Act does not apply, including: (1) the Confidentiality of Medical Information Act or regulations promulgated pursuant to HIPAA or the Health Information Technology for Economic and Clinical Health Act; (2) the Federal Policy for Protection of Human Subjects; and (3) the California Financial Information Privacy Act. The amendments also limit the civil penalty to $2,500 per violation, or $7,500 for each intentional violation.

Although this bill has clarified some issues with the original law, this will likely not be the last set of amendments to the CCPA before it goes into effect. We will keep you posted.


Earlier this month, the Massachusetts Attorney General announced that her office had reached a settlement with a digital advertising company, Copley Advertising, Inc. (Copley), prohibiting the company from using mobile geofencing technology to target women at or near Massachusetts healthcare facilities based on inferences about an individual’s health status, medical condition, or medical treatment.

Geofencing technology, as the name implies, takes account of a mobile user’s geolocation and enables advertising companies to tag smartphones within a geographic virtual fence and push targeted messages to consumers. Mobile advertisers can place targeted ads within the apps and browsers of these tagged consumer smartphones when users are in the virtual fence and, in some cases, for up to a month after the user has left the virtual fence.
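Mechanically, a circular geofence reduces to a distance check against reported device coordinates.  The sketch below is a simplified illustration (real ad-tech platforms use polygon fences, dwell times, and device identifiers); the coordinates and radius are invented.

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def inside_geofence(device_lat, device_lon,
                    fence_lat, fence_lon, radius_m):
    """True if a reported device location falls within a circular fence."""
    return haversine_m(device_lat, device_lon,
                       fence_lat, fence_lon) <= radius_m

# Hypothetical 100 m fence around an arbitrary downtown point.
print(inside_geofence(42.3601, -71.0589, 42.3605, -71.0585, 100.0))  # True
```

A device “tagged” this way can be added to an audience segment and served ads long after it leaves the fence, which is what raises the privacy stakes beyond a one-time location check.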

In the advertising campaign at issue, Copley set mobile geofences at or near healthcare facilities to target “abortion-minded women” sitting in waiting rooms at health clinics in a number of cities around the country.  The potentially unwanted ads included prompts such as “Pregnancy Help,” “You Have Choices,” and “You’re Not Alone,” that, when clicked, took the consumer to a webpage with abortion alternatives. According to Copley’s representations, the advertising company had not yet engaged in geofencing campaigns near Massachusetts clinics.

The Assurance of Discontinuance resolves the Massachusetts Attorney General Office’s allegations that Copley’s advertising practices would violate consumer protection laws by:

  • Tracking consumers’ geolocation near or within medical facilities,
  • Disclosing that information to third-party advertisers, and
  • Targeting consumers with potentially unwanted advertising based on inferences about a private and sensitive health condition without the consumer’s consent.

The settlement is a good reminder for both advertisers and ad tech to consider the privacy implications of targeted advertising, whether in geofencing or other digital marketing strategies, and how privacy and broader consumer protection laws may apply.

At the Federal Communications Commission’s (“FCC”) Open Meeting on October 27, the Commission voted along party lines (3-2) to impose more stringent rules on broadband Internet service providers (“ISPs”). Chairman Tom Wheeler, along with Commissioners Rosenworcel and Clyburn, voted in favor of the item, while Commissioners Pai and O’Rielly voted against it.

The new rules clarify the privacy requirements applicable to broadband ISPs pursuant to Section 222 of the Communications Act. The new rules also apply to voice services and treat call-detail records as “sensitive” in the context of voice services.

According to an FCC press release issued immediately after the meeting, these rules “establish a framework of customer consent required for ISPs to use and share their customers’ personal information that is calibrated to the sensitivity of the information.”  The Commission further asserts that this approach is consistent with the existing privacy framework of the Federal Trade Commission (“FTC”).

On October 6, 2016, Federal Communications Commission (FCC or Commission) Chairman Tom Wheeler published a blog entry on the Commission’s website outlining proposed privacy rules for broadband Internet Service Providers (ISPs). The proposed rules are scheduled to be considered by the full Commission at its monthly meeting on October 27, 2016. These rules come after the Commission received substantial public comment on its March notice of proposed rulemaking (discussed in an earlier blog post) from stakeholders representing consumer and public interest groups, industry, academia, and other government entities, including the Federal Trade Commission (FTC). The proposed rules appear to soften several elements of the Commission’s initial proposal, which received considerable industry criticism.

The actual text of the proposed order is not yet available; however, a fact sheet, along with the Chairman’s blog post, outlines the details of the proposal. Under the proposal, mobile and fixed broadband ISPs would be subject to the following requirements:

  • Clear Notification. ISPs would be required to notify consumers about the types of information they collect; explain how and for what purposes that information may be used or shared; and identify the types of entities with which they share it. ISPs would also be responsible for providing this information to customers when they sign up for service and for informing them regularly of any significant changes. The Commission’s Consumer Advisory Committee would be tasked with creating a standardized privacy notice format that would serve as a “safe harbor” for those ISPs that choose to adopt it.
  • Information Sensitivity-Based Choice. ISPs must get a customer’s “opt-in” consent before using or sharing information deemed sensitive. Geolocation information, children’s information, health information, financial information, Social Security numbers, web browsing history, app usage history, and communications content are the broad categories of data that would be considered sensitive. All other individually identifiable customer information would be deemed non-sensitive and would be subject to an “opt-out” approval requirement. For example, using service tier information to market an alarm system would be considered non-sensitive, and opt-out policies would be appropriate, consistent with customer expectations.  Finally, the rules would infer consent for certain purposes identified in the Communications Act, including the provision of broadband service or billing and collection (a consent-lookup sketch follows this list).
  • Security.
    • Protection: ISPs must take reasonable measures to protect consumer information from vulnerabilities. To help ensure reasonable data protection efforts, ISPs may: a) adopt current industry best practices; b) provide accountability and oversight for security practices; c) use robust customer authentication tools; and d) conduct data disposal consistent with FTC best practices and the Consumer Privacy Bill of Rights.
    • Breach Response: ISPs must notify customers when data is compromised in a way that results in unauthorized disclosure of personal information. ISPs must notify a) the customer no later than 30 days after discovery of the breach; b) the FCC no later than 7 business days after discovery; and c) if it affects more than 5,000 customers, the FBI and U.S. Secret Service no later than 7 business days after discovery.
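To summarize the choice framework, here is a minimal lookup sketch mapping the proposal’s data categories to the applicable consent model.  The category and purpose names paraphrase the fact sheet; the lookup structure itself is an assumption for illustration, not part of the FCC’s proposal.

```python
# Sensitive categories listed in the fact sheet (paraphrased).
SENSITIVE = {
    "geolocation", "childrens_information", "health_information",
    "financial_information", "social_security_number",
    "web_browsing_history", "app_usage_history", "communications_content",
}

# Purposes for which the rules would infer consent (paraphrased).
INFERRED_CONSENT_PURPOSES = {"service_provision", "billing_and_collection"}

def required_consent(category: str, purpose: str) -> str:
    if purpose in INFERRED_CONSENT_PURPOSES:
        return "inferred"  # consent implied for service delivery/billing
    if category in SENSITIVE:
        return "opt-in"    # affirmative consent before use or sharing
    return "opt-out"       # permitted unless the customer objects

print(required_consent("web_browsing_history", "advertising"))  # opt-in
print(required_consent("service_tier", "advertising"))          # opt-out
```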

The proposal addresses other issues as well, such as:

  • sharing and using de-identified information consistent with the FTC framework;
  • the use of take-it-or-leave-it data usage or sharing policies; and
  • heightened disclosure requirements for discount plans based on consent to data use.

The proposal emphasizes its focus on broadband services. The proposed rules will not apply to the privacy practices of websites or apps, including those operated by ISPs for their non-broadband services, as the Commission believes this is the purview of the FTC.  This is particularly notable in light of the recent 9th Circuit AT&T decision, which has further blurred the boundaries of the FCC and FTC’s jurisdiction (addressed in an earlier blog post). In that case, the Court determined that the FTC’s “common carrier exemption” is “status-based,” and as such exempts telecommunications carriers (like ISPs) from FTC jurisdiction, regardless of whether the company in question is engaging in common carrier activities. Presumably, the 9th Circuit’s reading of the common carrier exemption would extend to websites and apps provided by an ISP, although Chairman Wheeler appears to take a different reading in his privacy proposal.

In response to Chairman Wheeler’s proposal, FTC Chairwoman Ramirez expressed her pleasure with the FCC’s efforts to protect consumer privacy.

We will be tracking this proceeding as it develops, and will follow up with a client advisory when the Commission releases its final rules.

*Avonne Bell, an associate in Kelley Drye’s Communications Practice Group, co-authored this post.

On September 27th, the Senate Committee on Commerce, Science, and Transportation held a general oversight hearing of the FTC, which covered a multitude of major policy issues and included testimony from Chairwoman Edith Ramirez, Commissioner Maureen Ohlhausen, and Commissioner Terrell McSweeny.  Chairman John Thune (R-SD) convened the hearing, joined by Senator Richard Blumenthal (D-CT), who sat in for absent Ranking Member Bill Nelson (D-FL).  Several other Committee members also participated in the hearing, cycling through as schedules permitted on what appeared to be a jam-packed day.  Members in attendance included: Senators Dean Heller (R-NV), Amy Klobuchar (D-MN), Brian Schatz (D-HI), Jerry Moran (R-KS), Steve Daines (R-MT), Dan Sullivan (R-AK), Edward Markey (D-MA), Tom Udall (D-NM), Kelly Ayotte (R-NH), Maria Cantwell (D-WA), and Deb Fischer (R-NE).

The Commissioners’ opening statements focused on key issues related to the agency’s mandate, including enforcement, policy development, business education, and competition promotion.  But for members and Commissioners alike, privacy and data security were the clear headline issues of the day.  A variety of related topics were also raised, including protecting children online, the Internet of Things (IoT), tourism, credit reports, telecommunications, and deceptive claims.  A brief summary of these issues follows.

Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, highlighted the agency’s enforcement priorities at the National Advertising Division’s annual conference earlier this week.  Key mentions included the following:

  • Health Claims and Sensitive Populations – With health claims being a constant enforcement priority, Ms. Rich referenced cases involving cognition claims, alleged diabetes cures, and products that featured “gene-altering” claims.  She noted that more cases involving products targeted to sensitive populations – including older consumers – are in the pipeline.
  • Health Apps, Privacy and Data Security – As she has in prior testimony, Bureau Director Rich expressed concern about false cures promoted through health apps, noted consumers’ growing interest in and use of health technology, and said that such products are a growing part of the FTC’s advertising program.  Consistent with these statements, Ms. Rich encouraged companies offering fitness devices, wearable technology, and other internet of things (IoT) products to figure out how to provide adequate privacy choices for consumers.  Related to this, Ms. Rich highlighted the TRENDnet and ASUSTeK cases involving data security breaches and IoT products.  She also noted that there are a number of investigations underway.
  • Homeopathics – In September 2015, the FTC hosted a workshop on homeopathic products and claims.  Ms. Rich stated that substantive guidance on this topic will be released in the near future.
  • ROSCA – Consistent with the FTC’s interest in ensuring adequate disclosures in advertising, Ms. Rich noted that enforcement of the Restore Online Shoppers’ Confidence Act (ROSCA) is ongoing and that more cases are in the pipeline.  One of these is the FTC’s case against DIRECTV, which we wrote about here, in which the court recently found a triable issue of fact relative to the use of hyperlinks and info-hovers.

We will be following all of these developments closely.

The FTC recently reinforced its commitment to protecting consumer health data in its settlement with electronic health record company Practice Fusion.  The company, which stores consumer health data in a cloud for healthcare providers, was charged with misleading consumers when it sought patients’ reviews of their doctors without disclosing that the information would be shared online.

According to the complaint, patients were asked to rate their doctors in an email sent by the company. The email indicated that the patient’s information would be shared with his or her physician.  After providing an initial review, patients were then sent to a survey where they could give more information about their recent appointment.  Included in the survey was a text box where the patient could share comments.  Here, many patients entered private information.  This included full names, phone numbers, and details about medical conditions.  Practice Fusion’s privacy policy did not indicate that the company would publicly post reviews by patients.

Practice Fusion then launched a website providing reviews of the physicians. These reviews included the patients’ names, telephone numbers, and health information that they provided in their surveys.  It wasn’t until after the information was posted online that Practice Fusion updated its privacy policy and implemented procedures to keep personal information from appearing on the site.

Per the settlement, Practice Fusion is prohibited from misrepresenting its use of consumer data. Additionally, the company must disclose, separately from a general “privacy policy” or “terms of use” page, that it will make information publicly available, and must obtain the consumer’s affirmative express consent before doing so.  The company must also refrain from sharing healthcare provider review information with anyone other than its clients.

This settlement highlights the FTC’s continued efforts to protect the privacy of consumer health data. In April, the FTC released compliance tools and best practices specifically for health app developers to ensure they were complying with the FTC’s expectations for health data providers.  This came just after the Director of the FTC Bureau of Consumer Protection testified to a Congressional subcommittee about the need for the Commission to have increased data security authority to address the area of health privacy.  Collectively, these efforts make clear that as consumers increasingly turn to the internet for health information, the FTC will expect companies large and small to be aware of their obligations to consumers and to comply with them.

Summer Associate Lauren Myers contributed to this post. Ms. Myers is practicing under the supervision of principals of the firm who are members of the D.C. Bar.