
In testimony to a Congressional subcommittee last week, FTC Bureau of Consumer Protection Director Jessica Rich explained the Commission’s efforts to protect consumers’ health data and repeated the Commission’s request for additional authority to go even further.  Bureau Director Rich began by noting the proliferation of consumer-directed health products, such as websites, wearable technology, and communications portals – many of which are not subject to the Health Insurance Portability and Accountability Act (HIPAA).  They are within the FTC’s jurisdiction, however, and the agency is concerned about the safekeeping of the massive amounts of consumer health information generated by these platforms.  She explained how the Commission has thus far addressed health privacy and data security concerns through enforcement, policy initiatives, and education.

The Commission could be even more effective in deterring unfair and deceptive practices, she asserted, if Congress would pass legislation that would strengthen the Commission’s existing data security authority and expand the breach notification requirements to include a broader range of entities, such as health websites or online newsletters, that are not covered by current rules.  In addition, Bureau Director Rich called for Congress to expand the FTC’s civil penalty authority, jurisdiction over non-profits, and rulemaking authority under the Administrative Procedure Act.

All in all, Bureau Director Rich’s testimony was consistent with the Commission’s approach of continually assessing new developments and emerging trends and threats in the privacy area and with the soon-departing Commissioner Brill’s remarks from February 2016 when she stated that “Neither new technologies nor small companies get a pass under the FTC Act.  So, trying to ‘fly under the radar’ as a small company is not a strategy that I recommend.”

 

New York Attorney General Eric Schneiderman recently announced settlements with three mobile health app developers resolving allegations that they made deceptive advertisements and had irresponsible privacy practices. The Attorney General alleged that the developers sold and advertised mobile apps that purported to measure vital signs or other indicators of health using just a smartphone. The apps had over a million downloads, giving these companies considerable consumer reach. The Attorney General’s office reportedly became aware of the apps through consumer complaints and reports to the Health Care Bureau.

Failure to Properly Substantiate Health Benefit Claims

The NY AG’s core concerns regarding the advertising claims were as follows:

  • Runtastic created “Heart Rate Monitor, Heartbeat & Pulse Tracker”. The NY AG alleged that Runtastic promoted the app as measuring heart rate and cardiovascular performance under stress but had not tested the app with users engaged in vigorous exercise.
  • Cardiio created and sold the “Cardiio Heart Rate Monitor”. Cardiio allegedly also marketed its app as a means of monitoring heart rate following vigorous movement but had not tested the app under those conditions. In addition, the NY AG alleged that Cardiio’s representations that its product was endorsed by MIT were deceptive.

Representations Consistent with a Regulated Medical Device

  • Matis’s “My Baby’s Beat-Baby Heart Monitor App” raised slightly different concerns. Matis allegedly promoted the app with statements such as “Turn your smartphone into a fetal monitor with My Baby’s Beat app” and language that encouraged consumers to use the app as an alternative to more conventional fetal heart monitoring tools.  However, the app allegedly had not undergone the FDA review required to market it as such.

As readers of this blog and our sister blog, Food and Drug Law Access, know, the FDA has authority to regulate medical devices and has taken a risk-based approach to consumer-directed mobile health products.  The FTC has been even more active than the FDA in bringing health-related enforcement actions, as we have written about here, here, and here.  As these federal agencies transition into a new administration, the NY AG is making clear with these settlements that regulators are still watching for potentially misleading health claims.

The NY AG also alleged several problematic privacy practices, including the following:

  • Failing to disclose the risk that third parties could re-identify de-identified user information,
  • Issuing conflicting statements on data sharing under the Privacy Policy and under the Privacy Settings,
  • Failing to disclose that the company collected and provided to third parties consumers’ unique device identifiers,
  • Employing a practice of consent by default, where a consumer is deemed to have consented to a privacy policy just by using the website, and
  • Failing to disclose that protected health information collected, stored, and shared by the company may not be protected under the Health Insurance Portability and Accountability Act.

As we noted in a previous post on privacy and data security in mobile health apps, legal compliance is all too often an afterthought when it comes to app development. These allegations underscore the importance of understanding and reconciling data collection and use practices with the statements companies make to consumers.

Last week, the FTC held its third and final spring privacy seminar on the implications of consumer generated and controlled health data. The seminar featured presentations by Latanya Sweeney, the FTC’s Chief Technologist, and Jared Ho, an attorney in the FTC’s Mobile Technology Unit, and a panel discussion with representatives from the Department of Health and Human Services, the Center for Democracy and Technology, and the private sector. During the two-hour seminar, the presenters and panelists recognized the benefits of health-related apps, but expressed concerns that consumers may be unaware of the apps’ information collection and transmission practices, as well as that the apps may not be covered by HIPAA. There was no consensus on the type of regulation, if any, needed.

Ms. Sweeney’s presentation, while highlighting the maxim that transparency establishes trust, documented the flow of consumer health data provided to hospitals, noting that consumer health data may flow – and often does flow – from hospitals to entities that are not covered by HIPAA. Additionally, although de-identified when sold, this information may be easily re-identified. Mr. Ho presented the results of an FTC study on the health information collected and transmitted by 12 mobile apps and two wearables. While the Commission did not review privacy policies, the study results revealed that the apps transmitted consumer health information to 76 third parties, many of which collected device information or persistent device identifiers (sometimes from multiple apps) and additional information, such as gender, zip code, and geolocation. Mr. Ho stated that there are significant privacy concerns when health data is capable of being aggregated.
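The aggregation risk flagged in the FTC study can be sketched with a toy example. Everything below is invented for illustration (the device IDs, field names, and records are hypothetical, not drawn from the study): two datasets that each look innocuous on their own can be joined on a shared persistent device identifier, yielding a combined profile neither app disclosed.

```python
# Hypothetical illustration of aggregation risk: two apps each send a
# persistent device identifier to the same third party, which can then
# link the records. All data below is invented for illustration.

app_a_records = [
    {"device_id": "ad-7f3c", "gender": "F", "zip": "02138", "heart_rate": 82},
    {"device_id": "ad-91b0", "gender": "M", "zip": "10001", "heart_rate": 77},
]
app_b_records = [
    {"device_id": "ad-7f3c", "searches": ["insomnia", "anxiety"]},
]

def link_on_device_id(a_records, b_records):
    """Join records from two sources that share a persistent device ID."""
    b_index = {r["device_id"]: r for r in b_records}
    linked = []
    for record in a_records:
        match = b_index.get(record["device_id"])
        if match:
            # Merge the two records into a single, richer profile.
            linked.append({**record, **match})
    return linked

profiles = link_on_device_id(app_a_records, app_b_records)
# The linked profile now combines a health reading with search history,
# plus quasi-identifiers (gender, zip) that aid re-identification.
```

The point of the sketch is that neither dataset needs a name or email address for the combination to become sensitive; the persistent identifier alone does the linking, and the quasi-identifiers narrow the pool of possible individuals.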

The panel, moderated by two FTC Division of Privacy and Identity Protection attorneys, featured Dr. Christopher Burrow, the Executive Vice President of Humetrix, Joseph Lorenzo Hall, Chief Technologist for the Center for Democracy and Technology, Sally Okun, Vice President for Advocacy, Policy and Patient Safety at PatientsLikeMe, and Joy Pritts, Chief Privacy Officer in the Department of Health & Human Services’ Office of the National Coordinator for Health Information Technology. The panelists spent a significant amount of time discussing the various entities covered – and not covered – by HIPAA, as well as the array of health-related websites and apps that are available to consumers. Some of the concerns raised were: (1) the potential for sensitive health information to be shared in ways consumers would not reasonably anticipate (and the inability to predict what consumers may deem “sensitive”); (2) the lack of a standard definition of “de-identified data”; (3) the potential for data re-identification; and (4) the ever-expanding definition of what constitutes “health” information.

Information on the seminar, including a transcript, is available here, and the FTC is accepting comments until June 9.

Ad Law Access Podcast

As covered in this blog post, on June 24, 2020, the Secretary of State of California announced that the California Privacy Rights Act (CPRA) had enough votes to be eligible for the November 2020 general election ballot. CPRA is a ballot initiative that, if adopted, would amend and augment the California Consumer Privacy Act (CCPA) to increase and clarify the privacy rights of California residents. The result would be a law closer in scope to robust international privacy laws, such as the GDPR.

On the latest episode of the Ad Law Access Podcast, Privacy partner Alysa Hutnik and associate Carmen Hinebaugh discuss the initial highlights of CPRA and provide some takeaways for you to begin to understand this new California privacy development.

Listen on Apple, Spotify, Google Podcasts, Soundcloud, or wherever you get your podcasts.

For more information on health claims and other topics, visit:

Advertising and Privacy Law Resource Center

A recent Marketplace Tech podcast episode on the spike in demand for mental health apps caught our attention.  As shocking headlines and stay-at-home orders rolled across the country, demand for mental health apps has increased almost 30% since the pandemic began, according to CNBC.  And there is a wide variety of options to choose from, with roughly 20,000 mental health apps available across app stores.  This got the editors of Marketplace Tech asking two questions:  Do mental health apps work?  And what are the regulatory and privacy implications?  It’s worth a listen when you have time, and we figured we could weigh in as well.

Do they work?

One psychiatrist interviewed for the Marketplace Tech story questioned whether the apps should be required to demonstrate effectiveness to the FDA prior to being marketed.  In fact, some of them are, but many are not.

The starting point for this analysis is whether the app or the software is regulated as a medical device.  The Food, Drug, and Cosmetic Act defines a device as “…an instrument, apparatus, implement, machine, contrivance, implant, in vitro reagent, or other similar or related article, including any component, part, or accessory”, that is “… intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease, in man …” or “… intended to affect the structure or any function of the body of man or other animals…”  Apps that meet this definition are regulated as medical devices and are subject to FDA’s pre-market review requirements, unless they are low risk and subject to FDA’s enforcement discretion policy.

Given the need for patients and consumers to access mental health therapy remotely and in increased numbers over recent weeks, FDA relaxed its requirements for apps intended to help treat depression, anxiety, obsessive compulsive disorder, and insomnia.  FDA’s Enforcement Policy for Digital Health Devices For Treating Psychiatric Disorders During the Coronavirus Disease 2019 (COVID-19) Public Health Emergency suspends the 510(k) premarket notification, correction and removal notification, registration and listing, and unique device identification (UDI) requirements for computerized behavioral health devices and other digital health therapeutic devices for psychiatric disorders where those devices do not create an undue risk during the COVID-19 emergency.

There are thousands of apps that relate to mental health and overall wellbeing in some way, however, and many of them do not fall within the definition of “medical device” or require premarket review.  FDA’s General Wellness:  Policy for Low Risk Devices explains the agency’s enforcement discretion approach more generally.

In addition, thinking about the “do they work” question, companies marketing these products should also be mindful of the FTC’s claim substantiation requirements.  Health claims are subject to a particularly high bar for claim substantiation – competent and reliable scientific evidence.  In simple terms, this means evidence that is sufficient in quantity and quality such that experts in the field would agree that it supports the claim.  The FTC has pursued app developers (see here and here) whose claims exceeded their substantiation and has issued dozens of warning letters to marketers making aggressive claims that their products can prevent or treat COVID-19.  Companies marketing apps that claim to help address mental and physical health conditions should be mindful of the substantiation requirements and should closely tailor their claims to their evidence.

What about privacy?

Many apps used by physicians are subject to HIPAA, but the vast majority of health-related apps are not covered by HIPAA.  As health-related apps have proliferated, companies are collecting and storing massive amounts of consumer data.  Many apps do not feature a clear explanation about privacy practices and how data is being stored or used.  As we’ve chronicled here, non-HIPAA health privacy and the need for developers to be transparent with consumers about their privacy practices has been an FTC concern for several years.  Our Advertising and Privacy Law Resource Center provides a wealth of free content to help app developers understand the applicable legal framework.

More specifically related to privacy and data tracking in the era of COVID-19, our “Data Privacy Considerations for Coronavirus Data Tools” provides key considerations for companies seeking to build contact tracing and related health apps.  These include issues such as the following: whether personal information is involved, what level(s) of transparency are appropriate relative to data practices, how to address government requests for information, and considerations related to licensing COVID-19-related personal information.

What’s the takeaway?

As daily life has increasingly shifted online, it’s more important than ever for app developers to understand how their products are regulated and to build those features into the product and how it is marketed.  In addition, FDA’s temporary relaxation of pre-marketing review standards for certain mental health apps does not mean that the FTC’s claim substantiation and privacy compliance requirements are relaxed for health-related apps more generally.  If anything, we should anticipate an increased regulatory focus on these issues.

* * * *

Ad Law Access Podcast

On the latest episode of the Ad Law Access Podcast, Advertising and Marketing partner Kristi Wolff discusses three keys to making compliant health claims:  determining the product regulatory classification, claim substantiation standards, and the importance of context.  This episode is a prequel to her earlier Health Claims in the Context of COVID-19 episode which focused on recent FTC and FDA enforcement relating to false COVID-19 health claims and the importance of considering the current pandemic context in health-related marketing.

Listen on Apple, Spotify, Google Podcasts, Soundcloud, or wherever you get your podcasts.

For more information, visit:

Ad Law Access Podcast - Health Claims 101: Key Considerations For Making Compliant Health Claims


For more information on health claims and other topics, visit:

Advertising and Privacy Law Resource Center - Health Claims 101: Key Considerations For Making Compliant Health Claims

Democrats Release Their Own COVID-19 Privacy Legislation

Following the Republican-sponsored COVID-19 Consumer Data Protection Act of 2020, Democratic legislators recently introduced the Public Health Emergency Privacy Act. Senators Richard Blumenthal of Connecticut and Mark Warner of Virginia, along with a group of Democratic Representatives, including Jan Schakowsky of Illinois and Anna Eshoo of California, introduced the measure.

While both measures similarly require “affirmative express consent” prior to most processing of personal information for COVID-19 purposes, notice prior to using the data, reporting requirements, and destruction after data use, the bills vary in many other respects. Some differences between the Republican and Democratic bills include preemption, enforcement authority, and civil and voting rights protections.

Perhaps the most material distinctions focus on preemption and enforcement – a common theme in federal privacy legislation. These areas continue to be sticking points between the parties in discussions regarding privacy legislation. While both measures allow for FTC and state attorney general enforcement, the Democrats’ bill also provides for a private cause of action, which would allow for damages between $100 and $1,000 per negligent violation, and $500 and $5,000 per reckless, willful, or intentional violation. And while the Republican measure expressly preempts any similar state measure, the Democratic measure expressly does not.

The Democratic measure also addresses other concerns regarding using health data for COVID-19 purposes where the Republican bill is silent. Specifically, the Democrats’ bill expressly prohibits the use of emergency health data for advertising or discriminatory purposes. The bill also requires the Secretary of Health and Human Services to work with both the U.S. Commission on Civil Rights and the FTC to submit a report examining how the collection, use, and disclosure of COVID-19 health information impacts civil rights issues.

In addition, the Democrats’ bill prevents government entities from restricting the right to vote based on an individual’s: (1) participation or non-participation in a program to collect emergency health data; (2) medical condition; or (3) emergency health data itself.

As with Congress’s debate over comprehensive federal privacy legislation, COVID-19 privacy legislation may come down to similar disputes over enforcement and preemption. Whether the parties will be able to agree on these issues as they apply in a more limited capacity remains to be seen.

Advertising and Privacy Law Resource Center

In light of concerns associated with attempts to use personal data to track the spread of COVID-19, a group of Republican Senators, led by Mississippi Senator Roger Wicker, introduced the COVID-19 Consumer Data Protection Act of 2020 today.

The bill imposes specific requirements on entities seeking to process precise geolocation data, proximity data, persistent identifiers, and personal health information (together, “covered data”) in association with COVID-19 mitigation efforts. Among other things, the Act would require:

  • Notice/Consent: Prior notice and affirmative express consent for the collection, processing, or transfer of covered data to track COVID-19, monitor social distancing compliance, or for COVID-19 contact tracing purposes;
  • Opt Out Rights: Giving individuals the right to opt out of such processing;
  • Deletion Rights: Deleting or de-identifying all covered data once the entity is no longer using it;
  • Data Processing Restrictions: A public commitment to limit the processing of the data, unless certain exceptions apply;
  • Notice: Posting a clear and conspicuous privacy policy within 14 days of the Act’s enactment that provides information about data transfers, data retention practices, and data security practices; and
  • Accountability: During the public health emergency, providing a bi-monthly public report identifying the aggregate number of individuals from whom the covered entity has collected, processed, or transferred covered data for COVID-19 purposes with additional detail about how and why that information was used.

The bill also requires covered entities to engage in data accuracy (including allowing the individual to report inaccuracies), data minimization, and data security practices. The FTC has enforcement authority under the bill and would also be required to release data minimization guidelines in relation to COVID-19 processing.

Separately, the bill explicitly exempts covered entities from requirements under the Communications Act or regulations in relation to this processing. The bill also preempts any similar state law, although state attorneys general have enforcement authority along with the FTC.

Whether Congress will pass the measure is unclear, as Democrats and public interest organizations have voiced concerns about the bill. Still, assuming Congress can agree, it’s worth monitoring to see whether the measure may be included in any upcoming COVID-19 relief bill.

Advertising and Privacy Law Resource Center

Ad Law Access Podcast - Health Claims in the Context of COVID-19

The FTC recently sent warning letters to companies for falsely claiming that their products can treat or prevent COVID-19. On the latest episode of the Ad Law Access Podcast, partner Kristi Wolff  discusses the importance of keeping the current pandemic context in mind when making health claims more generally.

Listen on Apple, Google, Soundcloud, Spotify, or wherever you get your podcasts.

For more information on these and other topics, visit:

Ad Law Access Podcast

 

Data is helping governments, researchers, and companies across the world track the spread of the novel coronavirus, monitor cases and outcomes of COVID-19, and devise ways to halt the virus’s spread.  As part of these efforts, raw data, software tools, data visualizations, and other efforts are providing the public and policymakers with insights into the growth of the pandemic.

Personal information — some of which may be highly sensitive — is key to many of these efforts.  Although some regulators in the U.S. and abroad have made it clear that privacy laws and the exercise of enforcement discretion provide leeway to process personal information in connection with COVID-19, they have also made it clear that privacy laws continue to apply.  Federal Trade Commission (FTC) Chairman Joe Simons advises that the FTC will take companies’ “good faith efforts” to provide needed goods and services into account in its enforcement decisions but will not tolerate “deceiving consumers, using tactics that violate well-established consumer protections, or taking unfair advantage of these uniquely challenging times.”  And, with many eyes on the California Attorney General’s Office in light of recent requests to delay enforcement of the California Consumer Privacy Act (CCPA), an advisor to Attorney General Xavier Becerra was quoted as stating: “We’re all mindful of the new reality created by COVID-19 and the heightened value of protecting consumers’ privacy online that comes with it. We encourage businesses to be particularly mindful of data security in this time of emergency.”

Devoting some thought to privacy issues at the front end of COVID-19 projects will help to provide appropriate protections for individuals and address complications that could arise further down the road.  This post identifies some of the key privacy considerations for contributors to and users of COVID-19 resources.

1. Is Personal Information Involved?

Definitions of “personal information” and “personal data” under privacy laws such as the CCPA and the EU’s General Data Protection Regulation (GDPR) are broad.  Under the CCPA, for example, any information that is “reasonably capable of being associated with, or could reasonably be linked” with an individual, device, or household is “personal information.”  This definition specifically includes “geolocation data.”  Although some data sources provide COVID-19-related information at coarse levels of granularity, e.g., county, state, or national level, the broad definition of “personal information” under the CCPA, GDPR, and other privacy laws makes it worth taking a close look at geographic and other types of information to determine whether the data at issue in fact qualifies as “personal information,” or if it is sufficiently anonymized to meet privacy definitions of de-identified and/or aggregate data.  CCPA, HIPAA, and other privacy laws provide examples of the safeguards expected to reasonably treat data as anonymized, and employing such standards can help avoid unnecessary privacy mishaps despite well-intentioned efforts.
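To make the anonymization point concrete, here is a minimal sketch (not legal advice, and not a statement of what any statute requires) of two common techniques for reducing the identifiability of location data before release: coarsening coordinate precision so points map to a grid rather than a street address, and suppressing grid cells with very few individuals. The coordinates and the k threshold below are invented for illustration.

```python
# Minimal sketch of coarsening + small-cell suppression for location
# data. Coordinates and thresholds are illustrative only; actual
# de-identification standards depend on the applicable legal framework.

def coarsen_coordinates(lat, lon, decimals=1):
    """Round coordinates to a coarse grid.

    One decimal place of latitude is roughly an 11 km band, so rounded
    points no longer resolve to a street address.
    """
    return (round(lat, decimals), round(lon, decimals))

def suppress_small_cells(cell_counts, k=2):
    """Drop grid cells with fewer than k individuals.

    Small counts are a re-identification risk even in 'aggregate' data,
    so k-anonymity-style suppression removes them before release.
    """
    return {cell: n for cell, n in cell_counts.items() if n >= k}

# Three raw points: two near each other, one far away.
points = [(42.3736, -71.1097), (42.3770, -71.1167), (40.7128, -74.0060)]

cell_counts = {}
for lat, lon in points:
    cell = coarsen_coordinates(lat, lon)
    cell_counts[cell] = cell_counts.get(cell, 0) + 1

released = suppress_small_cells(cell_counts, k=2)
# Only the cell containing at least 2 points survives; the isolated
# point, which would be easiest to re-identify, is suppressed.
```

Whether output like this counts as “de-identified” or “aggregate” under the CCPA or HIPAA is a legal question, not just a technical one, which is why the safeguards those laws describe are worth consulting before relying on any particular technique.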

2. What Level(s) of Transparency Are Appropriate About the Data Practices?

Although some COVID-19 tools may be exempt from statutory requirements to publish a privacy policy (e.g., the provider of the tool is not a “business” under the CCPA), there are still reasons for providers to explain what data they collect and how they plan to use and disclose the data:

  • Disclosures help individuals to reach informed decisions about whether they want to provide their data, e.g., by downloading an app and allowing it to collect their location and other information. If business practices and consumer expectations are not reasonably aligned around the data practices, the failure to provide an appropriate privacy notice could be deemed an unfair or deceptive practice, inviting the scrutiny of the FTC or State Attorneys General.
  • Developing a privacy policy (or other disclosure) can help provide internal clarification on what types of personal information (or not) an app or service needs and collects. A granular understanding of such data practices can help providers to identify and mitigate privacy and data security risks associated with such data practices.
  • Developing a disclosure about a provider’s data collection and usage can help clarify the decision-making structure among multiple stakeholders so that the group is better equipped to handle data governance decisions over the lifecycle of a project.

3. How to Address Government Requests/Demands for Personal Information?

Although much remains to be seen in how federal, state, and local governments will use personal information (if at all) to develop and implement strategies to slow the spread of coronavirus, it is not unreasonable to expect that government agencies will seek information from providers of COVID-19-related tools.  The extent to which a provider can voluntarily provide information to the government — as well as the procedures that the government must follow to compel the production of information (and maintain the confidentiality of it in personally identifiable form) — depends on several factors, including what kind of information is at issue and how it was collected.  Becoming familiar with the rules that apply to voluntary and compelled disclosures, and safeguards to help prevent such data from being subject to broad freedom of information laws,  before a request arrives can help save valuable time down the road.  In many of these scenarios, for example, aggregate or pseudonymous data may be sufficient.

4. What Considerations Are There for Licensing COVID-19-Related Personal Information?

Finally, any licensing of personal information in connection with COVID-19 tools deserves careful consideration, particularly if the CCPA applies.  The CCPA imposes notice and opt-out requirements on entities that “sell” personal information. “Sell” is defined to include disseminating, disclosing, or otherwise “making available” personal information to for-profit third parties in exchange for “monetary or other valuable consideration.”  Several types of open source licenses require users to accept certain restrictions on their use and/or redistribution of licensed data or software.  For example, the Creative Commons Attribution-NonCommercial 4.0 International license requires licensees to agree (among other conditions) not to use licensed content for commercial purposes.  Obtaining this promise in exchange for personal information could constitute “valuable consideration” and give rise to a “sale” under the CCPA.   In addition, while not a “sale,” sharing personal information with a government authority would qualify as a disclosure under CCPA and would need to be accurately disclosed in the privacy policy.

Neither the California Attorney General nor the courts have interpreted the CCPA in the context of open source licenses.  Until more authoritative guidance becomes available, it makes sense to think through the potential obligations and other consequences of applying and accepting specific license terms to COVID-19-related personal information.

Bottom line:  Personal information has a key role to play in shaping responses to the novel coronavirus.  Privacy laws remain applicable to this information.  Applying privacy considerations to COVID-19 related practices involving data collection, sharing, and analysis will help mitigate unnecessary harms to consumers, aside from those presented by the virus itself.

For other helpful information during this pandemic, visit our COVID-19 Resource Center.