
In testimony to a Congressional subcommittee last week, FTC Bureau of Consumer Protection Director Jessica Rich explained the Commission’s efforts to protect consumers’ health data and repeated the Commission’s request for additional authority to go even further.  Bureau Director Rich began by noting the proliferation of consumer-directed health products, such as websites, wearable technology, and communications portals – many of which are not subject to the Health Insurance Portability and Accountability Act (HIPAA).  They are within the FTC’s jurisdiction, however, and the agency is concerned about the safekeeping of the massive amounts of consumer health information generated by these platforms.  She explained how the Commission has thus far addressed health privacy and data security issues through enforcement, policy initiatives, and education.

The Commission could be even more effective in deterring unfair and deceptive practices, she asserted, if Congress would pass legislation that would strengthen the Commission’s existing data security authority and expand the breach notification requirements to include a broader range of entities, such as health websites or online newsletters, which are not covered by current rules.  In addition, Bureau Director Rich called for Congress to expand the FTC’s civil penalty authority, jurisdiction over non-profits, and rulemaking authority under the Administrative Procedure Act.

All in all, Bureau Director Rich’s testimony was consistent with the Commission’s approach of continually assessing new developments and emerging trends and threats in the privacy area and with the soon-departing Commissioner Brill’s remarks from February 2016 when she stated that “Neither new technologies nor small companies get a pass under the FTC Act.  So, trying to ‘fly under the radar’ as a small company is not a strategy that I recommend.”

 

New York Attorney General Eric Schneiderman recently announced settlements with three mobile health app developers resolving allegations that they made deceptive advertisements and had irresponsible privacy practices. The Attorney General alleged that the developers sold and advertised mobile apps that purported to measure vital signs or other indicators of health using just a smartphone. The apps had over a million downloads, giving these companies considerable consumer reach. The Attorney General’s office reportedly became aware of the apps through consumer complaints and reports to the Health Care Bureau.

Failure to Properly Substantiate Health Benefit Claims

The NY AG’s core concerns regarding the advertising claims were as follows:

  • Runtastic created “Heart Rate Monitor, Heartbeat & Pulse Tracker”. The NY AG alleged that Runtastic promoted its app as a product that measures heart rate and cardiovascular performance under stress, but had not tested the app with users engaged in vigorous exercise.
  • Cardiio created and sold the “Cardiio Heart Rate Monitor”. Cardiio allegedly also marketed its app as a means of monitoring heart rate following vigorous movement but had not tested the app under those conditions. In addition, the NY AG alleged that Cardiio’s representations that its product was endorsed by MIT were deceptive.

Representations Consistent with a Regulated Medical Device

  • Matis’s “My Baby’s Beat-Baby Heart Monitor App” raised slightly different concerns. Matis allegedly promoted the app with statements such as “Turn your smartphone into a fetal monitor with My Baby’s Beat app” and language that encouraged consumers to use the app as an alternative to more conventional fetal heart monitoring tools.  The app allegedly had not undergone proper review by the FDA to be marketed as such, however.

As readers of this blog and our sister blog, Food and Drug Law Access, know, the FDA has authority to regulate medical devices and has taken a risk-based approach to consumer-directed mobile health products.  The FTC has been even more active than the FDA in bringing health-related enforcement actions, as we have written about here, here, and here.  As these federal agencies transition into a new administration, the NY AG is making clear with these settlements that regulators are still watching for potentially misleading health claims.

The NY AG also alleged several problematic privacy practices, including the following:

  • Failing to disclose the risk that third parties could re-identify de-identified user information,
  • Issuing conflicting statements on data sharing under the Privacy Policy and under the Privacy Settings,
  • Failing to disclose that the company collected and provided to third parties consumers’ unique device identifiers,
  • Employing a practice of consent by default, where a consumer is deemed to have consented to a privacy policy just by using the website, and
  • Failing to disclose that protected health information collected, stored, and shared by the company may not be protected under the Health Insurance Portability and Accountability Act.

As we noted in a previous post on privacy and data security in mobile health apps, legal compliance is all too often an afterthought when it comes to app development. These allegations underscore the importance of understanding and reconciling data collection and use practices with the statements companies make to consumers.

Last week, the FTC held its third and final spring privacy seminar on the implications of consumer generated and controlled health data. The seminar featured presentations by Latanya Sweeney, the FTC’s Chief Technologist, and Jared Ho, an attorney in the FTC’s Mobile Technology Unit, as well as a panel discussion with representatives from the Department of Health and Human Services, the Center for Democracy and Technology, and the private sector. During the two-hour seminar, the presenters and panelists recognized the benefits of health-related apps, but expressed concerns that consumers may be unaware of the apps’ information collection and transmission practices, as well as that the apps may not be covered by HIPAA. There was no consensus on the type of regulation, if any, needed.

Ms. Sweeney’s presentation, while highlighting the maxim that transparency establishes trust, documented the flow of consumer health data provided to hospitals, noting that consumer health data may flow – and often does flow – from hospitals to entities that are not covered by HIPAA. Additionally, although de-identified when sold, this information may be easily re-identified. Mr. Ho presented the results of an FTC study on the health information collected and transmitted by 12 mobile apps and two wearables. While the Commission did not review privacy policies, the study results revealed that the apps transmitted consumer health information to 76 third parties, many of which collected device information or persistent device identifiers (sometimes from multiple apps) and additional information, such as gender, zip code, and geolocation. Mr. Ho stated that there are significant privacy concerns when health data is capable of being aggregated.

The panel, moderated by two FTC Division of Privacy and Identity Protection attorneys, featured Dr. Christopher Burrow, the Executive Vice President of Humetrix, Joseph Lorenzo Hall, Chief Technologist for the Center for Democracy and Technology, Sally Okun, Vice President for Advocacy, Policy and Patient Safety at PatientsLikeMe, and Joy Pritts, Chief Privacy Officer in the Department of Health & Human Services’ Office of the National Coordinator for Health Information Technology. The panelists spent a significant amount of time discussing the various entities covered – and not covered – by HIPAA, as well as the array of health-related websites and apps that are available to consumers. Some of the concerns raised were: (1) the potential for sensitive health information to be shared in ways consumers would not reasonably anticipate (and the inability to predict what consumers may deem “sensitive”); (2) the lack of a standard definition of “de-identified data”; (3) the potential for data re-identification; and (4) the ever-expanding definition of what constitutes “health” information.

Information on the seminar, including a transcript, is available here, and the FTC is accepting comments until June 9.

Last year’s voter guide to California Proposition 24, the California Privacy Rights Act (CPRA), included a stark argument against enacting the privacy ballot initiative because it did not go far enough to protect employee privacy.  “Currently, employers can obtain all kinds of personal information about their workers and even job applicants,” the argument against Proposition 24 written by Californians for Privacy Now stated.  “Proposition 24 allows employers to continue secretly gathering this information for more years to come…”

The message did not stick.  Voters overwhelmingly enacted the CPRA, apparently judging that its provisions – including those that apply to employers – were worth an additional two-year waiting period.  The effective date of the new law is January 1, 2023.

As companies build their roadmap to CPRA compliance, that assessment should also take into account planning for employee and job applicant privacy changes.  The new law imposes first-in-the-nation obligations that grant employees and job applicants new rights to access, correct, delete, and opt out of the sale or sharing of their personal information.  The law also prohibits discriminating against employees or job applicants who lodge privacy rights requests.

In this post, we provide an overview of topics that employers should know as the sunset of the employer exception to CCPA approaches.

Why Would CCPA Apply to Employers?

The California Consumer Privacy Act of 2018 (CCPA), which became effective on January 1, 2020, originally applied to employers.  The law defines a “consumer” as a natural person who is a California resident.  This includes employees, job applicants, contractors, or other staff of a business.

In 2019, the California legislature amended the CCPA with a stopgap measure – for one year, the CCPA would not apply to employers.  The measure, AB 25, provided that personal information collected by a business from a person acting as an employee, job applicant, or contractor, in connection with that role, is exempt from the CCPA.  Also exempt are emergency contact information and information necessary to administer benefits.

Last year, California voters extended the employer exemption for another two years to January 1, 2023 in the CPRA ballot initiative.

What Employers are Covered by California Privacy Law?

If a business is covered by the CCPA for consumer data, it is covered for employee data.  Starting in January 2023, the CPRA thresholds for coverage are as follows:

  • Has annual gross revenues in excess of $25 million in the preceding calendar year,
  • Buys, sells, or shares personal information of 100,000 or more California consumers or households, or
  • Derives 50 percent or more of its annual revenues from selling or sharing California consumers’ personal information.

Some employers may be eligible for certain exemptions that are applicable to already-regulated information that they hold about their employees.  For example, credit information that employers routinely collect to assess employment eligibility may be subject to an exception, because the information is already covered under federal fair credit reporting laws.

Also, employers that have existing obligations as business associates under the Health Insurance Portability and Accountability Act (HIPAA) may also be exempt with respect to any medical, protected health information (PHI), or covered benefits information that they maintain, use, or disclose.

In general, employers are also not required to comply with CPRA obligations that conflict with other federal, state, or local laws or legal obligations, or restrict an employer’s ability to exercise or defend legal claims.  For example, affirmative legal obligations to gather and maintain certain information, such as EEO-1 reports or compensation-related information, may directly conflict with CPRA.

What Constitutes Employee Personal Information?

The definition of employee “personal information” includes information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular employee.

This may include name, contact information, identifiers, protected classifications (like gender, race, or sexual orientation), financial or medical information, account login, religious or philosophical beliefs, union membership, commercial information, biometric information, internet or electronic network activity information, geolocation data, audio, electronic, visual, thermal, olfactory, or similar information, professional or employment-related information, education information, and inferences drawn from any of this information about the employee.

The contents of an employee’s mail, email, and text messages constitute sensitive personal information, a sub-category of personal information, unless the employer is the intended recipient of the communication.

What Obligations Apply Starting in January 2023?

All CPRA obligations apply.  These include:

  • Notice:  Employers will be required to provide a comprehensive notice of their collection of personal information from employees, job applicants, and contractors, including a description of the categories of personal information collected, the purposes of collection, details on disclosure of personal information, and information about retention of personal information.
  • Right to access:  Provide employees with a right to access categories of personal information and specific pieces of personal information.  This includes any inferences drawn from personal information to create a profile reflecting the employee’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
  • Right to correct:  Provide employees with the right to correct their personal information using commercially reasonable efforts.
  • Right to delete:  Provide employees the right to delete their personal information.  However, numerous statutory exemptions may apply, including allowing an employer to retain personal information reasonably anticipated by the employee within the context of an ongoing relationship with the employer, to perform a contract between the employee and employer, or to comply with a legal obligation.
  • Right to restrict uses of sensitive personal information:  Sensitive personal information includes a social security number, account login, financial information, geolocation, racial or ethnic origin, religious beliefs, sexual orientation, health information, biometrics, and the contents of employee communications unless the employer is the intended recipient of the communication.  Starting in January 2023, an employee may be able to direct an employer to limit certain uses of sensitive personal information for specific business purposes, as well as to direct an employer to limit disclosure of sensitive personal information, absent a qualifying exemption.
  • Right to opt out:  Provide employees the right to opt out of the sale of personal information to third parties. The term “sale” is a broad term, and includes disclosing employee information to business partners, vendors, and contractors absent a written agreement containing specific terms restricting the third party’s use of that data, or a qualifying exemption.

Certain obligations are subject to change depending on action expected in the coming year from the newly constituted California Privacy Protection Agency.

What Steps Should Employers Take to Prepare?

Given the complexity of HR data and systems, as well as the sensitivity of employee data generally, it is not too early for employers to prepare for CPRA.  Such efforts might include, for example:

  • Privacy Stakeholders:  Determine the legal, HR, and technology support (internal resources or external technology solutions) responsible for the efforts necessary to build a privacy compliance program and respond to privacy rights requests.
  • Data Mapping:  Understand the information that the business collects, the categorization of data (whether personal information or sensitive personal information), the location of the data, and the steps to access, correct, or delete the data.  A major part of this effort should also include determining which data practices identified are subject to applicable exemptions from CPRA.
  • Contract Review:  Review partner contracts to correctly distinguish service providers and contractors from third parties, and confirm that the contracts include the necessary restrictions for each classification. This effort might prioritize those partners that present more risk to the company, whether due to the nature of the processing or the type or volume of data in scope. Updating these contracts, however, might wait until the forthcoming CPRA regulations from the California Privacy Protection Agency (CalPPA) provide more insight on the necessary terms, although the existing CCPA regulations are instructive.
  • Response Procedures:  Develop procedures for responding to employee requests, including managing sensitive requests while maintaining personal information as confidential and accessible to internal personnel only on a need-to-know basis.
  • Retention Policy:  Develop and document a retention policy that complies with applicable employer data retention obligations.
  • Notice:  Draft an employee privacy policy that complies with new statutory obligations under CPRA, as well as forthcoming regulations by the CalPPA.

Do Any of These Obligations Apply Now? 

Employers may have an obligation to provide a notice at or before collection of personal information that details the categories of personal information that they collect and the purposes for which personal information will be used.

However, due to an apparent drafting error in the CPRA ballot initiative, this privacy notice obligation is muddled by a textbook case of unclear statutory construction.

Here’s what happened.  Originally, AB 25 required employers to provide a privacy notice to employees.  However, the CPRA ballot initiative from last year changed a critical code section reference in an apparent drafting error.  In so doing, the CPRA ballot initiative left unclear whether the employer privacy notice is required.

AB 25 said that employers would be required to provide a privacy notice based on Cal. Civ. Code 1798.100(b).  The CPRA ballot initiative changed the reference to Cal. Civ. Code 1798.100(a).  It is possible that the drafters intended to point to subsection (a) because in the CPRA ballot initiative this code section also requires a privacy notice.  But the CPRA ballot initiative version of the code section is not actually the law until January 1, 2023.

That’s a problem because under current law (effective until December 31, 2022), Cal. Civ. Code 1798.100(a) talks about a different topic entirely – giving consumers the right to request that a business disclose the categories and specific pieces of personal information the business has collected about a consumer.

What is a reasonable interpretation in light of this problem?  When it comes to statutory interpretation of ballot initiatives, courts generally say that the drafter’s intent does not matter.  In California, usually a court first looks at the language of the statute.  If the language is not ambiguous, the court presumes the voters intended the meaning apparent from the language.  If the language is ambiguous, then courts usually look at the ballot initiative voter materials for clues on how voters made their decision.

It is easy to see why a court might agree that the language is ambiguous.  The employer exception clearly does not provide a right of employees to access their personal information until January 1, 2023.  Giving full effect to 1798.100(a) is difficult because the CCPA’s core instructions on how to provide access to personal information, and what to provide, are themselves subject to the employer exemption.

This brings us back to the ballot initiative materials provided to voters.  The arguments against Proposition 24 from Californians for Privacy Now warned that employers will be able to secretly gather personal information “for more years to come.”  Clearly, there is no recognition in the ballot initiative materials of any interim employee rights.

Bottom line?  The law right now is unclear, and so, as a practical matter, it’s a best practice (and required in a few other states) to publish a privacy notice for employees and job applicants.

Final Question:  Do Employers Have Privacy Obligations in Other States?

There are no other states that have enacted CPRA-style comprehensive privacy laws that apply to employees; for example, Virginia and Colorado explicitly exempted the employment context without a sunset.  But there are some states, such as Connecticut, that do require some form of privacy notice to employees. There are also two-party consent requirements in a number of states that are applicable to recording calls, as well as laws that require disclosure about electronic monitoring.

Conclusion

The best way to navigate these developments is to plan ahead with a compliance roadmap leading to 2023.  Figure out what resources you’ll need, including what types of internal and external support will be critical for success. Given the complexities involved, thoughtful (and realistic) preparation is a must.

*                      *                      *


California officials today announced their nominees to be the five inaugural members of the California Privacy Protection Agency (“CPPA”) Board.  Created by the California Privacy Rights Act (“CPRA”), the CPPA will become a powerful, state-level privacy regulator long before its enforcement authority becomes effective in 2023, and today’s appointments move the CPPA one large step closer to beginning its work.  This post provides an overview of the CPPA’s authority, examines the issues that might be on its agenda, and outlines a few ways companies can start to get ready for potential regulations.

Inaugural Appointees

The five inaugural appointees to the CPPA Board are:

  • Jennifer Urban, who was appointed as Chair of the CPPA by Governor Gavin Newsom.  Urban is a clinical professor at UC Berkeley School of Law.
  • John Christopher Thompson, who was appointed by Governor Newsom and is Senior Vice President of Government Relations at LA 2028.
  • Angela Serra, who was designated by California Attorney General Xavier Becerra.  Serra served in a wide range of roles in the California Department of Justice, including overseeing the Consumer Protection Section’s Privacy Unit.
  • Lydia de la Torre, who was nominated by Senate President Pro Tem Toni Atkins.  De la Torre is a professor of law at Santa Clara University.
  • Vinhcent Le, who was designated by Assembly Speaker Anthony Rendon.

The announcement indicates that Urban’s and Thompson’s appointments do not require Senate confirmation.

The CPPA’s Next Milestones

Although the CPPA’s administrative enforcement authority does not become effective until July 1, 2023, the agency is poised in the meantime to become a powerful regulatory and supervisory authority, akin to a European data protection authority.  Key dates in the near term are:

  • July 1, 2021:  CPPA takes over rulemaking authority from the California Attorney General.
  • July 1, 2022:  Deadline for the CPPA to adopt final regulations required by CPRA.

Which Regulations Does CPRA Require the CPPA to Issue?

Section 21 of CPRA (codified in Civil Code section 1798.185) adds fifteen areas of CCPA implementation to be spelled out in regulations, on top of the seven areas that were defined under the initial CCPA.  (CPRA also amends existing areas of rulemaking authority.  For example, it grants more specific authority to prescribe standards for opt-out mechanisms.)

Although CPRA requires the CPPA to adopt final regulations in these areas by July 1, 2022, it would not be surprising to see the agency set priorities, as the Attorney General’s Office did initially under the CCPA.  These priorities could include fundamental elements of the CCPA:

  • Opt-Outs for Sale, Sharing, and Profiling, and Limiting Use of Personal Information:  CPRA grants the CPPA the authority to adopt regulations that further define consumers’ opt-out rights.  Specifically, the agency is directed to adopt regulations that define “intentional interactions,” which in turn define the scope of exceptions to “sale” and “sharing.”  The CPPA is also charged with issuing rules about “profiling” opt-out rights, and this area is worth watching closely because it is not aligned with Virginia’s new privacy law or the current text of the Washington Privacy Act.  CPRA defines “profiling” as the “automated processing of personal information, . . . to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”  A profiling opt-out under CPRA could apply to any first-party data use that meets this definition.  The profiling opt-out right under the Virginia Consumer Data Protection Act is narrower.  It is limited to the “furtherance of decisions that produce legal or similarly significant effects concerning the consumer.”  (The profiling opt-out proposed in the Washington Privacy Act is substantively identical to Virginia’s opt-out.)  Other aspects of opt-out rights that could be initial rulemaking targets include (a) the definition of “technical specifications” for a global platform- or browser-based opt-out mechanism, potentially including a feature to indicate that the user is under the age of 13 or between 13 and 15 years old; (b) standards for consent to sell or share personal information, or to use or disclose sensitive personal information, for businesses that respond to opt-out signals; and (c) “harmonizing” CCPA rules governing privacy notices, opt-out mechanisms, and “other operational mechanisms” to “promote clarity and functionality . . . for consumers.”
  • Access Requests:  CPRA directs the CPPA to define the scope of responses to consumer requests for specific pieces of personal information.  CPRA suggests that these regulations may exclude system log and other information that “would not be useful to the consumer,” as well as define authentication standards for access to sensitive personal information.
  • Business Purposes:  Finally, it is possible that the CPPA will focus initially on “further defining” business purposes for which contractors and service providers may combine personal information from multiple businesses.

Defining CPPA’s Supervisory Authority

The CPPA will also have considerable supervisory authority.  Section 1798.185(15) authorizes the CPPA to issue regulations defining audit and risk assessments for businesses “whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security.”

Separately, the CPPA must appoint a Chief Privacy Auditor to audit businesses’ compliance with the CCPA.  The Auditor’s role will be defined almost entirely through regulations, and the statutory guidance on these regulations is scant: The CPPA will define the “scope and process of the agency’s audit authority,” establish criteria for selecting audit targets, and establish protections against disclosure for the information the auditor collects.

As with other areas of CPPA rulemaking, it is unclear when the agency will turn to establishing the Chief Privacy Auditor’s authority.  However, it is worth noting now that the Auditor’s authority is potentially sweeping, and worth considering how a CCPA compliance program will look when it is under the Auditor’s microscope.

Today’s appointments are an important milestone in the development of a new breed of U.S. privacy regulator.  We will keep a close watch on further developments with the Board and the CPPA’s activities.

 


On March 2, Governor Ralph Northam signed the Virginia Consumer Data Protection Act (VCDPA) into law, making Virginia the second state to enact comprehensive privacy legislation.

With the VCDPA on the books, companies have the next 22 months to prepare for the VCDPA and the California Privacy Rights Act (CPRA) to go into effect.  This post takes a look at the VCDPA provisions that are novel and require close attention during the transition period to the law’s January 1, 2023 effective date.

  • Sensitive Data: The VCDPA breaks new ground in U.S. privacy law by requiring consent to process “sensitive data” – a term that includes precise geolocation data; genetic or biometric data used to identify a person; and data revealing race or ethnicity, religious beliefs, health diagnosis, sexual orientation, or citizenship or immigration status.  The definition of “consent,” in turn, tracks the GDPR definition: “freely given, specific, informed, and unambiguous” and conveyed by a “clear affirmative act.”
  • Opt-Outs: Controllers will need to offer opt-outs under three distinct circumstances.  In addition to an opt-out of sale (which is limited to exchanging personal data for monetary consideration), controllers must allow consumers to opt out of (1) “targeted advertising” and (2) “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.”  Although the definition of “targeted advertising” excludes ads “based on activities within a controller’s own websites or online applications” (which also appears to include affiliates’ websites), it is unclear whether this exclusion encompasses, for example, a controller’s use of third-party data sources combined with its first party data to inform its targeting decisions.  As a result, VCDPA opt-outs could have a significant impact on first-party data use as well as third-party data sharing.  Notably though, the law clearly excludes from the targeted advertising definition the processing of personal data solely for measuring or reporting advertising performance, reach, or frequency.
  • Principles for Data Controllers: Section 59.1-574 articulates several broad, principles-based obligations of data controllers, including reasonable security and a duty to limit personal data collection to what is “adequate, relevant, and reasonably necessary” to fulfill purposes that have been disclosed to consumers.  Companies have gained experience with similar principles under the GDPR and federal and state reasonable security requirements, but their inclusion in comprehensive privacy legislation that provides civil penalties of up to $7,500 per violation counsels in favor of taking a close look at how to demonstrate a thoughtful, well-reasoned approach to data strategies.
  • Data Protection Assessments: Controllers will need to conduct data protection assessments not only for high-risk activities but also for targeted advertising, profiling, personal data sales, and sensitive personal data processing.  These assessments will be fair game for the Attorney General in any investigation of a controller’s compliance with the data protection principles and transparency requirements of section 59.1-574, though the VCDPA purports to preserve attorney-client privilege and work product protection for assessments submitted in response to a civil investigative demand.  The affirmative obligation to conduct such assessments does not begin until January 2023.
  • In and Out of Scope: The Virginia law focuses on Virginia residents in their capacity as a consumer, and expressly excludes a person acting in an employment or commercial (B2B) capacity.  The law also excludes GLBA-covered financial institutions and financial personal information, FCRA-covered information, HIPAA covered entities and their business associates, non-profits, and higher education.  Publicly available information is also outside the scope of regulated personal information; the exclusion extends to data from publicly available government records and data lawfully made available to the general public.
  • No Private Right of Action: The Virginia law gives the Attorney General exclusive authority to enforce the law and states that it may not be used as the basis for a private suit under the act or any other law. However, as we’ve seen with the CCPA, that type of restriction has not stopped parties from pursuing creative ways to bring private actions for privacy violations, including under other provisions of state law, such as unfair and deceptive trade practice statutes.

For better or worse, companies will need to prepare for the VCDPA without an obvious prospect of additional regulatory guidance.  Unlike the regulatory structure the CCPA established – and the CPRA significantly expands – Virginia’s privacy law does not provide any state agency or official with rulemaking authority.  However, the VCDPA could be just a first step.  Governor Northam reportedly “will have an ongoing work group to continue to strengthen the law’s consumer protections,” and Virginia Delegate Cliff Hayes, who introduced the House version of the law, signaled that legislators are open to making such changes.  It remains to be seen whether this group will recommend allocating additional funding to the Attorney General’s office to enforce the law, and what type of enforcement we may see.  Historically, the office has not been as active as other state attorneys general on consumer protection related matters outside of a fraud context.

We will watch closely for changes in Virginia and progress in other state privacy bills.

As covered in the blog post “It’s Here: California Voters Approve the CPRA,” California voters passed ballot Proposition 24, the California Privacy Rights Act of 2020 (“CPRA”).  Also known as CCPA 2.0, CPRA brings a number of changes to the CCPA, the majority of which will become operative on January 1, 2023. In addition to revising some of the definitions that are fundamental to commercial relationships under the CCPA (e.g., the definition of “sale” and “service provider”), CPRA provides additional consumer rights, incorporates data minimization and certain other principles from the General Data Protection Regulation, and establishes a new California Privacy Protection Agency, which will become the state’s privacy regulator and share enforcement oversight with the State Attorney General’s Office.

On this much anticipated episode of the Ad Law Access podcast, Alysa Hutnik and Aaron Burstein focus on some overarching CPRA issues and a few particular issues that caught their attention.

Listen on Apple, Spotify, Google Podcasts, Soundcloud, via your smart speaker, or wherever you get your podcasts.


 

Prior to the September 30 deadline to sign or veto legislation, California Governor Gavin Newsom took action on three bills related to data privacy. Bringing some potential certainty to the dynamic CCPA landscape, Governor Newsom signed into law AB 1281, which provides for the extension of the CCPA’s exemptions related to employee data until January 1, 2022. In 2019, the Legislature exempted from the CCPA the collection of personal information from job applicants, employees, business owners, directors, officers, medical staff, and contractors until January 1, 2021. Notably, AB 1281 only goes into effect if California voters do not approve the California Privacy Rights Act (CPRA) ballot initiative on November 3rd.

However, Governor Newsom vetoed two other privacy bills that would have tightened data- and service-specific regulations beyond the CCPA’s standards. Citing the risk of unintended consequences during the COVID-19 pandemic, Governor Newsom nixed SB 980, which would have created heightened privacy and security requirements for genetic data handled by direct-to-consumer genetic testing and analysis companies. Instead, Governor Newsom directed the state’s Health and Human Services Agency and Department of Public Health to work with the Legislature to identify “a solution that achieves the privacy aims of the bill while preventing inadvertent impacts on COVID-19 testing efforts.”

The second vetoed bill, AB 1138, would have required companies that offer “social media” services to obtain parental consent before allowing a user who companies actually know to be under the age of 13 to create an account. In his veto message, Governor Newsom explained that AB 1138 “would not meaningfully expand protections for children,” but indicated that he is “open to exploring ways to build upon current law to expand safeguards for children online.”

Privacy developments in California this year are unlikely to end with the Legislature’s session. As we have discussed, the November 3rd vote on CPRA could have far-reaching implications for California privacy law. With the election only 33 days away, we will continue to monitor and post relevant updates.

As covered in this blog post, on June 24, 2020, the Secretary of State of California announced that the California Privacy Rights Act (CPRA) had enough votes to be eligible for the November 2020 general election ballot. CPRA is a ballot initiative, which, if adopted, would amend and augment the California Consumer Privacy Act (CCPA) to increase and clarify the privacy rights of California residents. The result is a law that is closer in scope to robust international privacy laws, such as the GDPR.

On the latest episode of the Ad Law Access Podcast, Privacy partner Alysa Hutnik discusses the initial highlights of CPRA and provides some takeaways for you to begin to understand this new California privacy development.

Listen on Apple, Spotify, Google Podcasts, Soundcloud, or wherever you get your podcasts.


A recent Marketplace Tech podcast episode on the spike in demand for mental health apps caught our attention.  As shocking headlines and stay-at-home orders rolled across the country, demand for mental health apps has increased almost 30% since the pandemic began, according to CNBC.  And there is a wide variety of options to choose from, with roughly 20,000 mental health apps available across app stores.  This got the editors of Marketplace Tech asking two questions:  Do mental health apps work?  And what are the regulatory and privacy implications?  It’s worth a listen when you have time, and we figured that we could weigh in as well.

Do they work?

One psychiatrist interviewed for the Marketplace Tech story questioned whether the apps should be required to demonstrate effectiveness to the FDA prior to being marketed.  In fact, some of them are, but many are not.

The starting point for this analysis is whether the app or the software is regulated as a medical device.  The Food, Drug, and Cosmetic Act defines a device as “…an instrument, apparatus, implement, machine, contrivance, implant, in vitro reagent, or other similar or related article, including any component, part, or accessory”, that is “… intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease, in man …” or “… intended to affect the structure or any function of the body of man or other animals…”  Apps that meet this definition are regulated as medical devices and are subject to FDA’s pre-market review requirements, unless they are low risk and subject to FDA’s enforcement discretion policy.

Given the need for patients and consumers to access mental health therapy remotely and in increased numbers over recent weeks, FDA relaxed its requirements for apps intended to help treat depression, anxiety, obsessive compulsive disorder and insomnia.  FDA’s Enforcement Policy for Digital Health Devices For Treating Psychiatric Disorders During the Coronavirus Disease 2019 (COVID-19) Public Health Emergency suspends the 510(k) premarket notifications, corrections and removal notifications, registration and listing requirements, and unique device identification (UDI) requirements for computerized behavioral health devices and other digital health therapeutic devices for psychiatric disorders where those devices do not create an undue risk during the COVID-19 emergency.

However, there are thousands of apps that relate to mental health and overall wellbeing in some way, many of which are not within the definition of “medical device” and do not require premarket review.  FDA’s General Wellness:  Policy for Low Risk Devices explains the agency’s enforcement discretion approach more generally.

In addition, thinking about the “do they work” question, companies marketing these products should also be mindful of the FTC’s claim substantiation requirements.  Health claims are subject to a particularly high bar for claim substantiation – competent and reliable scientific evidence.  In simple terms, this means evidence that is sufficient in quantity and quality such that experts in the field would agree that it supports the claim.  The FTC has pursued app developers (see here and here) whose claims exceeded their substantiation and has issued dozens of warning letters to marketers making aggressive claims that their products can prevent or treat COVID-19.  Companies marketing apps that claim to help address mental and physical health conditions should be mindful of the substantiation requirements and of closely tailoring their claims to their evidence.

What about privacy?

Many apps used by physicians are subject to HIPAA, but the vast majority of health-related apps are not covered by HIPAA.  As health-related apps have proliferated, companies are collecting and storing massive amounts of consumer data.  Many apps do not feature a clear explanation about privacy practices and how data is being stored or used.  As we’ve chronicled here, non-HIPAA health privacy and the need for developers to be transparent with consumers about their privacy practices has been an FTC concern for several years.  Our Advertising and Privacy Law Resource Center provides a wealth of free content to help app developers understand the applicable legal framework.

More specifically related to privacy and data tracking in the era of COVID-19, our “Data Privacy Considerations for Coronavirus Data Tools” provides key considerations for companies seeking to build contact tracing and related health apps.  These include issues such as the following: whether personal information is involved, what level(s) of transparency are appropriate relative to data practices, how to address government requests for information, and considerations related to licensing COVID-19-related personal information.

What’s the takeaway?

As daily life has increasingly shifted online, it’s more important than ever for app developers to understand how their products are regulated and to build that understanding into both the product and how it is marketed.  In addition, FDA’s temporary relaxation of pre-marketing review standards for certain mental health apps does not mean that the FTC’s claim substantiation and privacy compliance requirements are relaxed for health-related apps more generally.  If anything, we should anticipate an increased regulatory focus on these issues.

* * * *

Ad Law Access Podcast

On the latest episode of the Ad Law Access Podcast, Advertising and Marketing partner Kristi Wolff discusses three keys to making compliant health claims:  determining the product regulatory classification, claim substantiation standards, and the importance of context.  This episode is a prequel to her earlier Health Claims in the Context of COVID-19 episode which focused on recent FTC and FDA enforcement relating to false COVID-19 health claims and the importance of considering the current pandemic context in health-related marketing.

Listen on Apple, Spotify, Google Podcasts, Soundcloud, or wherever you get your podcasts.

For more information, visit: