Please join Kelley Drye in 2017 for the Advertising and Privacy Law Webinar Series. Like our annual in-person event, this series will feature engaging speakers with extensive experience and knowledge in the fields of advertising, privacy, and consumer protection. These webinars will provide key updates and practical tips to address issues faced by counsel.

This webinar series will commence January 25 and continue the last Wednesday of each month, as outlined below.

January 25, 2017 | February 22, 2017 | March 29, 2017 | April 26, 2017 | June 28, 2017
July 26, 2017 | September 27, 2017 | October 25, 2017 | November 29, 2017

Kicking off the series will be a one-hour webinar on “Marketing in a Multi-Device World: Update on Cross Device Tracking” on January 25, 2017 at 12 PM ET. For more information and to register, please click here. CLE credit will be offered for this program.

The Digital Advertising Alliance (DAA) recently announced that enforcement of its guidance on cross-device tracking (the “Application of the DAA Principles of Transparency and Control to Data Used Across Devices”) is set to begin on February 1, 2017. Originally published in November 2015, the guidance was intended to clarify how the DAA’s Core Principles of notice and choice should be applied to cross-device tracking.

And for those of you who have not read through the guidance recently…or at all…here is a quick summary:

Transparency: The privacy policy must disclose that data collected from a particular browser or device may be used with another computer or device that is linked to the browser or device on which such data was collected, or transferred to a non-affiliated third party for such purposes. The notice should also include a clear and prominent link to a disclosure that either (1) links to the industry-developed website or choice mechanism that provides the consumer with choices over these practices, or (2) individually lists the third parties that are engaged in cross-device tracking.

Consumer Control: Consumers must have the ability to exercise choice (i.e., an opt-out mechanism) concerning cross-device tracking.

Although the DAA published the guidance last year, it has delayed enforcement to allow companies time to come into compliance. The guidance on cross-device tracking will be independently enforced by the Council of Better Business Bureaus (CBBB) and the DMA (formerly the Direct Marketing Association), which provide ongoing independent oversight of the DAA Principles.

What does this mean for you?  If you are actively engaging in cross-device tracking, or have implemented beacons or other technologies that permit cross-device tracking to occur on your website or app, be sure that your privacy policy and other public-facing materials provide consumers with appropriate notice and choice about your cross-device tracking practices.

The FTC announced a settlement on Wednesday with mobile advertising company InMobi Pte Ltd. concerning allegations that the company deceptively tracked the geolocation of hundreds of millions of unknowing consumers, including children, to serve them geo-targeted advertising.  As part of the settlement, InMobi will pay $950,000 in civil penalties relating to violations of the Children’s Online Privacy Protection Act (COPPA), and agreed to implement a comprehensive privacy program.

InMobi’s Practices

InMobi provides an advertising platform for app developers and advertisers.  App developers can integrate the InMobi software development kit (SDK) into their Android and iOS apps, allowing them to monetize their applications by letting third-party advertisers reach consumers through various ad formats (e.g., banner ads, interstitial ads, native ads).  Advertisers, in turn, can target consumers across all of the mobile apps that have integrated the InMobi SDK.

InMobi also offers several geo-targeting products, which allow advertisers to target consumers based on specific location information.  For instance, advertisers could target consumers based on their device’s current or previous location, or based on whether the consumer visits a certain location at a particular time of day or on multiple occasions.

FTC Charges

The FTC alleges that InMobi misrepresented that its advertising software would track consumers’ locations and serve geo-targeted ads only if the consumer provided opt-in consent, and only in a manner consistent with the device’s privacy settings.  According to the complaint, InMobi was actually tracking consumers’ locations whether or not the apps with InMobi SDKs requested consumers’ permission to do so, and even when consumers had denied permission to access their geolocation.

Even when users had denied the app permission to access geolocation, InMobi was collecting information about the WiFi networks that the consumer’s device connected to or that were in range of the consumer’s device, feeding this information into its geocoder database, and using this information to infer the consumer’s longitude and latitude. The FTC claims that this process allowed InMobi to track the consumer’s location and serve geo-targeted ads, regardless of the app developer’s intent to include geo-targeted ads in the app, and regardless of the consumer’s privacy preferences or device settings.  As a result of these practices, app developers could not provide accurate information to consumers regarding their apps’ privacy practices.  The FTC concluded that InMobi’s misrepresentations regarding its data collection and use practices were deceptive in violation of Section 5 of the FTC Act.
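
To make the mechanism concrete, here is a minimal Python sketch of how a WiFi-based geocoder lookup can infer a device’s latitude and longitude from nearby network identifiers alone. The database, identifiers, and coordinates below are hypothetical illustrations; this is not InMobi’s actual code, which is not public.

```python
# Illustrative sketch only: the database, identifiers, and coordinates below are
# hypothetical and are not taken from InMobi's implementation (which is not public).

# A "geocoder" database mapping WiFi network identifiers (BSSIDs) to known coordinates,
# built from devices that previously reported both a location and the networks around them.
GEOCODER_DB = {
    "a4:2b:8c:11:22:33": (40.7506, -73.9935),  # hypothetical access point near Penn Station
    "f0:9f:c2:44:55:66": (40.7484, -73.9857),  # hypothetical access point nearby
}

def infer_location(visible_bssids):
    """Infer an approximate latitude/longitude by averaging the known positions of the
    WiFi networks a device can currently see, even if the device has denied the app
    access to its location services."""
    matches = [GEOCODER_DB[b] for b in visible_bssids if b in GEOCODER_DB]
    if not matches:
        return None
    lat = sum(m[0] for m in matches) / len(matches)
    lon = sum(m[1] for m in matches) / len(matches)
    return (lat, lon)

# A device that reports only which networks are in range still yields a usable location.
print(infer_location(["a4:2b:8c:11:22:33", "f0:9f:c2:44:55:66"]))
```

The point is simply that a device’s surroundings (the access points it can see) can stand in for an explicit GPS reading, which is why denying location permission did not stop the tracking alleged in the complaint.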

In addition, the complaint alleges that InMobi violated COPPA by knowingly collecting personal information from children under the age of 13, despite representations to the contrary. The FTC claims that InMobi did not have adequate controls in place to ensure COPPA compliance and did not test any controls it implemented to ensure they functioned as intended.  As a result, InMobi collected personal information (including unique device identifiers and geolocation information) in thousands of apps that developers had expressly indicated to InMobi were child-directed, and used this information to serve interest-based, behavioral advertising in violation of COPPA.

Settlement Provisions

Per the stipulated order, the company is prohibited from collecting consumers’ location information without their affirmative express consent and will be required to honor consumers’ location privacy settings.  The company is further prohibited from violating COPPA and from misrepresenting its privacy practices.  The order also requires the company to delete all information it collected from children, delete the location information collected from consumers without their consent, and establish a comprehensive privacy program.  The comprehensive privacy program is typical of what we see in other FTC privacy settlements: it requires the designation of a responsible employee to oversee privacy compliance, ongoing assessment of risks that could result in unauthorized collection of information, implementation of reasonable privacy controls, regular testing and evaluation of those controls, and oversight of service providers.  Under the terms of the settlement, InMobi is subject to a $4 million civil penalty, which was suspended to $950,000 based on the company’s financial condition.

Key Takeaways

Mobile technology practices continue to be a focus of the FTC’s consumer protection efforts.  Companies collecting personal and geolocation information from consumers should understand precisely what information will be collected from or about a user, clearly and accurately communicate their data practices, and respect any representations that are made.  Particular care should be taken when collecting information through child-directed apps and websites.  Taking these simple steps can help avoid FTC scrutiny of a company’s privacy practices and related representations.

iHeartMedia has agreed to pay $8.5 million to resolve allegations that the company sent unsolicited text messages to radio station listeners, in violation of the TCPA. According to the complaint, the company would invite listeners to send text messages in order to request songs or enter contests. Listeners who submitted requests or entries would receive messages from the company in return.

But rather than simply confirm receipt of the listener’s text, the plaintiffs alleged that the messages frequently included ads for the company’s partners. For example, when the plaintiffs sent a text message to enter a contest, they received a response inviting them to “play us in the brand new version of Words With Friends.” The text message included a link that led the recipient to the Words With Friends download page on their phone’s app store.

It’s tempting to think that a person’s text to your company constitutes consent to text them back, but it’s not that easy. While you may be able to send a simple confirmation of receipt, in order to send an ad, you need prior express written consent. Without that, you could be liable for statutory damages of up to $1,500 per text sent without consent. As this settlement demonstrates, those numbers can quickly add up.
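
To see how quickly that exposure can accumulate, here is a back-of-the-envelope calculation. The message count below is invented for illustration, not a figure from the iHeartMedia case.

```python
# Hypothetical numbers for illustration only; these are not figures from the iHeartMedia case.
texts_sent = 10_000          # unsolicited texts containing advertising
per_text_statutory = 500     # statutory damages per violation under the TCPA
per_text_willful = 1_500     # up to treble damages for willful or knowing violations

print(f"Baseline exposure: ${texts_sent * per_text_statutory:,}")  # $5,000,000
print(f"Willful exposure:  ${texts_sent * per_text_willful:,}")    # $15,000,000
```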

Yesterday, a federal judge ruled that Amazon is liable for permitting unauthorized in-app purchases incurred by children.  The Amazon case is the last in a series of actions brought by the FTC against third-party platforms related to kids’ in-app charges (we previously blogged about the other two actions, against Apple and Google, here and here, which resulted in refunds to consumers totaling over $50 million).

FTC Allegations

The FTC first filed its complaint against Amazon in district court in July 2014, alleging that the billing of parents and other account holders for in-app purchases incurred by children “without having obtained the account holders’ express informed consent” violated Section 5 of the FTC Act.  Many of the apps offering in-app purchases were geared towards children and were offered as “free” with no indication of in-app purchases.  These in-app charges generally ranged from $0.99 to $99.99, but could be incurred in unlimited amounts.  The FTC alleged that, while the app developers set the price for apps and in-app purchases, Amazon retained 30% of the revenue from every in-app sale.

The complaint alleged that when Amazon first introduced in-app charges in November 2011, the default setting permitted in-app purchases without a passcode unless the user had enabled a passcode requirement in the parental controls.  Following a firestorm of complaints by parents surprised to find these in-app charges, Amazon introduced a password prompt feature for in-app charges of $20 or more in March 2012.  This initial step, however, did not cover charges that, in combination, exceeded $20.  In August 2012, the FTC notified Amazon that it was investigating its in-app billing practices.

Beginning in February 2013, Amazon required password prompts more frequently, but only if the purchase initiated was over $20, a second in-app purchase was attempted within five minutes of the first, or parental controls were enabled.  Even so, once a password was entered, in-app purchases were often authorized for the next hour.  Amazon continued to refine its in-app purchase process over the next few months, noting on an app’s description page that “In-App Purchasing” was available, and adding a password requirement for all first-time in-app purchases, among other things.
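
Based solely on the complaint’s description of the February 2013 changes, the prompt logic would have looked roughly like the sketch below. The function and parameter names are hypothetical; this is not Amazon’s actual code, which is not public.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the password-prompt logic described in the complaint for
# February 2013; not Amazon's actual implementation (which is not public).
AUTH_WINDOW = timedelta(hours=1)           # a password entry reportedly held for about an hour
RAPID_REPEAT_WINDOW = timedelta(minutes=5) # a second purchase within five minutes triggered a prompt

def password_required(amount, last_purchase_at, last_password_at,
                      parental_controls_on, now=None):
    """Return True if this in-app purchase would have prompted for a password."""
    now = now or datetime.now()
    # Once a password was entered, further purchases were often authorized for the next hour.
    if last_password_at and now - last_password_at < AUTH_WINDOW:
        return False
    if parental_controls_on:
        return True
    if amount > 20:
        return True
    # A second in-app purchase attempted within five minutes of the first also prompted.
    if last_purchase_at and now - last_purchase_at < RAPID_REPEAT_WINDOW:
        return True
    return False

# Example: a $5 purchase made one minute after an earlier purchase, with no recent
# password entry and parental controls off, triggers a prompt; a lone $5 purchase does not.
now = datetime.now()
print(password_required(5.00, now - timedelta(minutes=1), None, False, now=now))  # True
print(password_required(5.00, None, None, False, now=now))                        # False
```

Under logic of this kind, a child could still accumulate many small charges, since only purchases over $20, rapid repeat purchases, or enabled parental controls triggered a prompt, and a single password entry opened an hour-long window.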

The Court’s Order

The FTC moved for summary judgment in February 2016.  In its April 27 order, the court granted the FTC’s summary judgment motion, finding that: (1) the FTC applied the proper three-prong legal test for determining unfair business practices (i.e., a substantial injury that is not reasonably avoidable by consumers and is not outweighed by countervailing benefits); (2) the FTC’s witness used to calculate money damages was timely disclosed, even though she was identified after the discovery cut-off date, because the FTC had made its intention to seek monetary relief known from the beginning; and (3) Amazon’s business practices around in-app purchases violated Section 5.

Earlier this year, PayPal announced planned changes to its User Agreement that would have, among other things, given the company broad rights to contact people by phone or text messages. The provision stated, in part:

You consent to receive autodialed or prerecorded calls and text messages from PayPal at any telephone number that you have provided us or that we have otherwise obtained. We may place such calls or texts to (i) notify you regarding your account; (ii) troubleshoot problems with your account; (iii) resolve a dispute; (iv) collect a debt; (v) poll your opinions through surveys or questionnaires, (vi) contact you with offers and promotions . . . .

The provision was set to go into effect on July 1, 2015, and the only option to avoid being contacted in this manner was to stop using the service. Predictably, consumers did not react well to the provision, particularly as it related to the offers and promotions. Neither did FTC staff, who contacted PayPal to remind them of their obligations under the Telemarketing Sales Rule and the Do Not Call Registry.

Although the TSR permits telemarketing calls to numbers on the Registry if a consumer has provided express written consent to receive such calls, the proposed language did not meet the requirements for the exception. The staff noted, for example, that the request seeking consent must be “clear and conspicuous” and cannot be “buried” in a lengthy user agreement. Moreover, calls may only be placed to a number specified by a consumer – not to any number “otherwise obtained.”

On June 29, 2015, PayPal revised the proposed language such that the company only reserved rights to place calls or texts to “(i) provide notices regarding your Account or Account Activity, (ii) investigate or prevent fraud, or (iii) collect a debt owed to us.” Based on these changes – and because the company hadn’t yet made any telemarketing calls under the proposed language – the FTC’s Division of Marketing Practices decided not to recommend enforcement action.

As we’ve posted before, few things will get companies in trouble faster than sending unwanted calls or texts. You can ask for permission to send those, but the request has to be done in a way that is clear and conspicuous.

Last week, the FTC concluded a $40 million settlement with TracFone – the largest prepaid mobile provider in the U.S. – over allegations that the company throttled customers’ purportedly unlimited data plans. The FTC alleged that TracFone advertised $45 per month unlimited plans, but systematically throttled and/or suspended customers’ connections after they passed a certain usage threshold, in violation of Section 5 of the FTC Act, which prohibits “unfair or deceptive” acts or practices.

The landmark settlement is indicative of vigorous enforcement by the FTC in the mobile broadband space. Our advisory provides an analysis of the settlement and items of note for other companies considering similar claims or related business practices in the broadband space.

On September 4, 2014, the FTC announced a settlement with Google Inc., which requires the search giant to pay at least $19 million in refunds to consumers who the Commission alleges were billed for unauthorized in-app charges incurred by kids.  The settlement follows a similar settlement in January with Apple (which required Apple to pay a minimum of $32.5 million in refunds), and a recent complaint filed by the FTC in federal court against Amazon.

The FTC’s complaint against Google alleges that the company offered free and paid apps through its Play store.  Many of these apps are rated for kids and offer “in-app purchases” ranging from $0.99 to $200, which can be incurred in unlimited amounts.  The FTC alleges that many apps invite children to obtain virtual items in a context that blurs the line between what costs virtual currency and what costs real money. 

At the time Google introduced in-app charges in March 2011, users were notified of an in-app charge with a popup containing information about the virtual item and the amount of the charge.  A child, however, could clear the popup simply by pressing a button labeled “CONTINUE.”   In many instances, once a user had cleared the popup, Google did not request any further action before billing the account holder for the corresponding in-app charge. 

It was not until mid- to late-2012 that Google began requiring password entry in connection with in-app charges. The complaint alleges, however, that once a password was entered, it was stored for 30 minutes, allowing a user to incur unlimited in-app charges during that time period.  Regardless of the number or amount of charges incurred, Google did not prompt for additional password entry during this 30-minute period.
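
As a rough illustration of the 30-minute window the complaint describes, here is a minimal sketch. The class and method names are hypothetical and are not Google’s actual billing code.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the 30-minute password caching behavior described in the
# complaint; this is not Google's actual billing code (which is not public).
class BillingSession:
    CACHE_WINDOW = timedelta(minutes=30)

    def __init__(self):
        self.password_entered_at = None

    def authorize_charge(self, amount, now=None):
        """Return True if the charge goes through with no password prompt."""
        now = now or datetime.now()
        if self.password_entered_at and now - self.password_entered_at < self.CACHE_WINDOW:
            # Within the 30-minute window, any number of charges of any amount
            # proceed without a further prompt, regardless of the running total.
            return True
        # Otherwise a password prompt is shown; a successful entry opens a new window.
        self.password_entered_at = now
        return False

# Example: the first charge prompts for a password; later charges within 30 minutes do not.
session = BillingSession()
start = datetime(2012, 10, 1, 12, 0)
print(session.authorize_charge(4.99, now=start))                           # False (prompt shown)
print(session.authorize_charge(99.99, now=start + timedelta(minutes=29)))  # True (no prompt)
```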

Google controls the billing process for these in-app charges and retains 30 percent of all revenue.  For all apps, account holders can associate their Google accounts with certain payment mechanisms, such as a credit card, gift card, or mobile phone billing.  The complaint highlights that Google received thousands of complaints related to unauthorized in-app charges by children and that unauthorized in-app purchases were the leading cause of chargebacks to consumers.

Last week, the FTC held its third and final spring privacy seminar on the implications of consumer-generated and consumer-controlled health data. The seminar featured presentations by Latanya Sweeney, the FTC’s Chief Technologist, and Jared Ho, an attorney in the FTC’s Mobile Technology Unit, and a panel discussion with representatives from the Department of Health and Human Services, the Center for Democracy and Technology, and the private sector. During the two-hour seminar, the presenters and panelists recognized the benefits of health-related apps, but expressed concerns that consumers may be unaware of the apps’ information collection and transmission practices and that the apps may not be covered by HIPAA. There was no consensus on the type of regulation, if any, needed.

Ms. Sweeney’s presentation, while highlighting the maxim that transparency establishes trust, documented the flow of consumer health data provided to hospitals, noting that consumer health data may flow – and often does flow – from hospitals to entities that are not covered by HIPAA. Additionally, although de-identified when sold, this information may be easily re-identified. Mr. Ho presented the results of an FTC study on the health information collected and transmitted by 12 mobile apps and two wearables. While the Commission did not review privacy policies, the study results revealed that the apps transmitted consumer health information to 76 third parties, many of which collected device information or persistent device identifiers (sometimes from multiple apps) and additional information, such as gender, zip code, and geolocation. Mr. Ho stated that there are significant privacy concerns when health data is capable of being aggregated.

The panel, moderated by two FTC Division of Privacy and Identity Protection attorneys, featured Dr. Christopher Burrow, the Executive Vice President of Humetrix, Joseph Lorenzo Hall, Chief Technologist for the Center for Democracy and Technology, Sally Okun, Vice President for Advocacy, Policy and Patient Safety at PatientsLikeMe, and Joy Pritts, Chief Privacy Officer in the Department of Health & Human Services’ Office of the National Coordinator for Health Information Technology. The panelists spent a significant amount of time discussing the various entities covered – and not covered – by HIPAA, as well as the array of health-related websites and apps that are available to consumers. Some of the concerns raised were: (1) the potential for sensitive health information to be shared in ways consumers would not reasonably anticipate (and the inability to predict what consumers may deem “sensitive”); (2) the lack of a standard definition of “de-identified data”; (3) the potential for data re-identification; and (4) the ever-expanding definition of what constitutes “health” information.

Information on the seminar, including a transcript, is available here, and the FTC is accepting comments until June 9.