Ad Law Access
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access
Updates on advertising law and privacy law trends, issues, and developments

FTC Proposes Changes to COPPA Rule: What Businesses Need to Know
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-proposes-changes-to-coppa-rule-what-businesses-need-to-know
Tue, 26 Dec 2023 11:00:00 -0500

The Federal Trade Commission (FTC) announced a Notice of Proposed Rulemaking (NPRM) to amend the Children’s Online Privacy Protection Act Rule (COPPA Rule). The COPPA Rule applies to operators of websites and online services that are directed to children under 13 or that have “actual knowledge” they are collecting personal information from children under 13. It imposes notice, consent, data security, and data minimization requirements. Below we summarize highlights from the rulemaking.

Significant Changes

The FTC proposes to modify many of the Rule’s provisions. Some of the proposed changes could have far-reaching practical effects for companies assessing COPPA’s applicability and working to achieve COPPA compliance.

  • Parental Consent for Third-Party Disclosures: The FTC’s proposal includes a major change to parental consent requirements, which currently permit a single consent for the collection, use, and disclosure of child personal information. Under the proposed Rule, when verifiable parental consent is required, operators would need to obtain two separate consents – one for collection and use, and one for disclosure to third parties – except where such disclosure is “integral to the nature of the website or online service.” This proposal could require companies not only to manage the new requirements, but also to address potential parental confusion created by a bifurcated model.
  • Internal Operations Exception: The FTC proposes to modify the Rule’s internal operations exception to the general requirement that operators obtain parental consent. Under the proposal, an operator would be prohibited from using a persistent identifier collected under the internal operations exception “in connection with processes, including machine learning processes, that encourage or prompt use of a website or online service.” By way of example, the FTC noted this new limitation would “prohibit operators from using or disclosing persistent identifiers to optimize user attention or maximize user engagement with the website or online service, including by sending notifications to prompt the child to engage with the site or service.” How this new limitation would be reconciled with internal operations activities that companies currently use to operate their sites, including for personalization, is an important question to be answered.
  • School Authorization Exception: The rulemaking includes an entirely new parental consent exception for education technology. In certain circumstances, an operator may rely on the consent of a school, as opposed to a parent, in collecting child information. The proposal specifies a number of requirements that must be satisfied to rely on the School Authorization exception, including: (1) that the information collected can only be used for a school-authorized education purpose and not a commercial one; (2) a written agreement between the authorizing school and the operator with specific terms; (3) direct supervision by the school over the operator’s use, disclosure, and maintenance of the personal information; (4) that the operator must post an online notice with prescribed information; and (5) the ability for the school to review and request deletion of any child personal information. Although the FTC’s proposal makes clear that the new School Authorization exception is merely a codification of existing guidance, both schools and operators will need to address issues left open under the exception, such as what activities fall within the educational purpose limitation.
  • Data Security Obligations: The FTC’s proposal adds more granular and prescriptive terms to COPPA’s existing requirement that operators establish and maintain “reasonable procedures” to protect children’s data. Such reasonable procedures would include establishing, implementing, and maintaining a comprehensive written children’s data security program. They would also include obtaining written assurances from third parties to which child personal information is disclosed that those parties can satisfy the requirements to secure and protect it.
  • New Parental Consent Options: The FTC proposes allowing parents to provide consent through text messages, knowledge-based authentication, and facial recognition technology. The FTC is also proposing to eliminate the monetary transaction requirement for obtaining consent through a parent’s use of a credit card, debit card, or online payment system. Under this proposal, the parent would simply need to enter payment information and no charge would be transacted.
  • Defining “Directed to Children”: The FTC is proposing to add “a non-exhaustive list of examples of evidence” it would use in its assessment of whether a site or online service is directed to children, including “marketing or promotional materials or plans, representations to consumers or to third parties, reviews by users or third parties, and the age of users on similar websites or services.” It also asks whether websites or online services should be able to rebut a determination that they are directed to children through an audience composition analysis.
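On that last question, an audience composition analysis is, at bottom, simple arithmetic over measured user ages. The sketch below is purely illustrative – the 20% threshold and the sample data are our own assumptions, since the NPRM does not fix a rebuttal standard:

```python
# Hypothetical illustration of an audience composition analysis. The 20%
# threshold is purely an assumption for this example; the NPRM does not
# fix a number, and any rebuttal standard would be set by the final Rule.

def child_audience_share(age_counts: dict[int, int]) -> float:
    """Return the fraction of measured users who are under 13."""
    total = sum(age_counts.values())
    under_13 = sum(n for age, n in age_counts.items() if age < 13)
    return under_13 / total if total else 0.0

# Assumed survey data: user age -> number of users measured at that age.
sample = {10: 40, 12: 60, 15: 300, 25: 600}
share = child_audience_share(sample)
print(f"Under-13 share: {share:.1%}")  # -> Under-13 share: 10.0%
print("Rebuttal plausible" if share < 0.20 else "Likely child-directed")
```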

Anticipated Clarifications

The NPRM also proposes important clarifications that have been previewed in recent COPPA settlements and guidance.

  • Data Retention: The NPRM would expand the Rule’s express data retention and deletion requirements, making clear that “personal information collected online from a child may not be retained indefinitely” or used for a “secondary purpose” beyond the purpose(s) for which the information was collected. The FTC also proposes to require that companies establish internal policies to limit data retention and disclose their data retention policies in their COPPA privacy policies.
  • Defining “Personal Information” to Include Biometrics: The FTC’s proposal would expand the definition of personal information to include biometric identifiers that can be used for the automated or semi-automated recognition of an individual, reasoning that “biometric recognition systems are sufficiently sophisticated to permit the use of identifiers . . . to identify and contact a specific individual either physically or online.”
  • Voice Commands: The FTC proposes to largely codify its 2017 policy statement regarding COPPA and voice recordings (which comes up in the context of smart home assistants or similar technology). The FTC proposes that as long as a business uses the audio file to respond to a specific request and does not (1) use the information for another purpose, (2) disclose the information, or (3) retain the information after responding, COPPA direct notice and consent requirements do not apply.
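To make the contours of that exception concrete, here is a minimal sketch of the audio-handling lifecycle the FTC describes – use the recording solely to respond to the specific request, then delete it. The helper functions are stand-ins of our own, not any real speech API:

```python
# Illustrative sketch of the audio-file lifecycle described above: use the
# recording solely to respond to the child's specific request, with no other
# use, no disclosure, and no retention afterward.

import os
import tempfile

def transcribe(path: str) -> str:
    return "what is the weather"  # stand-in for a speech-to-text step

def answer_request(text: str) -> str:
    return "It looks sunny today."  # stand-in for fulfilling the request

def handle_voice_request(audio_bytes: bytes) -> str:
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(audio_bytes)  # held only for the duration of the response
        path = f.name
    try:
        return answer_request(transcribe(path))
    finally:
        os.remove(path)  # no retention once the response is produced
        # Nothing is logged, shared, or reused for another purpose.

print(handle_voice_request(b"\x00\x01"))
```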

Open Questions

Finally, the FTC asks a series of questions regarding its approach to COPPA rulemaking. Some of the questions that stood out to us include:

  • Screen and User Names: The FTC asks whether it should expand the definition of “personal information” to include screen or user names that do not allow contacting an individual, despite the fact that the FTC’s rulemaking authority to define “personal information” is limited to identifiers that can be used to “contact[] . . . a specific individual.” The NPRM reasons that children may use the same screen or user name across websites or online services, and they may be able to be contacted on platforms not controlled by the operator.
  • Limiting Personalization and Contextual Advertising: The existing COPPA Rule permits businesses to use persistent identifiers for internal operations without parental consent. The FTC asks whether certain types of personalization and contextual advertising remain appropriately categorized as internal operations. For example:
    • The NPRM explains that personalization that is “user driven” may be permitted under the internal operations exception, but it asks whether personalization that is “driven by an operator” and that is designed to “maximize user engagement” should be permitted.
    • Similarly, the NPRM questions whether to continue to permit contextual advertising under this exemption. It explains, “given the sophistication of contextual advertising today, including that personal information collected from users may be used to enable companies to target even contextual advertising to some extent, should the Commission consider changes to the Rule’s treatment of contextual advertising?”
  • Role of Platforms: The NPRM asks whether platforms can play a role in establishing consent mechanisms to enable obtaining verifiable parental consent. In particular, the FTC states it would be interested in understanding the benefits that platform-based consent mechanisms would create for businesses and parents.

* * *

For 60 days after publication in the Federal Register—which should occur in the next couple of weeks—the FTC is accepting public comment on the proposed changes to the Rule and the questions the agency raises in its NPRM. We will continue to monitor developments.

Kids’ Privacy and Safety Redux: Amended KOSA and COPPA 2.0 Advance By Voice Vote
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/kids-privacy-and-safety-redux-amended-kosa-and-coppa-2-0-advance-by-voice-vote
Tue, 01 Aug 2023 00:00:00 -0400

Last year, the Senate Commerce Committee marked up two bipartisan bills to protect kids’ privacy and safety – the Kids Online Safety Act (KOSA), and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) – amidst high hopes that the bills would get a vote on the Senate floor. With comprehensive privacy legislation still tripped up over preemption and private rights of action, policymakers thought that legislation to protect kids would have the best chance of passage. The bills never made it to the floor, however, and they died in the 117th Congress.

This year, the bills’ sponsors are trying again and, on July 27, the Committee marked up amended versions of both bills. (The markups came on the heels of President Biden once again urging passage of these bills in public remarks.) The amendments to the bills address policy concerns that various groups have continued to raise since the bills were introduced last year. We watched the July 27 hearing to see what we might learn about the prospects for the bills’ passage in 2023.

Brief Background on the Bills, as Amended

For those in need of a reminder, KOSA (sponsored by Senators Blumenthal (D-CT) and Blackburn (R-TN)) is a kids’ and teens’ safety bill, designed to reduce harmful content on social media and to give minors and parents more tools and controls to block or filter such content. The bill has strong support from members of the child safety, medical, and consumer advocacy communities. At the same time, however, other consumer advocates, as well as the tech community, have criticized features of the bill that they believe would block minors’ access to content (including about LGBTQ+ and abortion issues) and/or potentially require the collection of more data from or about minors to determine their age. To address these concerns, the new version of the bill amends various definitions, as well as the standard governing when companies are expected to know that a user is a minor, among other changes.

COPPA 2.0 (sponsored by Senators Markey (D-MA) and Cassidy (R-LA)), by contrast, is a privacy bill, the primary purpose of which is to extend privacy protections to teens 13 through 16 and change COPPA’s “actual knowledge” standard so that websites and apps have greater obligations to know when they are dealing with minors. As with KOSA, various groups have expressed concern about whether the proposal would lead to restrictions on content available to minors. And as with KOSA, the new version of the bill includes various changes, including revisions to the knowledge standard proposed in last year’s version.

The Markup

The markup this year was part of a full Committee Executive Session considering multiple bills on a range of topics. At the session, the Committee approved both of the amended kids’ bills, as well as several additional amendments to each of them (most of which were relatively minor). While there wasn’t extensive discussion surrounding these bills, Committee Members took the opportunity to highlight the importance of kids’ privacy and safety, as well as future actions that they’re contemplating in this area. Here’s our rundown of notable moments from the hearing:

Chair Cantwell (D-WA) led the session, addressing the multiple bills being considered (including, e.g., legislation on the topics of satellite waste in space and AM radio capabilities in cars). With regard to children’s privacy, she described COPPA 2.0 as a “vital upgrade” to protect minors, closing loopholes and protecting teens 13 through 16. KOSA, she explained, is also long overdue, as there is an ongoing mental health crisis related to social media’s impact on children. Cantwell acknowledged, however, that there are still outstanding concerns among groups who would be affected by the legislation (including members of the LGBTQ+ community), requiring additional work before the bills reach the floor. She added that these bills are “not the last” of the privacy issues that the Committee will consider and that she hopes, when the Committee returns in September, that it will consider other privacy issues as well.

Ranking Member Cruz (R-TX) offered support for both bills advancing out of the Committee. The Internet hasn’t come without cost, he said, especially for children. He further chided Big Tech companies for failing to protect children or give parents the safeguards and controls they need to protect their kids. Finally, he suggested adding a potential preemption clause to KOSA, as multiple states have passed laws that may be inconsistent with parts of the bill.

Senator Schatz (D-HI) gave an impassioned speech about protecting children, stating that we are in a crisis – an epidemic of teen mental illness due to the algorithmic boosting of harmful content online. He cited a study from the CDC showing that two thirds of high-school girls feel persistently sad or hopeless, and that 22% of all high-school students have seriously considered suicide. Due to his concerns about these issues, Schatz initially offered an amendment to KOSA – his Protecting Kids on Social Media Act – which would prohibit users under the age of 13 from accessing social media platforms, require parental consent for children 13 through 17, and ban the recommendation of content using algorithms to all minors under 18. However, he later withdrew his amendment, noting “productive conversations” he’d had with Cantwell and Cruz. For their part, Cantwell and Cruz expressed eagerness to work with Schatz on these issues in the fall.

Senator Thune (R-SD) offered an amendment to KOSA that would require platforms to notify users if they are using an algorithm, which passed by voice vote. Senators Blackburn (R-TN) and Klobuchar (D-MN) both expressed frustration about how long it has taken to address kids’ privacy and safety issues and said now is the time for action. Other Members, such as Senators Sullivan (R-AK) and Welch (D-VT), expressed support for the bills and their commitment to protecting children’s safety. Finally, Senator Markey (D-MA) confirmed his commitment to address concerns raised by the LGBTQ+ community and others, but expressed confidence that he would be able to resolve them.

* * *

Bottom line: While the Committee approved both bills, there will likely be more changes before either bill reaches the Senate floor. Further, while President Biden and Senate Majority Leader Schumer (D-NY) have both stated (at times) that these bills are a priority, the clock in the 118th Congress is ticking.

FTC Attempts End Run to Ban Meta from “Monetizing” Minors’ Data
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-attempts-end-run-to-ban-meta-from-monetizing-minors-data
Thu, 04 May 2023 14:50:36 -0400

The FTC took unprecedented action yesterday when it moved to impose what it describes as a “blanket prohibition” preventing Meta from monetizing young people’s data. The FTC contends that this prohibition (set out in a proposed order modification, the “Proposed Order”) is warranted as a result of Meta’s repeated violations of its 2020 consent order.

In taking this action, the FTC is relying on its administrative authority to “reopen and modify” orders to address alleged order violations, rather than pressing its compliance case in federal court under the FTC Act. In doing so, the FTC seeks to significantly expand the scope and duration of the existing order to cover new conduct. Even against recent examples of aggressive FTC action (see examples here, here, and here), this one markedly stands out. And, in the face of mounting agency losses in challenges to its enforcement authority in Axon and AMG and their aftermath, the Proposed Order is extraordinary.

The Commission voted 3-0 to issue the Proposed Order and accompanying Order to Show Cause. Commissioner Bedoya issued a statement expressing reservations about the “monetization” restrictions described below, specifically questioning whether the provision concerning minors’ data is sufficiently related to either the 2012 or 2020 violations or orders. Meta has 30 days to answer the FTC’s proposal.

Order to Show Cause

The FTC’s 2020 Consent Order, which was obtained in federal court consistent with prior Commission practice, was itself a modification of a 2012 order. If the FTC adopts the Proposed Order, it would be the third order stemming from a single administrative complaint that was filed more than a decade ago. That alone sets the FTC’s action apart from any other Commission action in memory.

The heavily redacted Order to Show Cause alleges that Meta violated several obligations under the 2020 Consent Order. The FTC did not release its Preliminary Finding of Facts, but it is evident that the first report filed by the independent assessor, Protiviti, under the 2020 Consent Order is the underlying source behind many of the FTC’s allegations. It is notable that the only unredacted conduct relates to practices that predate entry of the 2020 order – which is strange, given that the 2020 order contained terms broadly releasing Meta from all pre-2020 order violations.

Specific alleged order violations include deficiencies in risk assessment and third-party risk management processes, security controls, and transparency practices, among others. The Order to Show Cause also asserts that Meta misrepresented the extent to which third-party developers would have access to users’ nonpublic information. The FTC acknowledges that Meta corrected one of these alleged instances by July 2019, but nonetheless alleges that Meta violated the 2012 Consent Order, Section 5 of the FTC Act, and the COPPA Rule (a Rule not included in the prior orders) during this time period. This, of course, raises the question of why the FTC is moving on this now, fully four years after it was corrected by Meta.

The Proposed Order

The FTC’s Proposed Order would expand the 2020 Consent Order by permanently prohibiting Meta from “[c]ollecting, using, selling, licensing, transferring, sharing, disclosing, or otherwise benefitting from Covered Information from Youth Users” except for specific purposes, such as operating a service, performing authentication, or maintaining security. “Youth Users” include not only children under the age of 13 but also minors who are ages 13 through 17.

This provision specifically prohibits using Youth Users’ information for targeted advertising or to train or improve algorithms or models. Although the FTC’s press release focuses on stopping Meta from “monetizing” minors’ data, the Proposed Order goes further by prohibiting Meta from “benefitting” from minors’ data, except as permitted by this paragraph.

The Proposed Order also would require specific safeguards and assessment requirements concerning Youth Users and “enhanced monitoring of higher risk Covered Third Parties” at least once per year.

And, remarkably, it would prohibit Meta from releasing new or modified products, services, or features without written confirmation from the assessor that the Meta privacy program complies with the order’s requirements and presents no material gaps or weaknesses. This is an extraordinary provision that would essentially turn the independent privacy assessor into the master of all new launches on Facebook, Instagram, WhatsApp, and Oculus, among other services.

Why Isn’t this in Federal Court?

The FTC’s authority to reopen an administrative order stems from Section 5(b) of the FTC Act:

[T]he Commission may at any time, after notice and opportunity for hearing, reopen and alter, modify, or set aside, in whole or in part any report or order made or issued by it under this section, whenever in the opinion of the Commission conditions of fact or of law have so changed as to require such action or if the public interest shall so require, . . .

In the past, the FTC has touted how it eases conditions in its orders in response to changes in legal or factual circumstances, usually in response to a respondent’s petition. One relatively recent example comes from 2018, when the FTC granted Sears’ petition to modify its 2009 order to exempt certain first-party, mobile app-based data collection from the order’s opt-in consent requirements. The FTC agreed to modify the definition of “Tracking Application” to exclude software programs that only engage in types of tracking consumers have come to expect, citing changes to the mobile application marketplace that make the collection and transmission of certain types of consumer data critical to support application features expected by consumers. In light of market realities and consumer expectations, the FTC recognized that the original notice and consent requirements were burdensome, unnecessary, counterproductive, and potentially confusing to consumers, who might mistakenly fear that Sears’ applications were unusual or used consumer data in unusual ways.

This decidedly is not what is happening here. The FTC is leveraging § 3.72(b) to attempt to impose new and onerous obligations – without having to make its case in federal court – based on what it perceives as changed circumstances, not to ease an order obligation as warranted by changed facts and the public interest.

What Happens Next?

The FTC’s Rules of Practice provide scant details about what happens next. According to 16 C.F.R. § 3.72(b), after receiving an answer from Meta, the FTC may determine whether the matter “raises issues of fact to be resolved” and order a hearing. If the briefs for a hearing raise “substantial factual issues,” the Commission may order an evidentiary hearing. It is then up to the Commission to determine whether modifying the order is “in the public interest” – a determination that a court of appeals may review.

At this point, the reach of any such modification is anyone’s guess. The Order to Show Cause asserts that the “changed conditions” include not only violations of FTC orders but also violations of “Section 5, COPPA, and the COPPA Rule,” and that it has “good cause to believe the public interest” and these “changed conditions” require modifying the 2020 Consent Order.

In the end, it may be up to a federal court of appeals to determine whether these assertions are correct. It is also possible, however, that the Supreme Court’s recent decision in Axon clears a path to an early challenge to the Proposed Order in federal district court. In a statement released on the same day as the FTC’s announcement, Meta stated that “[w]e will vigorously fight this action and expect to prevail.”

Two Epic Cases from the FTC: Spotlight on COPPA, Unfairness, Teens, Dark Patterns, In-App Purchases, Cancellations, and More
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/two-epic-cases-from-the-ftc-spotlight-on-coppa-unfairness-teens-dark-patterns-in-app-purchases-cancellations-and-more
Wed, 21 Dec 2022 17:45:00 -0500

Just in time for the holidays, the FTC has released two companion settlements resolving allegations that Epic Games (maker of the popular video game Fortnite) violated the Children’s Online Privacy Protection Act (COPPA) and the FTC Act, with Epic to pay $520 million in penalties and consumer redress. The cases build on existing FTC law and precedent but add new dimensions that should interest a wide array of companies subject to FTC jurisdiction.

Notably, the first case alleges COPPA violations (compromising the privacy and safety of users under 13) but adds allegations that Epic violated teens’ privacy and safety, too. And the second case alleges unauthorized in-app purchases – not just by kids, which was the focus of earlier FTC cases, but by users of all ages. Both cases rely on unfairness theories in extending their reach. Both incorporate the (now ever-present) concept of dark patterns (generally defined as practices that subvert or impair user choice). And both got a 4-0 Commission vote, with a strong concurrence from Republican Commissioner Wilson explaining her support for the FTC’s use of unfairness here. Neither case names any individuals.

The privacy case

The FTC’s privacy case alleges that, for over two years following Fortnite’s launch in 2017, Epic allowed kids to register with no parental involvement, and allowed kids and teens to play the game with features enabling them to communicate in real time with anyone on the platform. According to the FTC, these practices subjected kids and teens to bullying, harassment, threats, and “toxic” content, including “predators blackmailing, extorting, or coercing children and teens … into sharing explicit images or meeting offline for sexual activity.” Further, says the FTC, Epic knew about these problems, resisted fixing them and, when it finally took action, added controls that were hard to find and use, and failed to cure the violations.

The complaint includes two counts. First, it alleges that Epic violated COPPA because it operated a website directed to children (based on, e.g., visual content and features, merchandising tie-ins, and audience composition); knew specific users were kids (based on player requests, reports, and complaints); and failed to comply with COPPA’s notice, consent, access, and deletion requirements.

Second, the FTC alleges that Epic engaged in an unfair practice by operating a “ubiquitous, freely available” video game that was directed at children and teens and that, through default settings allowing real-time social interaction, put children and teens at risk of substantial injury.

Under the order, Epic must (1) fully comply with COPPA; (2) delete data collected in violation of COPPA; (3) provide default settings that prevent interaction between minors and other users, unless Epic obtains affirmative express consent from parents or teens or, alternatively, the user identifies as 13 or older through a neutral age gate; (4) implement a privacy program with third party assessments for 20 years; (5) submit annual certifications from Epic’s chief executive (for not just Epic, but certain affiliated companies); and (6) pay $275 million in civil penalties. The order’s definition of “affirmative express consent” prohibits the use of dark patterns.
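The “neutral age gate” in item (3) is a familiar design pattern: ask for a birth date without signaling the qualifying answer or inviting a do-over. A minimal sketch of the idea (our own illustration, not Epic’s actual flow):

```python
# Hypothetical neutral age gate, illustrating the pattern in item (3) above.
# "Neutral" means the prompt doesn't signal that age 13+ unlocks anything,
# and an under-13 answer routes to the parental consent flow rather than
# inviting the user to retry with a different birth date.

from datetime import date

def is_13_or_older(birth: date, today: date) -> bool:
    # Subtract a year if this year's birthday hasn't happened yet.
    had_birthday = (today.month, today.day) >= (birth.month, birth.day)
    return today.year - birth.year - (0 if had_birthday else 1) >= 13

def run_age_gate(birth: date) -> str:
    if is_13_or_older(birth, date.today()):
        return "interactive features enabled"
    # No re-prompt on failure, so the gate can't be gamed by retrying.
    return "parental consent flow"

print(run_age_gate(date(2012, 6, 1)))
```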

What’s new or notable here? For one thing, the case provides further insight into how the FTC analyzes the “directed to children” element of COPPA (and to a lesser extent, “actual knowledge”), with detailed discussion of the factors it considered in the analysis. For another, the penalty is the largest ever obtained in a COPPA case and, according to the FTC, in any FTC rule violation matter. Of perhaps greatest significance, though, is the FTC’s decision to address teen privacy in this case. Indeed, amidst all of the public discussion and concern about teen privacy (and on the same day Congress declined to include kid/teen privacy legislation in the end-of-year omnibus package), the FTC announced a teen privacy case based on its existing FTC Act authority, with a 4-0 vote.

The dark patterns case

The FTC’s second settlement with Epic, framed in the press release as an “illegal dark patterns case,” is strikingly similar to the FTC’s earlier cases against Apple, Google, and Amazon involving unauthorized in-app charges by kids, but with some new elements. (In a prior post, we said that those three cases were essentially dark patterns cases but without the “catchy term.” I guess we were prescient!)

In brief, the complaint here alleges that Epic charged accountholders for purchases that weren’t authorized – either because accountholders weren’t told about, and didn’t authorize, their kids’ purchases, or because they themselves incurred unwanted charges due to poor disclosures and a deliberately confusing purchase flow.

At the same time, the complaint alleges, Epic designed the process for canceling purchases and seeking refunds to be difficult and cumbersome, and even deactivated user accounts (removing all of the user’s content) when users attempted to dispute unauthorized charges. According to the FTC, users incurred billions of dollars in unwanted charges. Further, despite receiving thousands of complaints and acknowledging the issues in internal emails (and even after the FTC took action against Apple, Google, and Amazon for similar practices), Epic failed to correct the problem.

The complaint contains two counts. First, it alleges that Epic engaged in unfair billing practices by charging users for in-app purchases without express informed consent from the accountholder. Second, it alleges that Epic unfairly denied consumers access to their accounts after they disputed unauthorized charges.

The order, in turn: (1) prohibits charging any user without express informed consent; (2) in the case of consent for continuing charges, requires that consumers be able to revoke consent at any time, using a mechanism that isn’t difficult, costly, confusing, or time-consuming, and is as simple as the mechanism used to initiate the charges; (3) enjoins Epic from denying someone access to their account “for reasons that include” disputing a charge; and (4) obtains $245 million in refunds.

What’s new or notable in this case? First, as already mentioned, the case extends not just to obviously unauthorized in-app purchases by kids, but also to purchases by older users. Second, the FTC is once again focusing on ease of cancellation (see our post on the FTC’s Vonage settlement), requiring that cancelling recurring charges be just as simple and frictionless as signing up. Third, the FTC appears to be saying that deactivating accounts following the dispute of charges is per se illegal (and, due to the broad wording of the injunction, that companies can never cancel an account for this or any other reason).

Finally, this is an example of the FTC’s continuing ability to obtain consumer redress post AMG. In a case like this, where no rule violations have been alleged, the FTC would normally be forced to pursue redress using a two-step process – an administrative action, followed by a federal district court action. Here, Epic has simply agreed to pay redress in one step.

Why two cases and not one?

Some readers might be wondering why the FTC split this matter into two cases. Under the FTC Act, the agency must refer any civil penalty case (here, the COPPA/privacy case) to the Department of Justice for filing in federal district court. By contrast, as discussed above, the FTC must initiate an administrative action to obtain redress in non-rule matters (here, the dark patterns case). While there may be some theory for consolidating both cases into one DOJ action, that would be exceedingly complicated – even more so than the shortcut the parties agreed to here. These cases were also handled by two different FTC divisions, which also may have weighed in favor of bifurcation.

* * *

One last thing – as if the cases themselves weren’t enough to digest, it’s worth taking a look at Epic’s post on the topic, explaining that the laws haven’t kept pace with technological developments but fully embracing the principles and requirements laid down in the settlements.

Blurred Lines: A Rundown on the FTC Workshop “Protecting Kids from Stealth Advertising in Digital Media”
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/blurred-lines-a-rundown-on-the-ftc-workshop-protecting-kids-from-stealth-advertising-in-digital-media
Wed, 26 Oct 2022 07:36:48 -0400

As we recently blogged here, the FTC’s review of the COPPA rule has been pending for over three years, prompting one group of Senators, in early October, to ask the agency to “Please Update the COPPA Rule Now.” The FTC has not yet responded to that request (at least not publicly) or made any official moves towards resuming its COPPA review. However, the agency is focusing on children’s privacy and safety in other ways, including by hosting a virtual event on October 19 on “Protecting Kids from Stealth Advertising in Digital Media.”

The FTC’s day-long event examined how advertising that is “blurred” with other content online (“stealth advertising”) affects children. Among other things, the event addressed concerns that some advertising in the digital space – such as the use of influencers on social media, product placement in the metaverse, or “advergames” – can be deceptive or unfair because children don’t know that the content is an ad and/or can’t recognize the ad’s impact.

The event focused in particular on: (1) children’s capacity at different ages to recognize advertising content and distinguish it from other content; (2) harms resulting from the inability of children to recognize advertising; (3) what measures can be taken to protect children from blurred advertising content; and (4) the need for, and efficacy of, disclosures as a solution for children of different ages, including the format, timing, placement, wording, and frequency of disclosures. The FTC has also sought public comment on these topics (until November 18).

The event dove deeply into these issues, with help from a range of legal, policy, behavioral, and communications experts. (See here for the agenda and list of panelists.) The discussion was interesting and substantive, and built on actions already undertaken in Europe and California to develop Age-Appropriate Codes governing child-directed content. However, the event left open the question of whether and how the FTC intends to address the issues discussed. Will it proceed via guidance or rulemaking? If rulemaking, does it plan to use COPPA, the pending Mag-Moss rulemaking on “commercial surveillance,” or some other regulatory vehicle?

All of these options present challenges: COPPA gives parents the tools to control the content that their children see, but generally doesn’t regulate the content itself. Mag-Moss is a long process, which the FTC has made especially complex with its sprawling ANPR. Finally, any rulemaking restricting kids’ advertising could run into the specific Mag-Moss provision (discussed here) limiting the FTC’s regulatory authority in this area. (On the other hand, protecting kids’ privacy and safety tends to be a bipartisan issue, which will assist the agency as it seeks to address these issues.)

Here’s more detail on what happened at the workshop:

First up, Opening Remarks from FTC Chair Lina Khan

In her opening remarks, Chair Khan set the stage by describing how much advertising has changed over the past few decades. In the past, every child would see the same ad, but now, the digital space allows companies to treat each child as an audience of one. Also, ads now blur commercial and organic content, and kids can’t tell the difference. Chair Khan stated that the FTC is exploring whether to update its COPPA rule, while also soliciting comments on commercial surveillance more broadly, including stealth advertising to kids.

Next, CARU VP Mamie Kresses provided a “Children’s Advertising Show and Tell”

Kresses (who co-ran the COPPA program when she was at the FTC) explained that CARU (the Children’s Advertising Review Unit at BBB’s National Programs) has increased its focus on monitoring ads to children in the digital space because that’s where the majority of ads now are. Advertisers need to make sure that, when they engage in blurring, they don’t mislead kids about the nature of the commercial content. Advertisers also should avoid manipulation – i.e., making it hard for children to tell when they’re making purchases or leaning too much on their emotions.

Importantly, Kresses said, the digital space has changed with the creation of computer-generated imagery influencers. Advertisers should make clear to kids that a game avatar, for example, is part of a paid relationship. In general, advertisers must clearly disclose whether something is an ad, even in these new, creative spaces.

Panel One: Children’s Cognitive Abilities – What do they know and when?

In this discussion, panelists highlighted why protecting children in the digital space is so important. Children lack the skills to understand the persuasive effects of advertising and also tend to believe that companies have their best interest in mind. When entertainment and commercial content are blurred (e.g., when a virtual reality character gives a child something in the metaverse, or an influencer promotes a product), the child cannot tell that these are ads and takes for granted the content is all good or true. In these spaces, children develop para-social relationships and emotional attachments with content creators and influencers, which affects their ability to evaluate ads and cues. As one panelist stated, the naiveté of children should not be a tool for advertisers.

Panel Two: The Current Advertising Landscape and its Impact on Kids

This panel primarily discussed whether stealth advertising is an unfair practice under the FTC Act. Citing the elements of unfairness, some thought that the harm outweighed the benefits, while others believed the research was not strong enough to prove harm, and that any harm is outweighed by the value of the information conveyed by the ads.

According to some panelists, blurred advertising can distract a child from the persuasive intent of an ad, causing them to rely more on emotion and less on rationality. Also, they said, research suggests that some methods of blurred advertising, such as the use of influencers, can be toxic to children, increasing eating disorders and adding to the current mental health crisis, especially when the advertising is targeted and prolonged. Other panelists argued that just because an advertisement works does not mean it’s harmful or deceptive. They also said that non-deceptive ads are protected under the First Amendment.

Panel Three: Looking Forward and Considering Solutions

The last panel discussed potential solutions to the challenge of stealth advertising, including disclosures, parental controls, educational programs, or even a ban on blurred advertising directed at children. As these panelists recognized, the solutions all come with limitations: (1) children cannot always read or understand disclosures; (2) parents don’t have the time or resources to monitor every piece of content their child consumes; (3) there’s a lack of resources for educational programs; and (4) a ban could face First Amendment issues.

Closing Remarks from FTC Associate Director Serena Viswanathan

In closing, Viswanathan stated that the FTC hopes to provide guidance and recommendations regarding how to comply with applicable laws and avoid problems associated with stealth advertising to kids. She said the FTC is following these issues with interest, eager to review the public comments, and continuing to engage with stakeholders.

* * *

That’s our quick summary for now. Stay tuned as we continue to track this topic and learn about the next steps the FTC may be planning in this area.


Congress to FTC: “Please Update the COPPA Rule Now”
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/congress-to-ftc-please-update-the-coppa-rule-now
Tue, 04 Oct 2022 08:13:31 -0400

Amidst all of the recent news and developments about the privacy of kids and teens (including multiple Congressional hearings; Frances Haugen’s testimony; enactment of the UK’s and California’s Age Appropriate Design Codes; the Irish DPC’s GDPR decision against Instagram; numerous bills in Congress; and the FTC’s ongoing focus on kids’ privacy in policy statements, workshops, and its “commercial surveillance” rulemaking), the FTC still has a powerful tool that seems to be sitting on the back-burner: the Children’s Online Privacy Protection Act (COPPA) and its implementing rule.

But some members of Congress just wrote a letter to the FTC, asking it to make COPPA a priority.

Background on COPPA

As most of our readers know, COPPA protects the privacy of kids under 13, mostly by requiring kid-directed websites or apps, or sites/apps that have actual knowledge they’re dealing with kids, to get parental permission before collecting, using, or sharing kids’ data. Enacted in 1998, COPPA is now nearly 25 years old, a dinosaur in today’s fast-moving world of privacy. However, using the APA rulemaking authority granted in COPPA, the FTC has amended its COPPA rule to ensure that it keeps pace with developments – for example, extending the rule to ad networks and plug-ins; adding geolocation, persistent identifiers, photos, and videos to the definition of “personal information”; and strengthening the rule’s requirements governing data security, retention, and deletion.

However, those updates to COPPA became final in 2013 – almost ten years ago – and the FTC hasn’t amended the rule since then. Although the FTC initiated a rule review in July 2019, that review is still pending more than three years later. According to Regulations.gov, the Commission received over 176,000 public comments in the rule review. That’s a lot of comments, but it surely can’t explain such a lengthy delay.

Why hasn’t the FTC moved forward here?

There are likely a few reasons. First, as Commissioner Bedoya reportedly stated at a recent conference, the FTC is hoping that Congress updates the law – whether through amendments to COPPA (aka “COPPA 2.0”) or enactment of general privacy legislation – before the FTC must decide if and how to revise its COPPA rule. This is because Bedoya and other champions of kids’ privacy believe that fundamental changes to COPPA are needed – changes that go beyond what the FTC can do via rulemaking. Such changes include, for example, extending protections to teens or “tweens”; banning certain practices, like targeted advertising to minors; and changing the knowledge standard for general audience sites/apps from “actual knowledge” to “constructive knowledge.”

Second, the FTC appears to be considering whether it can expand kids’ and teens’ privacy protections through its FTC Act/Mag Moss authority – i.e., without having to rely on the COPPA statute. As we blogged a few weeks ago, the FTC’s “commercial surveillance” ANPR includes numerous questions about kids and teens that extend well beyond the FTC’s authority under COPPA, presumably in reliance on the underlying authority for the rulemaking (i.e., the FTC Act and Mag Moss). However, as we mentioned in the blogpost, there are obstacles to doing so, which the FTC is likely mulling. For one thing, the FTC’s power to expand kids’ protections through Mag Moss is limited. Indeed, Mag Moss requires proof that any practice to be regulated is “unfair or deceptive” but includes a specific provision restricting the FTC’s ability to regulate kids’ advertising using unfairness. (Advertising and privacy aren’t exactly the same thing, but there’s a big overlap.) For another thing, Congress and/or the courts might look askance at efforts by the FTC to “fill gaps” in COPPA using its general FTC Act/Mag Moss authority.

Third, the FTC (and its roughly 50-person privacy division) may simply have its hands full with all of the tasks it has undertaken in privacy – including the “commercial surveillance” rulemaking; the upcoming workshop on “stealth advertising” directed to kids; the still-pending 6(b) study on social media and video streaming services (which included pointed questions regarding kids’ privacy); the “crackdowns” it has announced on EdTech, dark patterns (see here and here), and the misuse of sensitive health and location data; and other ongoing enforcement and policy demands.

Congressional letter

The above context makes the letter that four Democratic members of Congress (Senators Markey and Blumenthal and Representatives Castor and Trahan) sent to the FTC last week all the more interesting. The letter, which appears to be a response to Commissioner Bedoya’s statement that the FTC is waiting for Congress to act, essentially says “please don’t wait for us” and “we expect you to move forward on COPPA.” These members are all key players in kids’ privacy: Markey is the architect of the original COPPA statute and all four have sponsored bipartisan bills to expand kids’ and teens’ privacy (with the two bills in the Senate getting markups and votes out of the Senate Commerce Committee). However, they see time running out in this Congress for passage of their bills, and potentially time running out for the FTC before the 2024 election – and they don’t want the opportunity for kids’ privacy reform to slip away.

In particular, the letter commends the FTC for including questions about “surveillance threats to young users” in its “commercial surveillance” ANPR, and recognizes that Congress has a “responsibility to pass strong legislation” protecting kids. However, the letter stresses that the FTC also has a duty to “use its regulatory authority [under COPPA] to institute additional protections that address pressing threats online, a process the Commission has already begun.” According to the letter, such additional protections include expanding the scope of personal information covered by COPPA; fleshing out the prohibition on conditioning a child’s participation in an activity on excessive data collection; and updating and expanding protections related to platforms and online advertising.

Perspective

What does this mean? At one level, this finger-pointing exchange illustrates why state legislatures are moving forward more swiftly than Congress on privacy. At another, it provides some insight on the status of privacy legislation and the COPPA rulemaking. In particular, it suggests that (1) even the ardent privacy hawks in Congress don’t expect privacy legislation to pass during this session, and are now recognizing that publicly, and (2) FTC action on the COPPA rule might resume when the session in fact ends without passage of a privacy law (especially if Congress continues to send letters to the FTC like this one). We will continue to track all of these developments and dimensions here.

FTC Announces “Crack Down” on COPPA Violations by Ed Tech Companies
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-announces-crack-down-on-coppa-violations-by-ed-tech-companies
Tue, 24 May 2022 12:44:35 -0400

Amidst the rising focus on privacy issues affecting children and teens (which we’ve highlighted here, here, here, and here), the FTC just released a new Policy Statement on COPPA, its signature rule protecting the privacy of kids under 13. The Policy Statement, which the FTC unveiled at its May 19 Open Meeting, focuses in particular on COPPA’s application to education technologies used in and by schools to support learning (including remote learning during the pandemic). All five Commissioners voted for the Statement, including newly sworn-in Commissioner Bedoya, and four issued their own written statements. After the meeting, a bipartisan group of Senators, as well as President Biden, released statements praising the FTC’s actions.

While the FTC’s Republican Commissioners questioned whether there was anything really new in the Policy Statement (which was based on longstanding COPPA provisions, as well as FAQs posted on the FTC’s website), all seemed to agree that it elevates the issues highlighted and shows that COPPA is a top FTC priority.

And of course it is! Protecting kids and their data is one privacy issue that most people, regardless of professional or political affiliation, support. Further, under COPPA, the FTC can seek monetary relief (even post-AMG) and conduct rulemaking under the Administrative Procedure Act, as opposed to under the more cumbersome Mag-Moss process. So it’s not surprising that this issue would be high on the FTC’s agenda during this dynamic and volatile time for privacy.

What does the Policy Statement Say?

The Policy Statement emphasizes that COPPA includes substantive limits on the collection and use of children’s data (not just notice and consent requirements), and says that the FTC intends to fully enforce these provisions, including in school and learning settings where “parents may feel they lack alternatives.”

The Statement focuses in particular on the use of ed tech tools and devices, which have become integral to a range of school activities (especially during the pandemic) but which, per the Statement, raise concerns about data collection, use, and sharing beyond what’s necessary for these activities.

The statement describes COPPA’s substantive limits as follows:

  • Prohibitions Against Mandatory Collection: Covered entities can’t condition participation in an activity on collecting more information from a child than is necessary for that activity. (This prohibition comes right out of the COPPA statute and is echoed in the rule.)
  • Use Prohibitions: Covered entities, including ed tech providers, are “strictly limited” in how they can use data collected from children; for example, ed tech providers that collect kids’ data pursuant to a school’s authorization may use it only for the authorized educational purpose. (This isn’t in the COPPA statute or rule but builds on COPPA guidance and FERPA.)
  • Retention Prohibitions: Covered entities can’t retain personal information from a child longer than reasonably necessary to fulfill the purpose for which it was collected; a sketch of the operational pattern follows this list. (This isn’t in the COPPA statute, but was added to the rule as part of the 2013 amendments.)
  • Security Requirements: Covered entities must have reasonable procedures to maintain the confidentiality, security, and integrity of kids’ data. (This comes from the COPPA statute, and the FTC expanded these duties in the 2013 amendments. See Section 312.8)
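On the retention point, the operational pattern is typically a scheduled purge keyed to the purpose for which each record was collected. Below is a minimal sketch; the record shape and the retention windows are our own assumptions for the example, not figures drawn from the Rule or the Policy Statement:

```python
# Hypothetical retention sweep illustrating the principle above. The Record
# shape and the purpose-specific retention windows are assumptions for the
# example; neither comes from the Rule or the Policy Statement.

from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION_DAYS = {"account": 365, "homework_help": 30}  # assumed policy

@dataclass
class Record:
    child_id: str
    purpose: str
    collected_at: datetime

def purge_expired(records: list[Record], now: datetime) -> list[Record]:
    """Keep only records still within their purpose-specific window."""
    def live(r: Record) -> bool:
        return now - r.collected_at <= timedelta(days=RETENTION_DAYS[r.purpose])
    return [r for r in records if live(r)]

now = datetime(2022, 6, 1)
records = [
    Record("c1", "homework_help", datetime(2022, 3, 1)),  # stale: dropped
    Record("c2", "account", datetime(2022, 1, 15)),       # kept
]
print([r.child_id for r in purge_expired(records, now)])  # -> ['c2']
```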

What are some key takeaways?

  • Kids’ privacy (and advertising) will be a major focus in the coming year. Yeah, this one is obvious, especially since the FTC released the Policy Statement alongside (1) a press release announcing an October 19 workshop on “stealth advertising” directed to children, and (2) proposed updates to the Endorsement Guides, with a new section addressing child-directed endorsements. (The May 23 announcement for Privacy Con also calls for research on privacy risks for kids and teens.) However, it’s worth noting that while the FTC has significant authority to address these issues (under COPPA and the FTC Act), that authority isn’t limitless. By law, COPPA is confined to children under 13, so the FTC can’t use it to address teens – currently a big concern. Further, Congress barred the FTC (long ago) from using its unfairness authority to regulate kids’ advertising. See FTC Act Section 18(h).
  • The provisions highlighted in the Policy Statement aren’t limited to ed tech (for the most part). I say “for the most part” because one of the limits discussed (use limitations) is narrower than the Statement suggests. In particular, the Statement implies that COPPA contains “strict” use limitations that extend to all covered entities. In fact, the COPPA law and rule don’t contain broad use limitations (other than the limits created by notice and consent) – ed tech is a special case, woven together from COPPA guidance and FERPA.
  • Ed tech and other covered entities should assess their compliance now. The FTC is unlikely to be sympathetic to any company caught violating the highlighted provisions. All five Commissioners voted for the Policy Statement; it reiterates longstanding COPPA requirements (mostly – see above); and it has bipartisan support in Congress. Although we may not see the big “crack down” promised in the FTC’s press release (indeed, the FTC has announced a lot of “crack downs” and it has a lot on its plate), the FTC is likely to conduct investigations and bring some cases here.
  • The status of the COPPA regulatory review remains a mystery. A formal review of the COPPA rule has been pending since 2019, and Commissioners Wilson and Phillips (among others) queried why the FTC would issue a policy statement instead of completing that review. Where’s the rule? Neither the Policy Statement nor discussions at the Open Meeting answered that question.
  • The announcement provides clues about the FTC’s future plans. Clearly, the FTC is moving forward to impose more substantive limits on business conduct, as Khan has said it would. That’s evident here, as well as in FTC cases requiring, for example, deletion of data and algorithms as a remedy. Khan and her colleagues have also stated that they want to use unfairness more aggressively (for example, to stop “surveillance” and discrimination) – a strategy that could apply to cases and rulemakings across the FTC’s many program areas. In the coming months, we should expect to see stricter conduct limits imposed (or proposed) in multiple contexts, including in the FTC’s anticipated “surveillance” rulemaking.
  • The Commissioners are worried about staff morale. In the face of crushing reports about the steep drop in staff morale at the agency, all of the Commissioners (in their oral and written remarks) thanked staff profusely for their work in developing the Policy Statement and related announcements. (As well they should.)

We’ll keep the news coming on kids and privacy!

Webinar Replay: Teen Privacy Law Update
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/webinar-replay-teen-privacy-law-update
Fri, 20 May 2022 12:22:16 -0400

The replay for our May 19, 2022 Teen Privacy Law Update webinar is available here.

Protecting the privacy and safety of kids and teens online is receiving enormous attention lately from Congress, the States, the FTC, and even the White House. Further, just last month, BBB National Programs unveiled a Teenage Privacy Program Roadmap offering a comprehensive framework for companies to use in identifying and avoiding online harms impacting teens.

Amidst these developments, Kelley Drye held a webinar to discuss the unique challenges associated with teen privacy. Dona J. Fraser, Senior Vice President Privacy Initiatives, BBB National Programs, and Claire Quinn, Chief Privacy Officer, PRIVO, along with Kelley Drye’s Laura Riposo VanDruff provided an update on key concerns and developments related to teen privacy, as well as practical tips for companies seeking to address these issues.

To view the webinar recording, click here or view it on the new Ad Law Access App.

Subscribe to the Ad Law Access blog to receive real-time updates on privacy and other related matters.

The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.

Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.

For easy access to all of our webinars, posts and podcasts, download our new Ad Law Access App.

Kelley Drye Unveils First-of-its-kind Advertising Law App
Lina Khan’s Privacy Priorities – Time for a Recap https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/lina-khans-privacy-priorities-time-for-a-recap Wed, 16 Mar 2022 11:47:07 -0400

Rumors suggest that Senator Schumer is maneuvering to confirm Alvaro Bedoya as FTC Commissioner sooner rather than later, which would give FTC Chair Khan the majority she needs to move forward on multiple fronts. One of those fronts is consumer privacy, for which Khan has announced ambitious plans (discussed here and here) that have stalled for lack of Commissioner votes. With Bedoya potentially on deck, now seems like a good time to recap those plans, as they might provide clues about what’s in the pipeline awaiting Bedoya’s vote. We focus here on three priorities Khan has emphasized in statements and interviews since becoming Chair.

Privacy Rulemakings

At the top of the list are privacy rulemakings, which could create baseline standards for the entire marketplace and enable the FTC to obtain monetary relief in its cases. (Recall that the FTC has limited authority to obtain money in its cases, especially post AMG, but that it can seek penalties or redress when it’s enforcing a rule.) Last December, Khan issued a Statement of Regulatory Priorities detailing the privacy rulemakings she wants to initiate or complete, including:

  • New rules to halt “abuses stemming from surveillance-based business models,” which could curb “lax security practices” and “intrusive surveillance,” “ensur[e] that algorithmic decision-making does not result in unlawful discrimination,” and potentially limit the use of “dark patterns” to manipulate consumers. (Yes, this is an ambitious one.)
  • Possible amendments to existing privacy rules – including the Children’s Online Privacy Protection Act (COPPA), the Health Breach Notification Rule, the Safeguards Rule (breach notification requirements), and the FACTA Identity Theft Rules (including the Red Flags Rule).
  • Possibly other new rules to “define with specificity unfair or deceptive acts or practices.”

Of note, absent Congressional legislation, any new privacy rules would need to follow the arduous process detailed in Section 18 of the FTC Act (referred to as “Mag-Moss” rulemaking). With Bedoya on board, the FTC can start these rulemakings, but they could still take years to complete, as we discuss here.

By contrast, the FTC can amend its existing privacy rules under the more manageable Administrative Procedure Act. Further, it’s already in the midst of rule reviews for all of the rules listed above (including COPPA’s, which started back in 2019). As a result, the FTC could act on these rules relatively quickly once Bedoya is on board.

Focus on Platforms

Khan has also made clear that she intends to focus on the tech platforms – which she has described as “gatekeepers” that use their critical market position to “dictate terms,” “protect and extend their market power,” and “degrade privacy without ramifications.” In a statement and accompanying staff report last September, Khan stated that such efforts would include:

  • Additional compliance reviews of the platforms currently subject to privacy orders (Facebook, Google, Microsoft, Twitter and Uber), followed by order modifications and/or enforcement as necessary.
  • As resources permit, examining the privacy implications of mergers, as well as potential COPPA violations by platforms and other online services – COPPA being of special importance as children have increasingly relied on online services during the pandemic. (Relatedly, report language accompanying the omnibus budget just signed into law directs the FTC to prioritize COPPA enforcement.)
  • Completion of the pending Section 6(b) study of the data practices of the social media companies and video streaming services, which was initiated in December 2020.

So far, we’ve seen limited action from the FTC on platforms (at least on the consumer protection side). Last October, the FTC issued a 6(b) report on the privacy practices of ISPs, but largely concluded that the topic should be addressed by the FCC. Then, in December, the FTC announced a settlement with online ad platform OpenX for COPPA violations. Given Khan’s bold plans in this area, it seems likely that there are matters in the pipeline awaiting Bedoya’s vote.

Stronger Remedies

The third major area that Khan has highlighted is obtaining stronger remedies in privacy cases – that is, considering “substantive limits”, not just procedural protections that “sidestep[] more fundamental questions about whether certain types of data collection and processing should be permitted in the first place.” By this, Khan is referring to deletion of data and algorithms, bans on conduct, notices to consumers, stricter consent requirements, individual liability, and monetary remedies based on a range of theories post AMG.

As to this priority, the FTC has moved ahead where it can (even prior to Khan’s tenure), often using strategies that have been able to garner unanimous votes. For example, its settlements with photo app Everalbum (for alleged deception) and WW International (for alleged COPPA violations) required deletion of consumer data and algorithms alleged to have been obtained illegally. Its settlement with fertility app Flo Health (for alleged deception about data sharing) required the company to notify affected consumers and instruct third parties that received their data to destroy it. The FTC also has alleged rule violations where possible, and partnered with other agencies to shore up its ability to obtain monetary relief.

But we’ve also seen signs of a more combative approach that could increase when Khan has the votes to push it forward. Of note, last September, the FTC issued an aggressive interpretation of the Health Breach Notification Rule, purporting to extend the rule’s reach (and thus its penalties) to virtually all health apps, even though a rule review was already underway. Further, FTC staff are making strong, often unprecedented demands for penalties, bans, and individual liability in consent negotiations. It’s even possible, based on an article written by former Commissioner Chopra and now-BCP Director Sam Levine, that the agency could attempt to use penalty offense notice letters (explained here) to lay the groundwork for penalties in privacy cases under Section 5(m)(1)(B). However, given the paucity of administratively litigated privacy cases (a key requirement under 5(m)(1)(B)), this would be very aggressive indeed.

* * *

For more on Khan’s privacy plans, you can read our earlier blog posts (here and here), as well as the various FTC statements and reports cited in this post. Or, if you like surprises, you can simply wait for Bedoya to be confirmed and see what happens. Needless to say, things should speed up at the FTC when he arrives.

Privacy Priorities for 2022: Tracking State Law Developments Thursday, March 24, 2022 at 4:00pm ET/ 1:00pm PT Register Here

In the absence of a federal privacy law, privacy has been at the forefront of many states’ legislative sessions this year:

  • Utah is poised to be the fourth state to enact comprehensive privacy legislation
  • Florida came close to passing legislation when the State House advanced privacy legislation by a significant margin
  • Other state legislatures have privacy bills on their calendars

Against this backdrop, state attorneys general continue to initiate investigations into companies’ privacy practices, and state agencies continue to advance privacy rulemakings under existing law.

Please join us on Thursday, March 24 at 4:00 pm ET for this webinar to learn about the latest developments in state privacy law, make sense of these developments and understand their practical impact.

New Federal Bill to Protect Kids’ Privacy: Will This One Break Through? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-federal-bill-to-protect-kids-privacy-will-this-one-break-through Tue, 22 Feb 2022 08:16:29 -0500

Last October, we blogged that bipartisan momentum was building in Congress to enact stronger privacy protections for children, even if (and especially if) Congress remains stalled on broader federal privacy legislation. Of particular significance, we noted a strong push to protect, not just kids under 13 (the cutoff under COPPA), but also teens.

Since then, the momentum to enact stronger privacy protections for kids and teens has only increased, fueled by charges that social media and algorithms are causing self-harm and addictive behaviors among minors; multiple rounds of testimony from a former social media insider; and the desire in Congress to find common ground on some aspect of consumer privacy. Several kid/teen bills have been proposed in just the last couple of months. (See for example here and here.)

The latest of these bills, introduced last week by Senators Blumenthal and Blackburn, has drawn a lot of attention – both because it’s bipartisan, and because these two Senators lead a key Senate subcommittee and held multiple hearings on algorithmic harms to teens. The bill (the Kids Online Safety Act or “KOSA”) has been endorsed by a number of organizations that focus on protecting kids’ safety and mental health. It also has drawn praise from Senator Cantwell, Chair of the Senate Commerce Committee, who told at least one media outlet that she is considering a committee markup on the bill.

KOSA’s stated purpose is to “require social media platforms to put the interests of children first” by establishing a “duty of care” to prevent harms to minors, “mak[ing] safety the default,” and enabling kids and parents “to help prevent the harmful effects of social media.” In announcing the bill, Blumenthal stated that it “would finally give kids and their parents the tools and safeguards they need to protect against toxic content—and hold Big Tech accountable for deeply dangerous algorithms.” Portions of the bill appear to be modeled after the UK’s Age Appropriate Design Code, a law that establishes content standards for minors, but is styled more like a guide setting forth principles and best practices. Here’s our summary of the bill’s key features:

  • It covers a wide range of entities. Although the press release and bill summary focus on social media platforms, the bill would extend to any “covered platform,” defined as “a commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” This definition would reach a huge range of Internet-connected devices and online services. It also leaves open the question of what it means to be “reasonably likely to be used” by a minor. (Some of the bill’s provisions are triggered when a platform “reasonably believes” a user is a minor – a phrase that raises similar questions.)
  • It extends protections to any minor 16 or under. This contrasts with the under-13 cutoff in COPPA, the primary U.S. federal law protecting kids’ privacy. It’s not clear how this bill would interact with COPPA.
  • A covered platform has a duty of care to minors. It must act in the “best interests” of minors, including by preventing and mitigating “heightened risks of physical, emotional, developmental, or material harms” posed by materials on, or engagement with, the platform. Examples of such harm include: (1) self-harm, eating disorders, or other physical or mental health risks; (2) patterns of use indicating or encouraging addictive behaviors; (3) physical harm, online bullying, or harassment; (4) sexual exploitation; (5) promoting products that are illegal to minors; and (6) predatory, unfair, or deceptive marketing practices.
  • The platform must provide tools allowing minors or their parents to control the minor’s experience. These include “readily-accessible and easy-to-use” settings that can: (1) limit the ability of strangers to contact the minors; (2) prevent third-party or public access to a minor’s data; (3) limit features that would increase, sustain, or extend a minor’s use of the covered platform (e.g., automatically playing media); (4) permit opting out of algorithmic recommendations; (5) delete the minor’s account and personal data; (6) restrict sharing a minor’s geolocation information; and (7) limit time spent on the platform. The defaults for these settings must be the “strongest option[s] available” and the platform can’t use features that would encourage minors to weaken or turn off the safeguards. The bill does not specify whose choice would control if the parent and child both try to change the same settings.
  • The platform must enable parental controls by default for any user it reasonably believes to be a minor. These include tools allowing parents to: (1) control the minor’s privacy settings; (2) restrict purchases; (3) track the minor’s time on the platform; (4) change the default settings; and (5) control options necessary to prevent the harms described above. The platforms also must provide clear and conspicuous notice to the minor when parental controls are on, as well as a mechanism for a parent to submit reports of harm to a minor.
  • The platform must provide detailed disclosures about its safeguards, risks, algorithms, and advertising. As part of these requirements, the platform must obtain the minor’s or parent’s acknowledgement of the risks before the minor can use the platform; label and explain any advertising (including targeted advertising) aimed at minors; and allow minors or their parents to “modify the results of the algorithmic recommendation system” (as well as opt-out, as noted above).
  • Each year, the platform must obtain a third-party audit of the risks posed to minors and issue a public report. In addition to identifying the risks, the audit must address (1) what efforts the platform has taken to prevent or mitigate them; (2) how algorithms and targeted ads can harm minors; (3) how the platform collects and uses sensitive data, including geolocation, contacts, and health data; and (4) who is using the platform and for how long, by age ranges.
  • The bill gives the FTC APA rulemaking and civil penalty authority, and authorizes AG enforcement. Other provisions (1) give independent researchers access to the platform’s datasets; (2) direct the FTC and the Department of Commerce to establish guidelines for market or product research; (3) require a multi-agency study on age verification options; and (4) establish a Kids Online Safety Council to advise on the Act’s implementation.

Will this be the bill that breaks the federal privacy law stalemate and makes it into law? We suppose it’s possible. This bill is bipartisan, and Chair Cantwell is dangling the possibility of a markup – a rare event (at least lately) for a federal privacy bill. On the other hand, we’re already in an election year and Congress has a lot of other matters on its plate. Further, the extraordinary reach of the bill, coupled with its lack of clarity on a number of issues, suggests that many changes would be needed before this bill could become law.

Still, regardless of the outcome of this particular bill, it confirms what we predicted in October – that Congress has its sights on kids’ privacy, and that “kids” now includes teens 16 and under. Stay tuned.

Privacy Priorities for 2022

Please join us on Thursday, February 24 at 4:00 pm EST for Privacy Priorities for 2022, the second installment of Kelley Drye's 2022 practical privacy series. Register here.

Where to Find More Info on the FTC’s Top Rules for 2022 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/where-to-find-more-info-on-the-ftcs-top-rules-for-2022 Tue, 21 Dec 2021 06:10:22 -0500

Last week, Jessica Rich wrote about the FTC’s rulemaking plans for 2022. Make sure you read that post for a detailed analysis of what the Commission is planning. As we looked at which of those topics have generated the most interest on Ad Law Access recently, we wanted to point you to where you can find additional information.

  • The FTC will review its Guides Against Deceptive Pricing and its Guide Concerning Use of the Word “Free” and Similar Representations. Although most of the activity in these areas has taken place at the state level, it will be interesting to see what the FTC adds to the ongoing conversation. (Click here for more coverage on pricing claims.)
  • The FTC will review its Guides for the Use of Environmental Marketing Claims. A lot has changed since the Guides were last updated in 2012 and, as we’ve noted before, the lack of clarity in certain areas is leading to an increase in lawsuits and other challenges. (Click here for more coverage on green marketing.)
  • The FTC is still analyzing and reviewing the public comments it has received as part of its review of the Children’s Online Privacy Protection Rule (or “COPPA”). That hasn’t stopped the FTC and other regulators from bringing enforcement actions, though. (Click here for more coverage on children’s privacy.)
  • The FTC is still analyzing and reviewing the public comments it has received as part of its review of the Endorsement Guides. As we’ve noted, this has been a hot topic, and the FTC recently sent out 700 warning letters, which could signal upcoming enforcement. (Click here for more coverage on endorsement issues.)

We’ll keep you posted as these issues develop. In the meantime, rest up over the holidays because 2022 could be a bumpy year.

New Mexico Attorney General Settles Google Children’s Privacy Cases: A Unique Settlement Adds to a Complicated Landscape https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-mexico-attorney-general-settles-google-childrens-privacy-cases-a-unique-settlement-adds-to-a-complicated-landscape Thu, 16 Dec 2021 15:38:52 -0500

On December 13, the New Mexico Attorney General announced a settlement with Google to resolve claims regarding children’s privacy, including in the burgeoning EdTech space. The federal lawsuits Balderas v. Tiny Lab Productions, et al. and Balderas v. Google LLC alleged COPPA and privacy violations related to the collection of children’s information on game developer Tiny Lab’s apps and on Google’s G Suite for Education products, respectively. Several features of this settlement are worth discussing further, either as potential future trends or as novel provisions.

Privacy Compliance Provisions

New Mexico’s injunction related to the Tiny Lab case includes changes to Google Play that will take effect after 120 days. Some of the specific measures include:

  • revising Google Play Families policies and including additional help pages to assist app developers in compliance;
  • requiring all developers to complete a form to indicate the targeted age group of apps;
  • using a rubric to evaluate app submissions to help determine whether an app appeals to kids and to check for consistency with the age group form;
  • requiring Families apps to certify they will comply with COPPA;
  • requiring all apps to only use SDKs that certify compliance with Google’s policies including COPPA;
  • requiring developers of Families apps to disclose collection of any children’s data including through third parties;
  • requiring a link to the app’s privacy policy on the Google Play store page; and
  • communicating to AdMob whether an app is child-directed, so that AdMob then handles data from that app in accordance with COPPA (see the sketch below).
The help pages the injunction requires do not just contain answers to frequently asked questions. They prescribe certain decisions by, and limitations on, third parties using the Google Play store. For example, Exhibit 3 to the injunction provides that “if you serve ads in your app and your target audience only includes children, then you must use Google Play certified SDKs.”
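On the AdMob point, the developer-side mechanics are simple. Here is a minimal sketch, assuming an Android app using the Google Mobile Ads (AdMob) SDK; the exact configuration API varies by SDK version, so treat this as illustrative rather than as the injunction’s required implementation:

```kotlin
import com.google.android.gms.ads.MobileAds
import com.google.android.gms.ads.RequestConfiguration

// Tell the Google Mobile Ads SDK that ad requests from this app should be
// treated as child-directed for COPPA purposes, so downstream ad serving
// can suppress behavioral advertising for this traffic.
fun tagAppAsChildDirected() {
    val configuration = RequestConfiguration.Builder()
        .setTagForChildDirectedTreatment(
            RequestConfiguration.TAG_FOR_CHILD_DIRECTED_TREATMENT_TRUE
        )
        .build()
    MobileAds.setRequestConfiguration(configuration)
}
```

Because the flag is set on the SDK-wide request configuration, it applies to every subsequent ad request the app makes, which maps onto the injunction’s requirement that an app’s child-directed status be communicated to AdMob.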

In addition to these injunctive provisions, Google agreed to a set of voluntary enhancements to the Google Education platform intended to promote safety for students. New Mexico’s enforcement of these provisions is limited to its ability to confirm that Google has made the changes, or inquire as to the status of changes not made.

These injunctions demonstrate continued state Attorney General scrutiny regarding children’s information. And they come at a time when the Federal Trade Commission, which is responsible for issuing the COPPA Rule, is redoubling its COPPA efforts. The FTC’s ongoing COPPA Rule Review includes a number of questions regarding the intersection of COPPA and education technology. The FTC’s Statement of Regulatory Priorities, which we wrote about here, identifies COPPA as a top priority. And just this week, the FTC released its first COPPA settlement in almost 18 months.

Additional Settlement Terms Depart from Historical State Settlements

Several other provisions of the settlement are also noteworthy. Google has agreed to pay New Mexico $5.5 million – with $1.65 million of that going to outside counsel for the state. The remaining payment will be used to fund the “Google New Mexico Kids Initiative” – a program jointly run by Google and New Mexico to award grants to schools, educational institutions, charitable organizations, or governmental entities. This allocation of the payment could draw the kind of scrutiny that past state Attorney General settlements have met when they attempted to designate funds for specific third-party recipients. Some state legislatures may see it as an effort to appropriate funds without their involvement.

While New Mexico reserves its rights under the agreement regarding public statements, it has agreed to provide Google 24-hour notice before making any written public statement. Moreover, New Mexico agrees to consider in good faith any suggestions or input from Google, and any statement will reference the parties’ shared commitment to innovation and education. States routinely resist efforts to negotiate press in this manner, and it is unclear how enforceable such a provision would really be. That said, it certainly reflects the cooperative nature of the agreement, in which case it’s fair to assume the State would have issued press reflecting that cooperation in any event.

Google and New Mexico have also agreed to an ADR provision, requiring the state to pursue any disputes relating to the agreement in mediation before pursuing other relief. This again is unusual for a State AG settlement, as is the overall form of the document (a “Settlement Agreement and Release”) – normally states will only settle matters through a consent judgment or a statutorily authorized Assurance of Compliance or Discontinuance. But like some of the other unique provisions, agreeing to ADR may simply reflect the cooperative nature of the agreement, and it certainly presents an opportunity for a more streamlined enforcement mechanism in the future.

It remains to be seen if these provisions will serve as a template for future state agreements with other companies, but given that state Attorneys General continue to pursue Google on a variety of fronts[1], New Mexico’s settlement will certainly be relevant in any future settlement efforts.

[1] Google Search Manipulation, Google Ad Tech, Google DOJ Search Monopoly, State of Arizona v. Google LLC geolocation privacy

Cracks in the Privacy Wall Between Kids and Teens: Is Teen Privacy Legislation on the Horizon? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/since-congress-enacted-coppa-the-regulatory-wall-between-kids-and-teens-has-been-a-remarkably-durable-one Wed, 13 Oct 2021 09:20:44 -0400

Since Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998, the regulatory wall between kids and teens has been a remarkably durable one. During all this time, COPPA, the primary U.S. law protecting kids’ privacy, has protected children under 13 but hasn’t provided any protections for teens. While California’s privacy law grants some rights to teens under 16, these protections are narrow (opt-in for data sharing) and only apply within that state. This means that teens are generally treated like adults for purposes of privacy in the U.S.

It’s not exactly clear why COPPA’s age 13 cut-off was chosen in the first place. First year of teen-hood? Bar Mitzvah age? The age when children become too independent and tech-savvy to let their parents control their media? (Ahem – that happened at age six in my house.) Whatever the reasons for the original choice, age 13 has stuck, even as concerns about teens’ privacy and use of social media have grown, and Senator Markey and others have repeatedly proposed extending privacy protections to teens.

However, we might finally be seeing some cracks in the kid-teen privacy wall – cracks that could lead to a federal law protecting teens in the not-too-distant future.

These cracks are due to a confluence of events. Notably, in September 2020, the U.K. passed a law (the Age Appropriate Design Code or AADC) that requires all online commercial services “likely to be accessed by” kids and teens (including apps, programs, websites, games, community environments, and connected toys or devices) to meet 15 standards to ensure that their content is age appropriate. The law, which became fully effective in September 2021, starts with the principle that any service be designed with the “best interest of the child” as a primary consideration. It then details more specific requirements, including that defaults be set at the most protective level (e.g., location tracking and profiling are set to “off”), that data is not shared with third parties without a “compelling reason,” and that “nudge” techniques aren’t used to encourage minors to provide data or reduce their protections.

In response to the law, U.S. companies operating in the U.K. (notably, some of the large tech platforms) recently announced new protections for teens – a significant development in the long-running kid-teen debate, but one that has received relatively little attention. For example, Facebook/Instagram now says that for kids under 16, it will default them into private accounts; make it harder for “suspicious” accountholders to find them; and limit the data advertisers can get about them. Meanwhile, Google/YouTube has pledged similar protections for kids under 18, including private accounts by default; allowing minors to remove their images; applying restrictive default settings; turning off location history permanently; and limiting the data collected for ad targeting.

Following these announcements, Senator Markey and two House members sent a letter to the FTC urging it to ensure that these companies keep their promises, using its authority to stop deceptive practices under the FTC Act.

And there’s more. Last week, in developments widely covered in the media, a former Facebook employee detailed what she viewed as manipulation of teens using algorithms that kept them on the platform and exposed them to harmful content. Also, with broad-based privacy legislation perennially stalled, there’s been talk that Congress might prefer to tackle privacy issues that are more manageable and bipartisan (like kids’ and teen privacy) – talk that has only grown louder since the developments regarding Facebook.

Adding to the momentum, Senator Markey recently introduced a bipartisan bill (with Republican Senator Cassidy) that would provide privacy protections specific to teens, and Representative Castor has introduced a similar bill in the House. Further, the FTC has expressed a strong interest in protecting kids’ privacy, and in undertaking enforcement and rulemakings to extend U.S. privacy protections beyond the status quo.

In short, the kid-teen privacy wall is under pressure, and we could soon see a federal law, FTC enforcement, and/or (a harder climb) an FTC rulemaking using the agency’s Magnuson-Moss authority. For companies that collect teen data in connection with marketing or providing commercial products or services, this means double-checking your data practices to ensure that they’re age-appropriate and don’t expose teens to harms that can be avoided. (While the U.K.’s AADC principles are very ambitious, and do not apply to U.S.-only companies, they’re a valuable reference point.) It also means being prepared to explain and defend your data practices with respect to teens if regulators come knocking.

We will continue to monitor developments on this issue and provide updates as they occur.

“Not Outgunned, Just Outmanned” (For Now): Senate Hearing on Privacy Law Addresses Under-resourced FTC https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/not-outgunned-just-outmanned-for-now-senate-hearing-on-privacy-law-addresses-under-resourced-ftc Fri, 01 Oct 2021 08:45:07 -0400

On September 29, 2021, the Senate Commerce Subcommittee held a hearing titled Protecting Consumer Privacy. The senators addressed the potential $1 billion earmarked to strengthen the FTC’s privacy work, the future of a federal privacy and data protection law, and a myriad of other privacy-related topics such as children’s privacy.

Prepared Statements. In their opening testimonies, the witnesses emphasized different needs for the FTC.

  • David Vladeck, a former Director of the FTC Bureau of Consumer Protection, strongly advocated for a federal privacy law and additional funding for the FTC to support increased efforts on technology-centered consumer protection enforcement. In his remarks, Vladeck noted that the FTC has been wholly understaffed and underfunded for forty years, despite the agency’s ever increasing responsibilities and the complexity of issues it now faces. Additionally, Vladeck emphasized the need to increase the FTC’s enforcement powers by giving the FTC rulemaking authority under the APA and civil penalty authority.
  • Morgan Reed, the president of The App Association, focused more on the need for a federal privacy law to reduce the compliance costs for small businesses. He reiterated that the patchwork of state laws increases risk and costs for small businesses.
  • Maureen Ohlhausen, a former Acting FTC Chairman and Commissioner, shifted the conversation from funding for the FTC to the importance of a federal privacy law. She noted that “the FTC lacks explicit authority to enforce statutory privacy requirements or promulgate privacy regulations,” and that a federal privacy law should address this gap, allowing for enforcement by the FTC along with state attorneys general.
  • Ashkan Soltani, a former FTC Chief Technologist, primarily concentrated on the urgent need for expertise at the FTC. He emphasized the importance of hiring technologists and experts, and of paying them competitive rates to retain talent. In his view, the FTC is too understaffed to handle litigation matters or to monitor compliance with consent orders, particularly those that require technical fluency.

Discussing the Federal Privacy Bill. The senators appeared to agree that there is a need for a federal privacy law. Senator Wicker called on the Biden Administration to provide a liaison to Congress to prioritize the enactment of a law.

  • Right to Cure. Reed was adamant that a right to cure provision be written into the bill to protect small businesses from being punitively fined for unintentional mistakes such as not responding to an email within 30 days.
  • Private Right of Action. The witnesses went back and forth on the correct approach to a private right of action. While Soltani supported a private right of action as a means to “make up for the concern that there’s not enough enforcement capacity,” Ohlhausen was concerned that a private right of action would result not in consumer redress, but rather in attorney’s fees. Reed stated that he preferred injunctive relief as the remedy available under any private right of action. Similarly, Soltani noted that in his experience, core behavior changes come not from fines, but from injunctions and restrictions imposed on the business.
  • Preemption. Vladeck, Reed, and Ohlhausen supported federal preemption. Soltani, by contrast, argued that a federal privacy law should only be a floor, and not a ceiling. In other words, a federal privacy law should preempt less rigorous laws to set a baseline standard, but states could enact additional measures and add further protections for their constituents.
  • Carve-out. The witnesses went back and forth on whether the size of a business should factor into whether an entity would be covered by the bill. Vladeck emphasized that small businesses can create big harms; therefore, the legislation needs to be focused on consumer harm rather than the size of the company. Reed agreed, but reiterated the need for a right to cure for small businesses.

Funding for the FTC. Senators focused on whether the FTC needs $1 billion to achieve its goal of protecting consumers. Vladeck wholeheartedly agreed and said that an additional $100 million a year would be a good start for the FTC. For example, regarding the recent Google litigation, Vladeck theorized that Google had 1,000 privacy attorneys, whereas the FTC had fewer than 100. Vladeck noted that the funding would be earmarked for hiring more attorneys, engineers, and technologists, as well as for setting up a new bureau of privacy.

Children’s Privacy. The witnesses received several questions about protecting children’s privacy in the aftermath of reports on how social media impacts children’s mental health. Vladeck specifically advocated for lowering the scienter standard that the FTC has to meet to show that a developer knew its technology was tracking children. This mirrors the “constructive knowledge” standard that the EU uses for children’s privacy. Additionally, Vladeck suggested getting rid of COPPA’s safe harbor program and rethinking the age limit. All witnesses agreed that children were vulnerable to targeted ads. In response to Senator Markey’s concern for children’s privacy, all witnesses responded that they would approve of another children’s privacy bill if Congress could not enact a sweeping data protection and privacy law for adults.

FTC COPPA Settlement Shows Ongoing Rift Over Privacy Harm https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-coppa-settlement-shows-ongoing-rift-over-privacy-harm Mon, 08 Jun 2020 06:52:52 -0400

The FTC’s most recent COPPA enforcement action, announced on June 4 with app developer HyperBeard, provides evidence of an ongoing debate within the Commission about privacy harm and the role of monetary relief in the agency’s privacy enforcement program. Specifically, Commissioner Noah Phillips voted against the settlement with HyperBeard and two corporate officers, arguing in a dissent that the $4 million civil penalty (to be suspended after payment of $150,000) imposed on HyperBeard is too great for the consumer harm caused by the company’s alleged COPPA violation. In a separate statement, Chairman Simons defended the fine and rejected Commissioner Phillips’s argument that consumer harm should guide the FTC’s civil penalty calculations.

The action against HyperBeard also underscores that developers of child-directed services must not allow third-party interest-based advertising unless they meet COPPA’s parental notice and consent requirements, and that COPPA enforcement remains an FTC priority while the COPPA Rule is under review.

HyperBeard’s Alleged COPPA Violation

The central allegation in the FTC’s complaint is that HyperBeard allowed third-party ad networks to serve interest-based advertising in several child-directed apps without providing notice to parents or obtaining verifiable parental consent. To support its conclusion that HyperBeard’s apps were child-directed, the complaint cites the apps’ content (e.g., cartoon characters and kid-friendly prizes) as well as a cross-promotion with children’s books that were categorized as such and declared as intended for child audiences on Amazon.

The complaint cites specific alleged failures in how HyperBeard handled third-party advertisers. According to the complaint, HyperBeard did not “inform [the] third-party advertising networks that any of the [company’s apps were] directed to children and did not instruct or contractually require the advertising networks to refrain from behavioral advertising.”

Commissioner Phillips Dissents; Chairman Simons Responds

Although Chairman Simons and Commissioner Phillips apparently agreed on the merits of charging HyperBeard with a COPPA violation, they differed sharply on the magnitude and justification for the fine. Chairman Simons argues that “deterrence should come first” when it comes to calculating civil penalties. Specifically, penalties should “make compliance more attractive than violation.” The correct starting place for such a measure in this case was HyperBeard’s gain from allowing interest-based advertising in its apps. Consumer harm, in Chairman Simons’s view, is “inapposite” to the objective of deterrence.

Commissioner Phillips, however, argues that consumer harm should be “a more central consideration in the calculation of privacy penalties.” He also raises concerns that the FTC has been “relentless, without clear direction other than to maximize the amount in every case” and invites Congress to “pay attention to how the FTC is approaching monetary relief, including civil penalties, especially in privacy cases.” In Commissioner Phillips’s view, the only harm that HyperBeard caused was to collect data that allowed “users presumed to be children” to be served with interest-based ads without the parental notice and consent that COPPA requires. Such data collection is “endemic to the economy” and does not warrant a penalty that approaches the $5.7 million fine recently issued against Musical.ly – a case that involved a range of more serious alleged harms.

We do not expect a resolution of the questions about privacy harm and civil penalty calculations anytime soon. In the meantime, developers should take note of the FTC’s continuing attention to COPPA enforcement and closely examine how they manage any data that flows from child-directed apps to third parties.

Google Presses Pause on Joint Investigation by Agreeing to Record-Setting COPPA Settlement https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/google-presses-pause-on-joint-investigation-by-agreeing-to-record-setting-coppa-settlement Fri, 06 Sep 2019 22:21:21 -0400

The FTC and the New York Attorney General recently announced a record-setting $170 million ($136 million to the FTC and $34 million to the NY AG) joint settlement with Google. The settlement resolves allegations that YouTube violated the Children’s Online Privacy Protection Act (“COPPA”) and is the largest penalty the FTC has ever received in a COPPA case, easily dwarfing the agency’s next-highest $5.7 million settlement with TikTok.

In the complaint, the agencies alleged that YouTube violated the COPPA Rule because the site did not provide direct notice to parents of, or attempt to obtain verifiable parental consent prior to, collecting children’s personal information. Although the site markets itself as general audience and prevents users under age 13 from creating an account, the complaint alleged that YouTube had actual knowledge that it collected children’s personal information, including persistent identifiers, through the child-directed channels commercial entities operate on the site. This “actual knowledge” made YouTube an “operator” subject to the COPPA Rule.

The complaint also noted that, while identifying itself as a general audience platform not subject to COPPA, YouTube promoted its site as the “favorite website for kids 2-12” in pitches to toy companies and manually rated its content based on age group. Still, the company treated any content self-identified as child-directed similarly to any other content in terms of monetization and behavioral advertising practices.

The settlement's injunctive provisions include:

  1. Developing and implementing a system for channel owners to designate whether their content is child-directed (see the sketch after this list);
  2. Providing annual COPPA training for employees who manage child-directed channel owners;
  3. Making reasonable efforts to ensure that parents receive direct notice of the collection, use, or disclosure of children’s personal information;
  4. Posting prominently a link to that COPPA notice on any area of the site that collects children’s personal information;
  5. Obtaining verifiable parental consent prior to collecting, using, or disclosing children’s personal information; and
  6. Within 90 days of the compliance date in January 2020, ceasing to disclose, benefit from, or use any children’s personal information collected prior to the settlement.
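By way of illustration, the designation requirement in item 1 can surface as a simple API flag: YouTube’s public Data API now exposes a “selfDeclaredMadeForKids” field that a channel owner’s tooling might set roughly as follows. This is a rough sketch based on our reading of Google’s published API documentation, and the OAuth token handling is assumed rather than shown:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Self-declare a video as child-directed ("made for kids") via the
// YouTube Data API v3 videos.update endpoint. Assumes the caller holds
// a valid OAuth 2.0 access token authorized for the channel.
fun declareMadeForKids(videoId: String, accessToken: String) {
    val body = """{"id":"$videoId","status":{"selfDeclaredMadeForKids":true}}"""
    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://www.googleapis.com/youtube/v3/videos?part=status"))
        .header("Authorization", "Bearer $accessToken")
        .header("Content-Type", "application/json")
        .PUT(HttpRequest.BodyPublishers.ofString(body))
        .build()
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    check(response.statusCode() == 200) { "Designation failed: ${response.body()}" }
}
```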
Although not specifically required by the settlement, YouTube also recently announced that it will be creating a site specifically for children’s content. Parents will be able to filter videos based on a child’s age and track their children’s viewing history, and the site will not use behavioral advertising. Previously, the kids’ site was only available via mobile app.

Children’s privacy has been a hot topic recently, with the FTC announcing a request for comment on the COPPA Rule and legislators proposing updated COPPA legislation. Initial reports indicate that Congress sees this settlement as a slap on the wrist for the tech giant, as the total monetary penalty is allegedly less than two days’ worth of profits for Google. Similar complaints were made after the FTC’s Facebook settlement, but it remains to be seen whether disappointment with either settlement will be enough to push Congress to identify a new privacy enforcer via federal legislation.

Tech Innovation Prompting Revisions to Children’s Privacy Law? FTC Reviewing COPPA Rule and Holding Workshop https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/tech-innovation-prompting-revisions-to-childrens-privacy-law-ftc-reviewing-coppa-rule-and-holding-workshop Wed, 17 Jul 2019 21:59:51 -0400

The FTC announced today that it is seeking comments on a variety of issues related to the Children’s Online Privacy Protection (COPPA) Rule. Although only six years have passed since the FTC’s 2013 COPPA Rule update, the FTC is initiating an early review in response to new technologies and questions about the Rule’s applicability to certain business sectors, including the educational tech sector, voice-enabled connected devices, and general audience platforms that host third-party child-directed content. Along with seeking comments, the FTC will hold a public workshop on October 7, 2019 to further examine the Rule.

The agency’s expedited review may also be a response to an increase in legislative scrutiny of children’s privacy, including questions about child-directed content on general audience platforms, and calls from various Senators for an overhaul of COPPA legislation. Specifically, Sens. Josh Hawley (R-Mo.) and Edward Markey (D-Mass.), COPPA’s original author, introduced a bill in March to update COPPA that would address children as old as 15 and provide additional rights to parents regarding children’s information.

Aside from the standard questions the FTC includes in its review, the proposed notice also seeks comments on COPPA’s major provisions, including definitions, notice and parental consent requirements, verifiable parental consent exceptions, and safe harbors. The notice also asks whether the FTC’s 2013 Rule amendments have led to stronger protections of children’s privacy and more parental control over collection of children’s personal information, and if the 2013 Rule amendments resulted in any negative consequences.

In particular, the FTC seeks public comment on the following issues:

  • Whether behavioral targeting and profiling should be addressed in defining exemptions for COPPA compliance;
  • Whether factors to determine if a website or online service is directed to children should be updated to address sites that may have a number of child users, but aren’t specifically directed to children;
  • Whether the Rule should incentivize operators of general audience platforms to gain actual knowledge of whether there is child-directed content on their platforms;
  • Whether the Rule should include an exception for audio files collected as replacement for written words, such as for voice-activated searches; and
  • Whether the Rule should include an exception to parental consent for use of education technology in schools, and, if so, what such an exception should look like.
Comments in response to the notice are due 90 days after the notice is published in the Federal Register.

The FTC’s accelerated evaluation of the Rule indicates that the agency is seriously considering the evolving technological landscape and how it affects children’s privacy. In light of the continuing conversations about online privacy and the FTC’s role in policing it, today’s announcement indicates that the FTC continues to take its job as privacy’s top cop seriously.

Time Runs Out for TikTok App: Developer Musical.ly Agrees to FTC’s Largest-Ever Fine for Children’s Privacy Violations https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/time-runs-out-for-tiktok-app-developer-musical-ly-agrees-to-ftcs-largest-ever-fine-for-childrens-privacy-violations Fri, 08 Mar 2019 11:11:35 -0500

The FTC recently announced a $5.7 million settlement with app developer Musical.ly for COPPA violations associated with its app (now known as TikTok) – the agency’s largest COPPA fine since the statute’s enactment. The agency charged the app company, which allows users to create and share videos of themselves lip-syncing to music, with unlawfully collecting personal information from children.

To create a TikTok profile, users must provide contact information, a short bio, and a profile picture. According to the FTC, between December 2015 and October 2016, the company also collected geolocation information from app users. In 2017, the app started requiring users to provide their age, although it did not require existing users to update their accounts with their age. By default, accounts were “public,” allowing users to see each other’s bios (which included their grade or age). The app also allowed users to see a list of other users within a 50-mile radius and gave users the ability to direct-message other users. Many of the songs available on the app were popular with children under 13.

The FTC further alleged that Musical.ly received thousands of complaints from parents asserting that their child had created the app account without their knowledge (and noted an example of a two-week period where the company received more than 300 such complaints). The agency also noted that while the company closed the children’s accounts in response, it did not delete the users’ videos or profile information from its servers.

The FTC’s Complaint focused on practices spanning from 2014 through 2017. Musical.ly was acquired by ByteDance Ltd. in December 2017, and merged with the TikTok app in August 2018.

COPPA identifies specific requirements for operators who collect personal information from children under 13, including obtaining consent from parents prior to collection and providing information about collection practices for children’s data. Online services subject to the rule generally fall into two categories: (1) sites that are directed to children and collect personal information from them; and (2) general audience sites that have actual knowledge that they are collecting personal information from children. Civil penalties for violations of COPPA can be up to $41,484 per violation.

According to the FTC, Musical.ly’s app fell into both categories:

  1. The company included music and other content appealing to children on the app. For example, many of the songs included on the app were popular with children under 13, and the app used “colorful and bright emoji characters” that could appeal to children.
  2. Once the company began collecting the ages of its users, Musical.ly had actual knowledge that some of its users were under the age of 13. In spite of this, the company did not obtain consent from the parents of users under the age of 13, or comply with other COPPA requirements.
FTC Commissioners Chopra and Slaughter issued a joint statement on the settlement, pointing out that FTC staff had uncovered disturbing practices of a company willing to pursue growth even at the expense of children’s safety. They also noted that previously, FTC investigations typically focused on individual accountability in limited circumstances, rather than pursuing broader enforcement against company leaders for widespread company practices. The Commissioners further indicated that as the FTC continues to pursue legal violations going forward, it is time to “prioritize uncovering the role of corporate officers and directors” and to “hold accountable everyone who broke the law.”

This settlement indicates that the FTC continues to prioritize privacy enforcement—particularly where vulnerable audiences, such as children, are involved. Future FTC enforcement actions could signal an expanded approach to individual liability, including with respect to larger companies.

The case is also a good reminder of the value in performing robust privacy due diligence when considering acquiring an entity, and meaningfully assessing the risk of a company’s data practices before adding them to the portfolio. A widely popular business with significant data assets may not look as attractive once civil penalties and injunctive terms are added to the mix.

FTC Issues Enforcement Policy on Collection and Use of Voice Recordings of Children Under COPPA https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-issues-enforcement-policy-on-collection-and-use-of-voice-recordings-of-children-under-coppa Fri, 27 Oct 2017 06:55:35 -0400

On Monday, the FTC issued an Enforcement Policy Statement stating that the Commission will not take action against operators that collect an audio file of a child’s voice, without first obtaining parental consent, as a replacement for written words (such as for translation into text), provided the file is retained only for the brief time necessary for that purpose. However, the operator is still obligated to indicate in its privacy policy how it will collect and use children’s voice recordings, as well as its policy for deletion. The FTC reasons that, although COPPA applies to the online collection of files that contain children’s voices, even if the files are immediately deleted after collection, the risk associated with such collection and immediate deletion is minimal.

There are some additional limitations on this policy. It does not apply when the operator requests personal information, such as a child’s name. Moreover, the operator may not use the recording for any purpose other than translation into text (such as behavioral targeting or identification) before deleting it. If the operator does plan to collect other types of personal information, then it would be required to obtain parental consent.
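In practice, the policy describes a transcribe-then-delete flow: the audio exists only long enough to be converted to text. Here is a minimal sketch of that flow, with the transcribe parameter standing in for whatever speech-to-text service an operator actually uses (no particular vendor API is implied):

```kotlin
import java.nio.file.Files
import java.nio.file.Path

// Convert a voice clip to text, then delete the recording immediately,
// so the audio is retained only for the brief time transcription requires.
fun voiceSearchQuery(audioFile: Path, transcribe: (ByteArray) -> String): String {
    val audio = Files.readAllBytes(audioFile)
    return try {
        transcribe(audio) // the text replaces the spoken words
    } finally {
        Files.deleteIfExists(audioFile) // delete even if transcription fails
    }
}
```

The deletion is unconditional by design: retaining the recording beyond the immediate translation step, or reusing it for purposes like behavioral targeting or identification, would take an operator outside the scope of the enforcement policy.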

Although the policy provides some clarification about the application of COPPA to voice-capture technologies, operators of child-directed services that collect children’s voices should ensure that their privacy policies and consent and notification procedures comply with COPPA requirements. Violators are liable for up to $40,654 in civil penalties per violation.

Privacy Certification Program Settles COPPA Violations with NYAG https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/privacy-certification-program-settles-coppa-violations-with-nyag Thu, 13 Apr 2017 15:14:49 -0400

Last week, True Ultimate Standards Everywhere, Inc. (“TRUSTe”) agreed to pay the New York Attorney General (“NYAG”) a $100,000 penalty, and to beef up its privacy measures, to settle alleged violations of the Children’s Online Privacy Protection Act of 1998, 15 U.S.C. §§ 6501-6506 (“COPPA”). The Federal Trade Commission (“FTC”) is authorized to issue rules under COPPA, § 6502(b), and, along with the State Attorneys General, enforces it, §§ 6504(a), 6505(a). Generally, the FTC’s “COPPA Rule” mandates that operators of websites directed to children under the age of 13, or website operators that knowingly collect personal information from children under 13, must provide parents with notice of their information practices, make the collected information available upon request, limit the collection of such information, and secure it. See generally 16 C.F.R. pt. 312. Additionally, and relevant to this settlement, “collection” includes the passive tracking of children’s personal information through a persistent identifier, and not just its active collection. 16 C.F.R. § 312.2. Failure to comply may subject an operator to a penalty of up to $40,654 per violation and a permanent injunction. See 15 U.S.C. §§ 45(a)(1), 45(m), 53(b), 57a(d)(3), 6502(c); 16 C.F.R. §§ 1.98, 312.9.

TRUSTe offers a privacy compliance certification solution to website and app operators. The FTC has approved TRUSTe (along with six other organizations to date) as a self-regulatory, safe harbor program that subjects its member operators to the same or greater protections for children as the COPPA Rule. See 16 C.F.R. § 312.11. Once certified, TRUSTe’s members may display the “TRUSTe Kids Privacy” seal on digital properties (and possibly avoid being the subject of government enforcement). According to the NYAG, however, TRUSTe failed to scan its members’ websites for third-party tracking practices prohibited by COPPA during the annual recertification assessments required of any safe harbor program. In other instances, TRUSTe failed to notify members about the detection of prohibited tracking, or accepted member representations about the legality of such tracking. These accusations came on the heels of TRUSTe’s settlement with the FTC a few years ago over similar concerns. See True Ultimate Standards Everywhere, Inc., Doing Business as TRUSTe, Inc.; Analysis To Aid Public Comment, 79 Fed. Reg. 69850 (Nov. 24, 2014). In addition to the penalty paid to the NYAG, and similar to its settlement with the FTC, TRUSTe must improve the scanning, assessment, and reporting of any third-party tracking on its members’ sites.

COPPA remains a powerful tool in the arsenal of the FTC and State Attorneys General to curtail the ever-increasing marketing by online businesses to children under the age of 13. Failure to adequately prevent illegal tracking technology, or to comply with any other enumerated prohibition of COPPA, opens the door to regulatory scrutiny and enormous monetary penalties. The increasing enforcement of COPPA by states signifies that the welfare of children online is a top priority for federal and state authorities alike.
