Ad Law Access
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access
Updates on advertising law and privacy law trends, issues, and developments

State AGs and Consumer Protection: What We Learned from . . . Connecticut Part I
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/state-ags-and-consumer-protection-what-we-learned-from-connecticut-part-i
Thu, 11 May 2023

Our State AG webinar series continues with Connecticut Attorney General William Tong and Michele Lucan, Chief of the Privacy Consumer Protection Section. During our webinar, the Connecticut AG’s office described its structure and the tools available to enforce the state’s consumer protection laws. In particular, with Connecticut being the fifth state to pass comprehensive privacy legislation, AG Tong highlighted the office’s privacy priorities and agenda, which we focus on here in Part I. We will explore more general consumer protection topics in Part II. In case you missed it, here is a recording of the webinar.

While the Connecticut Unfair Trade Practices Act (CUTPA, Connecticut’s UDAP law) is broad and robust, in the privacy and cybersecurity space the AG has additional authority derived from specific state laws such as the Data Breach Notification law and the Connecticut Data Privacy Act (CTDPA). General Tong noted that Connecticut’s dedication to enforcing consumer protection as it relates to privacy traces back to at least 2011, when it became the first state to create a Privacy Task Force, followed by a standalone Privacy Section in 2015.

Enforcing the CTDPA

AG Tong noted that the CTDPA reflects a “philosophical judgment of Connecticut to return rights and power of authority to consumers regarding their Personal Information.” As we have previously reported, the CTDPA provides for several rights such as the right to access, right to portability, right to correct mistakes, right to deletion, and the right to opt out of targeted advertising, sale, and profiling of personal data.

The CTDPA also creates obligations for “controllers,” which are entities that, alone or jointly, determine the purpose and means of processing personal data. Some of these obligations include: minimizing data collection and storage, providing transparency about the types of data collected and why, ensuring that data is secure, and obtaining consent to process sensitive data. Notably, the CTDPA also provides heightened protections for data related to teenagers, a hot topic for State AGs. Controllers must obtain consent to sell teens’ data or conduct targeted advertising to teens.

The Connecticut AG has exclusive authority to enforce the CTDPA’s provisions, making the office’s insights all the more valuable. The law also provides for a cure period: if the AG’s office becomes aware of a potential violation and determines that a cure is possible, the office will reach out to the entity and issue a notice of violation. If the controller fails to cure within sixty (60) days, the AG may bring an action against the entity. As with the data breach notification law discussed below, a violation of the CTDPA is a per se violation of CUTPA.

Connecticut AG’s Advice: How to Prepare for Compliance with the CTDPA

With the CTDPA’s effective date of July 1, 2023 quickly approaching, the Connecticut AG’s office offered its own recommendations on how to prepare for compliance with the new law:

  • Applicability. Entities should determine whether they meet the thresholds to trigger CTDPA obligations.
  • Data Inventory. Entities should understand what data they are collecting and where it lives, while also thinking about how to minimize data collection if possible.
  • Consumer-Facing Updates. Entities should review their privacy policies to ensure they are up to date, and should be prepared to operationalize and effectuate the mechanisms for consumers to exercise their privacy rights (e.g., ensure that links work).
  • Internal Updates. Entities should review and update their vendor contracts to address CTDPA requirements and conduct employee training to minimize data security risks.

Safeguards and Data Breach Notice Laws

The Connecticut Safeguards Law, referred to by the office as the basic building blocks for Connecticut’s privacy infrastructure, requires any person in possession of Personal Information (PI) to safeguard data against misuse by third parties, and destroy, erase, or make unreadable the data prior to disposal. Penalties under the Safeguards law can be significant—up to $500 per intentional violation and up to $500,000 for a single event.

Connecticut defines PI as information capable of being associated with a particular individual through one or more identifiers. The AG’s office noted that PI is broadly defined. For instance, PI includes a person’s name, but also covers other identifiers including social security numbers, driver’s license numbers, credit/debit card numbers, passport numbers, biometric information, online account credentials, and certain medical information.

Connecticut’s Breach Notification Law requires an entity that experiences a data breach to provide notice to the Connecticut AG without “unreasonable delay,” and in any event within 60 days. The law also requires the entity to provide two years of identity theft prevention services if Social Security numbers or taxpayer identification numbers (ITINs) are compromised. A violation of this law is a per se violation of CUTPA. Last year, Connecticut received over 1,500 data breach notifications, and the office is experienced in reviewing all types of data breaches and determining which ones warrant closer attention.

Our Take

Connecticut has consistently been a leader on data security and privacy issues over the last decade, and with the passage of the CTDPA we expect to see the office double down on enforcement efforts. Businesses should pay particular attention to the compliance tips highlighted above by Ms. Lucan and General Tong, as there is little doubt the office will be actively looking for targets right out of the gate on July 1. In General Tong’s words, “data privacy and the law of data privacy are here. Its obligations are here, present, and they are demanding.” Privacy laws cannot be treated as “optional” or “too cumbersome” to justify skipping precautions and failing to manage the risks of collecting data. The office “will take action where we believe people have failed to meet their obligations under the law,” as that is what people in the state of Connecticut “expect and demand.”

Given Connecticut’s leadership in the multistate Attorney General community, we would not be surprised to see other states joining Connecticut in enforcement efforts, even without a comprehensive privacy law (relying on their UDAP authority as states have done for decades). Understanding your data collection and security practices is more important than ever.

***

Be sure to look out for Part II of this blog post, where we will talk about Connecticut’s UDAP law in more detail, as well as the priorities and additional tools the Connecticut AG’s office uses to enforce consumer protection laws. We also have an exciting blog post recapping our conversation with the Nebraska Attorney General just around the corner. Stay tuned.

Senate Judiciary Hearing on Kids’ Privacy – Sunny with a Chance of Section 230 Reform
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/senate-judiciary-hearing-on-kids-privacy-sunny-with-a-chance-of-section-230-reform
Thu, 23 Feb 2023

As we’ve described here, the Senate made major strides last year on legislation to protect children’s privacy and safety online. Indeed, two bipartisan bills sailed through a Commerce Committee markup, though they ultimately did not make it to the floor for a Senate vote. This year, kids’ privacy is once again getting attention, beginning with a February 14 Senate Judiciary Committee hearing on the issue. Members used the hearing to tout last year’s bills and mention some new ones, too. They also touched on other top-of-mind issues involving the tech industry, such as Section 230 reform and encryption.

Of note, Senators Blumenthal and Blackburn discussed the Kids Online Safety Act (KOSA) (their bill from last year, just re-introduced), which would impose a “duty of care” on tech companies and shield young people from harmful content. Senator Hawley, in turn, talked up his Making Age-Verification Technology Uniform, Robust, and Effective Act (MATURE Act), which would enforce a minimum age requirement of 16 for users of social media platforms. (As noted below, panelists were quite skeptical that this would work.)

The event highlighted, once again, the bipartisan interest in tackling the harms that minors face online. Here’s more detail on what happened:

First up, opening remarks from Chairman Durbin (D-Ill.), Ranking Member Graham (R-S.C.), and Senators Blumenthal (D-Conn.) and Blackburn (R-Tenn.)

Chairman Durbin kicked off the hearing by explaining that the internet and social media have become a threat to young people. He noted that while the Internet offers tremendous benefits, cyberbullies can hurt kids online via platforms like Facebook and Snapchat. Durbin stated that “we don’t have to take” the lucrative business that the platforms (who were not in attendance) have created to keep kids’ eyes glued to the screens. He said that the addictive nature of the platforms has created a mental health crisis – causing anxiety, stress, and body image issues, for example – which can lead to tragic results.

Sen. Graham announced that he and Sen. Warren (D-Mass.) are working on a bipartisan bill to create a Digital Regulatory Commission with the power to shut down websites that don’t engage in “best business practices” to protect children from sexual exploitation online. He also expressed concern about the lack of regulatory oversight for the abuses of social media.

Sen. Blumenthal promoted KOSA, and also committed to major reform of Section 230. Sen. Blackburn echoed these sentiments, describing social media as the Wild West: the kids are the product, there are very few rules, and data is taken and sold to advertisers. Blackburn added that she wants social media to be safer by default, not something that becomes safer after a consumer takes additional steps. She also said that she supports transparent audits of company practices.

Next, the Statements from Witnesses

Kristin Bride is a Survivor Parent and Social Media Reform Advocate whose son took his own life after being cyberbullied via anonymous messaging apps on Snapchat. She explained how she was ignored when she reached out to the apps for help in learning the bullies’ identities. Then, when she filed a class action against Snap, it was dismissed due to Section 230 immunity. While Snap did remove the anonymous messaging apps after she filed the class action, she has seen new apps pop up that charge children to reveal the identities of those sending harmful messages.

Emma Lembke, a sophomore in college and the founder of the Log Off movement, expressed frustration with being a passive victim of big tech, saying that social media led her to disordered eating. She also stressed the importance of including young people in efforts to effect change.

Michelle DeLaune is President and CEO of the National Center for Missing & Exploited Children. One of her chief concerns is companies’ use of end-to-end encryption. Encryption allows people to send messages that platforms cannot read or flag for harmful content. She described this as “turning off the lights” on content that exploits children.

John Pizzuro is the CEO of Raven and a former Commander of the Internet Crimes Against Children Department of the New Jersey State Police. He explained that police are overburdened with cybercrime reports, forcing them to become far more reactive and less proactive in protecting children online.

Dr. Mitch Prinstein, Chief Science Officer at the American Psychological Association, testified that many social media apps are directed to children. He said that the average teen picks up their phone over 100 times per day and spends over eight hours per day on social media, developing a clinical dependency. According to Prinstein, social media stunts kids’ ability to develop healthy relationships; increases loneliness, stress, anxiety, and exposure to hateful content; and causes lack of sleep. He supports more federal funding for research on the effects of social media, and believes that manipulating children and using their data should be illegal.

Finally, Josh Golin, Executive Director of Fairplay, said he supports policies to make the internet safe, non-exploitative, and free of Big Tech. He said that digital platforms are designed to maximize engagement because companies make more money the longer kids spend online. They use manipulative design and relentless pressure to entice kids to use the platforms as often as possible, thereby profiting from targeted ads. Golin supports limits on data collection; banning surveillance advertising based on teens’ vulnerabilities; holding platforms liable for design choices that affect young people; and requiring transparency for algorithms.

Questions from the Committee

Committee members asked an assortment of questions, some related to kids’ privacy and others related to tech issues more broadly.

Sen. Durbin highlighted Section 230 immunity, which last year’s EARN IT Act would amend. Sen. Whitehouse (D-R.I.) also said he supported Section 230 reform and that he wants to see class actions like Ms. Bride’s be allowed to continue, rather than dismissed on immunity grounds. Sen. Hirono (D-Hawaii) cautioned against a wholesale repeal of Section 230 and stressed that any reform should be done carefully. Sen. Graham reiterated his support for a Digital Regulatory Commission, while also noting that repeal or reform of Section 230 would be a step in the right direction.

Others, such as Sens. Lee (R-Utah) and Coons (D-Del.), asked questions about the complaints received by the National Center for Missing and Exploited Children, and said they support better research on the design choices of social media. Sen. Coons added that he supports new mandates for platforms, including a duty of care; limits on data collection from kids; and disclosures regarding how they manage content. (As to the latter, see Coons’ bill from last year here.)

Sen. Blumenthal echoed the call for further research on social media, including as to the role that it plays in, on the one hand, harming members of the LGBTQ+ community, and, on the other, providing this community with access to important information and connections. Meanwhile, Sens. Blackburn and Grassley (R-Iowa) mentioned the link between social media and drug-overdose deaths, and Sen. Ossoff (D-Ga.) mentioned his legislation, with Grassley, to address child exploitation.

Rounding out the discussion, Sen. Cornyn (R-Texas) described the online landscape as “designed” to “hoover up” children’s data and stated that he supports legislation to “attack the business model.” Sen. Klobuchar (D-Minn.) said she supports imposing a duty of care on companies, as well as requirements to stop companies from pushing harmful content in response to innocent queries (for example, serving content related to disordered eating in response to a search for “healthy food”). Sen. Welch (D-Vt.) said he wants to focus on companies that chase more clicks to obtain more ad revenue.

Finally, Sens. Kennedy (R-La.) and Hawley both pushed for legislation that would ban children under 16 from using social media. The witnesses generally agreed that this is unrealistic.

* * *

That’s our snapshot of the hearing. The question now is whether Congress will move forward on kids’ privacy legislation and succeed in 2023 where it fell short in 2022. Some related questions are:

  • Where is the House on this issue? Will it resume its push for general privacy legislation (see last year’s bipartisan ADPPA) or follow the Senate’s lead and focus more narrowly on kids?
  • What about the Senate Commerce Committee, which typically leads on privacy issues and could take up other bills, such as Senator Markey’s COPPA 2.0, which Markey has said he will re-introduce this year?
  • What will happen at the FTC, which has sidestepped its review of COPPA (at least for now) but whose “commercial surveillance” rulemaking could affect kids’ and teens’ privacy?
  • With the Supreme Court seemingly reluctant to alter the scope of Section 230, will Congress tackle this issue in a serious way?

Stay tuned as we continue to track these and other developments related to kids’ privacy.

Upcoming Price Gouging and Employee/HR Data Privacy Webinars
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/upcoming-price-gouging-and-employee-hr-data-privacy-webinars
Mon, 18 Jul 2022

How To Protect Employee/HR Data and Comply with Data Privacy Laws – Wednesday, July 20

As workforces become increasingly mobile and remote work is more the norm, employers face the challenge of balancing the protection of their employees’ personal data and privacy against the need to collect and process personal data to recruit, support and monitor their workforces. Mounting regulations attempt to curb employers’ ability to gather and utilize employee data—from its historical use in processing employee benefits and leave requests to employers’ collection, use or retention of employees’ biometric data to ensure the security of the organization’s financial or other sensitive information systems. Learn what employers can do now to protect employee data and prepare for the growing wave of data privacy laws impacting the collection and use of employee personal data.

RSVP

Avoiding Price Gouging Claims – Wednesday, August 3

Recently, State Attorneys General, the House Judiciary Committee, and many others have weighed in on rising prices in an attempt to weed out price gouging and other forms of what they deem “corporate profiteering.” State and federal regulators are looking carefully at pricing as consumers and constituents become more sensitive to the latest price changes, and price gouging enforcement is an avenue states may be able to use to appease the public. Unlike past emergencies, the current supply chain and labor shortages, along with skyrocketing costs for businesses, make it unrealistic for companies to simply put a freeze on any price increases. This webinar will cover:

  • The basics of price gouging laws and related state emergency declarations, and how to comply
  • The differences and varied complexities in state laws
  • General best practice tips
  • How AGs prioritize enforcement

Register

* * * *

Find more upcoming sessions, links to replays and more here

Upcoming Webinars: State Attorneys General 102 and How To Protect Employee/HR Data and Comply with Data Privacy Laws
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/upcoming-webinars-state-attorneys-general-102-and-how-to-protect-employee-hr-data-and-comply-with-data-privacy-laws
Mon, 27 Jun 2022

As discussed in State Attorneys General 101, State Attorneys General are the primary enforcers of consumer protection laws within their state and hold sweeping powers to protect the public they serve by launching investigations and litigation alone or in multi-state actions involving numerous states and territories across the country.

As requested by many, please join Kelley Drye State Attorneys General practice Co-Chair Paul Singer and Senior Associate Beth Chun for State Attorneys General 102. This short 30-minute webinar picks up where we left off and answers a number of questions regarding:

  • Pre-suit/investigation notice requirements for Attorneys General
  • Additional information on the scope of Attorneys General investigative authority and how to challenge an investigation
  • Consumer Complaints: differences among the AGs on handling and use
Register here

How To Protect Employee/HR Data and Comply with Data Privacy Laws – July 20

As workforces become increasingly mobile and remote work is more the norm, employers face the challenge of balancing the protection of their employees’ personal data and privacy against the need to collect and process personal data to recruit, support and monitor their workforces. Mounting regulations attempt to curb employers’ ability to gather and utilize employee data—from its historical use in processing employee benefits and leave requests to employers’ collection, use or retention of employees’ biometric data to ensure the security of the organization’s financial or other sensitive information systems. Learn what employers can do now to protect employee data and prepare for the growing wave of data privacy laws impacting the collection and use of employee personal data.

RSVP

Find more upcoming sessions, links to replays and more here

New California Draft Privacy Regulations: How They Would Change Business Obligations and Enforcement Risk
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-california-draft-privacy-regulations-how-they-would-change-business-obligations-and-enforcement-risk
Mon, 30 May 2022

On Friday, May 27, 2022, the California Privacy Protection Agency (CPPA) Board announced that its next public meeting will be held on June 8, 2022. The announcement simply stated the date of the meeting, noted that there are “some discussion items [that] will be relevant to the Agency’s rulemaking work,” and explained that information on how to attend the meeting and the meeting agenda could be found on the CPPA’s site. It did not take much Internet sleuthing to review the posted agenda and note that Agenda Item No. 3 was “Discussion and Possible Action Regarding Proposed Regulations, Sections 7000–7304, to Implement, Interpret, and Make Specific the California Consumer Privacy Act of 2018, as Amended by the California Privacy Rights Act of 2020, Including Possible Notice of Proposed Action,” and that the posted meeting materials included a copy of the “Draft Proposed CCPA Regulations.” In addition, Agenda Item No. 4 provides for “Delegation of Authority to the Executive Director for Rulemaking Functions.” Full stop: June will be an active month for California privacy rulemaking.

But let’s unpack the surprises in the draft regulations. The 66-page draft proposed CCPA regulations (and they are referred to within the document as CCPA regulations) take a prescriptive approach to privacy obligations. In concept, that is not too surprising. Of concern, in some areas they depart in unique ways from the approaches set forth by other state privacy laws. The quiet release of dramatic new obligations, at a time when bipartisan Senators reportedly may be reaching consensus on federal privacy legislation that could preempt state law obligations, puts companies doing business in California in a difficult position. Do they scramble to operationalize new programs to comply with the CPPA’s new requirements, if finalized? Do they wait on Congress? Do they choose a third path? For now, while these draft rules are certain to change in some respects before they are finalized, they directionally outline a new privacy baseline for the United States. We highlight certain aspects of the draft rules below, with a particular focus on accountability and risk exposure, how data can be shared with other businesses for digital advertising or other functions, and what those business agreements must include to lawfully support such business relationships and comply with the amended CCPA.

Quick and Costly Potential CPPA Enforcement

Consumers, the CPPA, and the California Attorney General’s Office are all empowered to take businesses (and contractors, service providers, and third parties) to task for perceived non-compliance with privacy obligations. Among all of the proposed changes in the draft regulations, the enforcement provisions should cause many companies, regardless of their role, to pause and evaluate whether they have allocated sufficient resources to privacy compliance. While there is no privacy private right of action under the CCPA/CPRA, the draft rules set forth a new, expanded, and fast-tracked form of compliance monitoring and enforcement that could surprise many companies and prove costly.

First, while there are provisions requiring consumers to file sworn complaints, the CPPA provides that it can also accept, and initiate investigations based on, unsworn and anonymous complaints. For every sworn complaint, the CPPA must notify the consumer complainant in writing of what actions the Agency has taken or plans to take and the reasons for action or non-action. Because the Agency has to respond to each complaint, this could turn into a routinized process in which a high volume of complaints is forwarded to businesses, with tight timeframes to respond in writing or else face violations and administrative fines.

The rules provide that there is “probable cause” of a privacy violation if “the evidence supports a reasonable belief that the CCPA has been violated.” There is no mention of extensions of time for good faith reasons. Under the statute, the CPPA can find a violation through a probable cause hearing if it provides notice by service of process or registered mail with return receipt to the company “at least 30 days prior to the Agency's consideration of the alleged violation.” The notice must contain a summary of the evidence and inform the company of its right to be present “in person and represented by counsel.” The “notice” clock starts as of the date of service, the date the registered mail receipt is signed, or, if the registered mail receipt is not signed, the date returned by the post office. It is possible this process occurs through the forwarding of unverified consumer complaints.

Under the draft rules, a company can request that the proceeding be made public if it makes a written request at least 10 business days before the proceeding. A company has a right to an in-person proceeding only if it requests that the proceeding be made public. Otherwise, the proceeding may be conducted in whole or in part by telephone or video, closed to the public. Participants are limited to the company representative, legal counsel, and CPPA enforcement staff. The CPPA serves as both prosecutor and arbiter, and the draft rules do not define how the agency preserves its neutrality in the latter role.

The CPPA makes a determination of probable cause at such proceeding “based on the probable cause notice and any information or arguments presented at the probable cause proceeding by the parties.” If a company does not participate or appear, it waives “the right to further probable cause proceedings” (it’s not clear in the draft rules whether that is limited to the facts of that matter, or future alleged violations) and a decision can be made on the information provided to the CPPA (such as through a complainant).

The CPPA then issues a written decision and notifies the company electronically or by mail. Of concern, the draft rules provide that this determination “is final and not subject to appeal.” Under the statute, violations can result in an administrative fine of up to $2,500 for each violation, and up to $7,500 for each intentional violation or violation involving minors. Multiple parties involved can be held jointly and severally liable. It is conceivable that violations may be calculated on any number of factors that could add up substantially, and, as contemplated by these draft rules, there is no process to challenge such judgments, even where there are factual or legal disputes. One can imagine future legal proceedings that challenge a variety of the legal bases for such a structure if these rules are finalized as drafted.

Service Provider Requirements and Restrictions

Data Privacy Addendums Get a Further Tune-Up, and an Open Question on Whether They Need to Be Bespoke. One aspect of state privacy law compliance that has consumed considerable resources and time is the service provider contract. Who is a service provider? What must the contract say? What restrictions apply to service providers (or contractors)? The draft rules continue to add more obligations.

A written contract that meets all of the requirements outlined below must be in place for an entity even to qualify as a service provider or contractor. The contract requirements are very granular, go beyond what most current privacy addendums (or technology provider terms and conditions) look like today, and include:

  • Prohibiting the service provider/contractor from selling or sharing the business’s personal information.
  • Identifying the specific business purposes and services for which the service provider/contractor is processing the business’s personal information, and providing that such disclosure occurs only for the limited and specified business purposes set forth in the contract. This cannot be stated generally with reference to the agreement, but rather requires a specific description.
    • This language suggests that a one-size-fits-all data processing agreement for all vendors processing personal information for different business purposes or functions might not be sufficient, which is very concerning from a resource and practicality standpoint.
  • Restricting the processing of personal information to the business purposes specified in the contract, including prohibiting processing to service a different business, unless permitted by the CCPA. Awkwardly, the proposed rule suggests that all of the specific business purpose(s) and service(s) identified earlier would need to be restated as part of these restrictions.
    • On this last point, the draft rules underscore this specific example: “a service provider or contractor shall be prohibited from combining or updating personal information received from, or on behalf of, the business with personal information that it received from another source unless expressly permitted by the CCPA or these regulations.”
  • Requiring compliance with all applicable provisions of the CCPA, including providing the same level of privacy protection as is required of businesses, cooperating with the business in handling consumer rights requests, and maintaining reasonable data security.
  • Including reasonable audit provisions to ensure CCPA compliance, such as “ongoing manual reviews and automated scans of the service provider’s system and regular assessments, audits, or other technical and operational testing at least once every 12 months.”
  • Requiring notification to the business within 5 business days if the service provider/contractor determines it cannot meet its obligations.
  • Providing the business the right to take reasonable steps to stop and remediate any unauthorized use of personal information by the service provider/contractor, such as “to provide documentation that verifies that [the service provider/contractor] no longer retain[s] or use[s] the personal information of consumers that have made a valid request to delete with the business.”
  • Providing that the business will notify the service provider/contractor of any consumer rights request and provide the information necessary for the service provider/contractor to comply with the request.
In addition to the contract, the draft rules emphasize that these cannot just be words on paper that diverge from actual practices. Section 7051(e) notes in particular that, in assessing compliance, the CPPA can evaluate whether the business conducted any due diligence to support a reasonable belief of privacy compliance, and whether and how the business enforces its contract terms, including performing audits. If there is non-compliance, both parties can be held jointly and severally liable.
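
For teams operationalizing these requirements, it can help to track the required terms vendor by vendor. The sketch below is purely illustrative and not drawn from the draft rules’ text; the TypeScript interface, its field names, and the missingTerms helper are hypothetical shorthand for the contract terms summarized above.

```typescript
// Hypothetical checklist for auditing whether a vendor agreement contains the
// contract terms summarized above. Field names are our own shorthand, not
// regulatory text.
interface VendorContractChecklist {
  vendorName: string;
  prohibitsSaleOrSharing: boolean;           // no selling or sharing of the business's personal information
  specifiedBusinessPurposes: string[];       // the specific purposes/services, not a general reference to the agreement
  restrictsProcessingToPurposes: boolean;    // no processing outside the listed purposes or for another business
  prohibitsCombiningWithOtherSources: boolean;
  requiresCcpaLevelProtection: boolean;      // same level of privacy protection as required of businesses
  includesAuditProvisions: boolean;          // e.g., assessments or technical testing at least every 12 months
  obligationNoticeBusinessDays: number;      // draft rules contemplate notice within 5 business days
  allowsRemediationSteps: boolean;           // business may stop and remediate unauthorized use
  coversConsumerRequestCooperation: boolean; // business forwards rights requests; vendor assists
}

// Flag missing terms so gaps can be remediated before relying on
// service provider/contractor status.
function missingTerms(c: VendorContractChecklist): string[] {
  const gaps: string[] = [];
  if (!c.prohibitsSaleOrSharing) gaps.push("sale/sharing prohibition");
  if (c.specifiedBusinessPurposes.length === 0) gaps.push("specific business purposes");
  if (!c.restrictsProcessingToPurposes) gaps.push("processing restrictions");
  if (!c.prohibitsCombiningWithOtherSources) gaps.push("no-combination clause");
  if (!c.requiresCcpaLevelProtection) gaps.push("CCPA-level protection");
  if (!c.includesAuditProvisions) gaps.push("audit provisions");
  if (c.obligationNoticeBusinessDays > 5) gaps.push("5-business-day notice");
  if (!c.allowsRemediationSteps) gaps.push("remediation rights");
  if (!c.coversConsumerRequestCooperation) gaps.push("consumer request cooperation");
  return gaps;
}
```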

The Limitations on Internal Use of Customer Data by a Service Provider/Contractor. The draft rules provide that a service provider/contractor is restricted from using customer personal data for its own purposes, except for internal use to build or improve the quality of its services, provided that the service provider/contractor does not use the personal information to perform services on behalf of another person in a manner not permitted under the CCPA. This language is notably different from the governing CCPA rules. The examples outlined below, and the admonition above that the service provider cannot combine or update personal information received from another source unless permitted by the CCPA, make it ambiguous when updating personal information crosses the line. The examples suggest that where such functions facilitate personalized advertising or data sales, they would not fit within a service provider/contractor role.

Use for Analysis/Data Hygiene (Sometimes). The draft rules set forth two examples that seem to allow some analysis and data correction under particular circumstances. For example, the first illustration emphasizes that the service provider/contractor can analyze how a business customer’s consumers interact with company communications to improve overall services, and the second example highlights that a service provider/contractor can use customer data to identify and fix incorrect personal information and, as a result, improve services to others. The draft rules underscore, however, that a service provider/contractor could not compile (e.g., enrich/append) personal information for the purpose of sending advertising to another business or to sell such personal information.

Data Security/Fraud Prevention. Consistent with the statute, the draft rules allow service providers/contractors to use and combine customer personal information “[t]o detect data security incidents or protect against malicious, deceptive, fraudulent or illegal activity.”

Other Legal Purposes. The draft rules acknowledge that a service provider/contractor can use customer data to comply with other laws, lawful process, to defend claims, if the data is deidentified or aggregated, or does not include California personal information.

Advertising Service Provider Functions Look Limited. The draft rules acknowledge that a business can engage a service provider/contractor for advertising/marketing services, provided the services do not involve combining the personal information of consumers who have opted out with personal information from other sources. The draft rules also affirmatively reiterate that an entity that provides cross-contextual behavioral advertising is a third party and not a service provider/contractor.

  • As an example of what would cross the line, the draft rules provide that a service provider/contractor can provide non-personalized advertising based on aggregated or demographic information (ads based on gender, age range, or general geographic location), but could not, for example, share the business’s customer information with a social media platform to “identify users on the social media company’s platform to serve advertisements to them.” This example is stated without qualification as to what commitments the platform has provided regarding its own use of, and restrictions on, such data, or whether and how any other permitted “business purposes” under the CPRA may apply.
  • In another example, the draft rules provide that an advertising agency can be a service provider/contractor by providing contextual advertising services. Again, this example is set forth without reference to any other business purposes that may apply. However, one wonders whether the enforcement structure may inhibit broader interpretations where functions involve personalized advertising and analytics.
Third Parties that “Control the Collection” of Personal Information

Notice at Collection. The draft rules include new language providing that, when more than one party controls the collection of personal information, such as in connection with digital advertising, all such parties must provide a very detailed “notice at collection” that accounts for all parties’ business practices. As an example:

  • A “first party may allow another business, acting as a third party, to control the collection of personal information from consumers browsing the first party’s website. Both the first party that allows the third parties to collect personal information via its website, as well as the third party controlling the collection of personal information, shall provide a notice at collection.”
Both parties also would need to honor opt outs of sale/sharing, and the “notice at collection” would need to include “the names of all the third parties that the first party allows to collect personal information from the consumer,” or the first party can include in its “notice at collection” the information provided by the third party that would meet all of the requirements about its business practices. For example, a company that has a third party analytics tag on its website would need to post a conspicuous link to its “notice at collection” about the analytics company’s information practices on its homepage and all webpages that include the tag collecting personal information. The analytics company also would need to post a “notice at collection” on its website’s homepage. These requirements also apply offline, where applicable.

Honoring Opt Outs. Section 7051 provides that third parties are directly obligated to honor opt outs, including as conveyed through a global privacy signal or otherwise on a first-party business’s site hosting the third party’s tag collecting personal information, unless the first-party business informs the third party that the consumer has consented to the sale/sharing, or “the third party becomes a service provider or contractor that complies with the CCPA and these regulations.”

  • This latter provision is interesting because it suggests implicit support for frameworks, such as IAB’s LSPA, where a contract that contains commitments around use of personal data post-opt outs can support a continued service provider role.
The first-party business would also be required to “contractually require the third party to check for and comply with a consumer’s opt-out preference signal unless informed by the business that the consumer has consented to the sale or sharing of their personal information.” A contract must be in place with the first party in order for the third party to lawfully collect and use personal information from the first party’s site. The contract would need to comply with all of the express requirements for such third-party contracts under the CCPA. As with service providers/contractors, these contract provisions are very detailed, and due diligence and accountability provisions are also required.
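
For context on how an opt-out preference signal reaches a business in practice, the Global Privacy Control (GPC) specification conveys the signal as a “Sec-GPC: 1” HTTP request header, and supporting browsers also expose a navigator.globalPrivacyControl property to client-side scripts. The TypeScript sketch below is a minimal illustration assuming a Node/Express server and a hypothetical renderPage helper; it shows one way a first party might detect the header and withhold third-party ad tags for opted-out visitors, and is not a statement of what the draft rules require.

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Middleware: detect the Global Privacy Control (GPC) opt-out signal.
// The GPC spec sends a "Sec-GPC: 1" request header from opted-out users;
// client-side scripts can also check navigator.globalPrivacyControl.
function detectGpc(req: Request, res: Response, next: NextFunction): void {
  const gpc = req.header("Sec-GPC");
  // Treat the signal as an opt-out of sale/sharing unless the business has a
  // record that the consumer consented (as the draft rules contemplate).
  res.locals.optedOutOfSaleOrShare = gpc === "1";
  next();
}

app.use(detectGpc);

app.get("/", (_req: Request, res: Response) => {
  // Withhold third-party tags that would "sell" or "share" data for
  // opted-out visitors; serve only contextual/service-provider tags.
  const loadAdTags = !res.locals.optedOutOfSaleOrShare;
  res.send(renderPage({ loadAdTags }));
});

// Hypothetical page renderer, for illustration only.
function renderPage(opts: { loadAdTags: boolean }): string {
  return `<html><body>Ad tags enabled: ${String(opts.loadAdTags)}</body></html>`;
}

app.listen(3000);
```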

* * *

There is a lot to consider, and while all of these provisions remain subject to further changes, it is clear that the draft rules suggest a more exacting expectation of privacy compliance by companies doing business in California or otherwise with California residents, and an expansive new set of obligations to tighten such compliance within the information supply chain. We will cover in future blog posts how these draft rules contemplate other business obligations, including obligations around obtaining consent, privacy policies, responses to consumer privacy rights requests, the use of sensitive personal information, and the mechanics of complying with opt-outs of sales/shares and global privacy controls. If you are interested in submitting comments in the rulemaking process or have questions about privacy compliance, please reach out to members of Kelley Drye’s privacy team.

JOIN US

A Readout of the California Privacy Protection Agency's Draft Proposed CPRA Regulations

Separately, join us as Kelley Drye privacy lawyers provide observations on the proposed regulations, including which ones would pose the biggest challenges for businesses if implemented, and offer strategies to plan efficiently for compliance in the face of these proposals. Register here.

Webinar Replay: Teen Privacy Law Update
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/webinar-replay-teen-privacy-law-update
Fri, 20 May 2022

The replay for our May 19, 2022 Teen Privacy Law Update webinar is available here.

Protecting the privacy and safety of kids and teens online is receiving enormous attention lately from Congress, the States, the FTC, and even the White House. Further, just last month, BBB National Programs unveiled a Teenage Privacy Program Roadmap offering a comprehensive framework for companies to use in identifying and avoiding online harms impacting teens.

Amidst these developments, Kelley Drye held a webinar to discuss the unique challenges associated with teen privacy. Dona J. Fraser, Senior Vice President Privacy Initiatives, BBB National Programs, and Claire Quinn, Chief Privacy Officer, PRIVO, along with Kelley Drye’s Laura Riposo VanDruff provided an update on key concerns and developments related to teen privacy, as well as practical tips for companies seeking to address these issues.

To view the webinar recording, click here or view it on the new Ad Law Access App.

Subscribe to the Ad Law Access blog to receive real-time updates on privacy and other related matters.

The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.

Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.

For easy access to all of our webinars, posts and podcasts, download our new Ad Law Access App.

Privacy Priorities for 2022: Tracking State Law Developments
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/privacy-priorities-for-2022-tracking-state-law-developments
Fri, 25 Mar 2022

The replay for our April 28, 2022 Privacy Priorities for 2022: Tracking State Law Developments webinar is available here.

In the absence of a federal privacy law, privacy has been at the forefront of many states’ legislative sessions this year. Against this backdrop, state attorneys general continue to initiate investigations into companies’ privacy practices, and state agencies continue to advance privacy rulemakings under existing law. Aaron Burstein, Laura VanDruff, and Paul Singer presented this webinar to help attendees learn about the latest developments in state privacy law, make sense of those developments, and understand their practical impact.

To view the webinar recording, click here or view it on the new Ad Law Access App.

Subscribe to the Ad Law Access blog to receive real-time updates on privacy and other related matters.

The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.

Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.

For easy access to all of our webinars, posts and podcasts, download our new Ad Law Access App.

Age Appropriate Design Codes – Well Meaning, but Do They Make for Good Law?
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/age-appropriate-codes-well-meaning-but-do-they-make-for-good-law
Sun, 20 Mar 2022

As we’ve discussed here, there’s bipartisan momentum in Congress to enact stronger privacy protections for kids and teens – and specifically, tools that would enable minors and their parents to limit algorithms and online content that fuel self-harm and addictive behaviors. These efforts, reflected in several federal bills (see here and here) and now in a California bill too, build on months of testimony by a social media insider and are modeled in large part on the UK’s Age Appropriate Design Code.

In his State of the Union address, the President added to this momentum, calling on Congress to enact stronger protection for kids – a move that was heralded in the media as a potential “game changer” for privacy that could “help clear the logjam on Capitol Hill.” (Relatedly, report language accompanying the recently signed budget bill directs the FTC to prioritize kids’ privacy in its enforcement efforts.)

It’s certainly understandable why U.S. policymakers would want to protect the privacy and safety of minors. It’s also notable that they are focusing on an area where bipartisan action might be possible and emphasizing the safety aspects of these bills (as if the word “privacy” would jinx the effort while “safety” might garner more support). But, looking past the good intentions to protect kids, some of the concepts and language in these bills pose real challenges as to clarity and enforceability.

Focusing on just a few:

  • Best interests of the minor. The bills generally require companies to design and operate online services used by minors with the minors’ best interests as a primary consideration.
    • This language raises real questions about implementation and enforceability. While the bills sometimes include factors to consider (e.g., the types of harms to avoid), or authorize rulemakings or taskforces to flesh out the standards, this language is rife with subjectivity and will be difficult to interpret and apply.
    • For example, if a company demonstrates that it made a good faith effort to develop policies to address this issue, will that be sufficient? Will companies be able to develop a uniform set of criteria that apply to all minors when these types of judgments are normally left to parents? Will rulemakings or taskforces really be able to flesh out the standards in a way that the bill-drafters apparently concluded they couldn’t?
  • Avoiding “dark patterns” or “nudge” techniques. The bills generally state that companies should avoid design interfaces or techniques that cause excessive use of an online service, or that encourage minors to provide more data, forego privacy protections, or engage in harmful behaviors.
    • Some aspects of these standards will be easier to apply than others. For example, it seems clear that companies shouldn’t expressly offer incentives to minors to provide more personal data or change settings. Nor should they feature bold, enticing “yes” options for data collection and sharing, in contrast to tiny or hidden “no” choices. And, of course, it shouldn’t be more difficult to cancel a service than it is to sign up.
    • But so much of this lies in a grey area. Is it a “dark pattern” to allow minors to win and advance in a game which, as parents well know, keeps kids playing? What about gaming interfaces with vivid graphic pictures and details – a dominant feature of the most popular video games? Will they go the way of Joe Camel (the ubiquitous, cartoon character in tobacco ads that ended amidst controversy and litigation in the late 90s)? Is a portal used by children inherently problematic because it encourages minors to return again and again to access varied and changing content? And, of particular relevance to the concerns that are driving these efforts, will companies be expected to block content on bulimia, suicide, cutting, or sexual activity if that’s precisely the information young teens are searching for?
  • Likely to be accessed by a minor. Many of the bills’ provisions – including the best interest and dark patterns requirements, as well as provisions requiring parental controls and strong default settings – are tied to whether an online service is “likely to be accessed by a minor.”
    • This standard is very confusing and will be extremely difficult to apply. In contrast to COPPA – which covers online services “directed to children” or circumstances where an online service has actual knowledge a user is a child – this standard will require companies to anticipate access by minors even if the company hasn’t designed its service for minors, and even if it has no specific knowledge that minors are using it.
    • Although COPPA has been criticized as too narrow, this new standard could be entirely unworkable. While some companies know full well that minors are using their services, others don’t. Will this approach inevitably lead to universal identification and age-gating of all users of all online services? Given the ease with which minors can outwit age-gates, will that even be sufficient, or will companies need to set up more comprehensive data collection and monitoring systems? And would these outcomes really advance user privacy?

Certainly, the concerns driving these efforts – the harmful effects of social media on minors – are serious ones. They also unite members from different political parties, which is always a welcome development. However, as policymakers and stakeholders study these bills, they will likely (or hopefully) realize just how difficult implementation would be, sending them back to the drawing board for another try. Or maybe they will ultimately conclude that comprehensive privacy legislation is still the better approach.

How the Utah Consumer Privacy Act Stacks Up Against Other State Privacy Laws
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/how-the-utah-consumer-privacy-act-stacks-up-against-other-state-privacy-laws
Thu, 17 Mar 2022

As companies wait to see whether the Utah Consumer Privacy Act (UCPA) becomes the fourth comprehensive state privacy law, we are providing an overview of some of the Act’s key provisions – and how they depart from comprehensive privacy laws in California, Colorado, and Virginia.

Utah’s Senate unanimously passed the UCPA on February 25. The House – also through a unanimous vote – followed on March 2. The Legislature sent the UCPA to Governor Spencer Cox on March 15. Because the Legislature adjourned on March 4, Governor Cox has 20 days from the date of adjournment – March 24 – to sign or veto the Act. If Governor Cox takes no action, the UCPA will become law, with an effective date of December 31, 2023.

In broad strokes, the UCPA is similar to the Virginia Consumer Data Protection Act (VCDPA) and Colorado Privacy Act (CPA). And, like the laws in Colorado and Virginia, the UCPA borrows some concepts from the CCPA – including a version of the right to opt out of the “sale” of personal data.

However, the UCPA pares back important features of all three of these laws. Some of the significant changes include:

  • Applicability. The UCPA’s applicability is narrower than that of the three other comprehensive state privacy laws. The UCPA applies only to controllers or processors that (1) do business in the state (or target Utah residents with products or services); (2) earn at least $25 million in revenue; and (3) either: (a) control or process personal data of 100,000 or more consumers in a calendar year; or (b) derive more than 50 percent of gross revenue from selling personal data and control or process data of 25,000 or more consumers (see the sketch following this list). By contrast, the $25 million revenue threshold is an independent basis for the CCPA to apply to a business, and neither the CPA nor the VCDPA includes a revenue-based threshold.
  • Exemptions. In addition to exempting personal data that is subject to sector-specific privacy laws and regulations, such as HIPAA, the Gramm-Leach-Bliley Act, and the Fair Credit Reporting Act, the UCPA provides that the Act does not apply to certain entities, including tribes, institutions of higher education, and nonprofit corporations.
  • Sale and Targeted Advertising Opt-Out Rights. Although the UCPA requires controllers to provide consumers with the ability to opt out of sale and targeted advertising, the Act does not provide a right to opt out of profiling (or otherwise address profiling). Like the VCDPA, the UCPA restricts the definition of “sale” to “the exchange of personal data for monetary consideration by a controller to a third party.” This definition does not include “other valuable consideration,” found in the definitions of “sale” under the CCPA and CPA.
  • Opt-Out Consent to Process Most Sensitive Data. The UCPA does not require opt-in consent to process most sensitive data, unless the data “concern[s] a known child,” unlike the opt-in requirements of the CPA and VCDPA. Instead, the UCPA requires controllers to “present[] the consumer with clear notice and an opportunity to opt out” of sensitive data processing.
  • Other Consumer Rights. The UCPA provides consumers the right to confirm processing and to delete personal data they provided to a controller. Consumers also have the right to obtain a portable copy of personal data that the consumer “previously provided to the controller.” This “provided to” language follows the VCDPA’s access and portability right and contrasts with obligations to provide personal data “concerning” (CPA) or “about” (CCPA) a consumer. The UCPA does not provide a right of correction or accuracy.
  • Enforcement and Regulation. The UCPA does not include a private cause of action, nor does it authorize the Attorney General or other state official or agency to issue regulations. The Division of Consumer Protection, in the Utah Department of Commerce, investigates potential violations and can refer an action to the Utah Attorney General for enforcement. The Attorney General can recover actual damages for consumers and a penalty of up to $7,500 per violation, but only after a 30 day notice and right to cure period.
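
To make the conditional structure of the UCPA thresholds concrete, here is a minimal sketch of the applicability test as a boolean check. It is our own illustration, not statutory text or legal advice, and the type and field names are hypothetical.

```typescript
// Hypothetical inputs describing a company's Utah footprint; field names are
// our own shorthand, not statutory language.
interface UtahFootprint {
  doesBusinessInUtahOrTargetsResidents: boolean;
  annualRevenueUsd: number;
  utahConsumersProcessedPerYear: number;
  shareOfGrossRevenueFromSellingPersonalData: number; // 0.0 to 1.0
}

// Sketch of the UCPA applicability test described above: all three prongs
// must be met, and the third prong can be satisfied by either sub-threshold.
function ucpaLikelyApplies(f: UtahFootprint): boolean {
  const prong1 = f.doesBusinessInUtahOrTargetsResidents;
  const prong2 = f.annualRevenueUsd >= 25_000_000;
  const prong3 =
    f.utahConsumersProcessedPerYear >= 100_000 ||
    (f.shareOfGrossRevenueFromSellingPersonalData > 0.5 &&
      f.utahConsumersProcessedPerYear >= 25_000);
  return prong1 && prong2 && prong3;
}
```
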
From a preparation and compliance standpoint, the UCPA – if it becomes law – might not be a game-changer for companies that have built their privacy programs around California’s requirements. The Kelley Drye team will explore some of the details of all four state laws – as well as compliance strategy considerations – during a webinar on March 24 beginning at 4:00 pm EDT. In the meantime, we will keep a close eye on developments in Utah and elsewhere.
| | Colorado Privacy Act (CPA) | Virginia Consumer Data Protection Act (VCDPA) | California Consumer Privacy Act (CCPA as amended by CPRA) | Utah Consumer Privacy Act (UCPA) |
| --- | --- | --- | --- | --- |
| Thresholds to Applicability | Applies to a controller that (1) conducts business in CO or targets products or services to CO residents and (2) meets either of these thresholds: (a) controls or processes personal data of at least 100,000 consumers in a calendar year; or (b) derives revenue or receives a discount on the price of goods or services from selling personal data and controls or processes personal data of at least 25,000 consumers | Applies to a person that (1) conducts business in VA or targets products or services to VA residents and (2) meets either of these thresholds: (a) controls or processes personal data of at least 100,000 consumers; or (b) controls or processes personal data of at least 25,000 consumers and derives over 50% of gross revenue from the sale of personal data | Applies to a “business” that (1) conducts business in CA and collects personal information of CA residents and (2) (a) has $25 million or more in annual revenue for the preceding calendar year as of Jan. 1 of the calendar year; (b) annually buys, sells, or shares personal information of more than 100,000 consumers or households; or (c) earns more than 50% of its annual revenue from selling or sharing consumer personal information | Applies to a controller or processor that (1) conducts business in UT or targets products or services to UT residents; (2) has $25 million or more in annual revenue; and (3) satisfies one of these thresholds: (a) during a calendar year, controls or processes personal data of 100,000 or more consumers; or (b) derives over 50% of gross revenue from the sale of personal data and controls or processes personal data of 25,000 or more consumers |
| Opt-in Consent | Required to process sensitive data | Required to process sensitive data | Required to sell or “share” personal information of minors under age 16 | Not required for sensitive data (unless the data concerns a known child, in which case parental consent is required under COPPA) |
| Opt-Out | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for profiling, cross-contextual advertising, and sale; right to limit use and disclosure of sensitive personal information | Required for targeted advertising and sales |
| Other Consumer Rights | Access, Portability, Deletion, Correction | Access, Portability, Deletion, Correction | Access, Deletion, Correction, Portability | Access, Portability, and Deletion |
| Authorized Agents | Permitted for opt-out requests | N/A | Permitted for all consumer rights requests | N/A |
| Appeals | Must create process for consumers to appeal refusal to act on consumer rights | Must create process for consumers to appeal refusal to act on consumer rights | N/A | N/A |
| Private Right of Action | No | No | Yes, for security breaches involving certain types of sensitive personal information | No |
| Cure Period | 60 days, until the provision expires on Jan. 1, 2025 | 30 days | 30-day cure period is repealed as of Jan. 1, 2023 | 30 days |
| Data Protection Assessments | Required for targeted advertising, sale, sensitive data, certain profiling | Required for targeted advertising, sale, sensitive data, certain profiling | Annual cybersecurity audit and risk assessment requirements to be determined through regulations | N/A |
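For teams that want to operationalize the applicability thresholds summarized above, the tests reduce to a handful of boolean checks. The following is a rough TypeScript sketch under simplifying assumptions: the CompanyProfile fields and function names are illustrative, each statute's definitions and exemptions are omitted, and nothing here is legal advice.

```typescript
// A rough, simplified sketch (not legal advice) of the applicability thresholds
// summarized in the table above. The CompanyProfile shape and function names are
// illustrative assumptions; the statutes contain definitions and exemptions
// that this sketch omits.

interface CompanyProfile {
  inStateNexus: boolean;         // conducts business in the state or targets its residents
  annualRevenueUSD: number;      // annual gross revenue
  consumersProcessed: number;    // state residents whose personal data is controlled or processed per year
  sellsPersonalData: boolean;    // derives revenue or receives a discount from selling personal data
  revenueShareFromSales: number; // 0..1 share of gross revenue from selling (or sharing) personal data
}

// UCPA: nexus AND $25M+ revenue AND one of the consumer/revenue prongs.
function ucpaApplies(c: CompanyProfile): boolean {
  return (
    c.inStateNexus &&
    c.annualRevenueUSD >= 25_000_000 &&
    (c.consumersProcessed >= 100_000 ||
      (c.revenueShareFromSales > 0.5 && c.consumersProcessed >= 25_000))
  );
}

// VCDPA: nexus AND (100,000 consumers OR 25,000 consumers plus >50% revenue from sales).
function vcdpaApplies(c: CompanyProfile): boolean {
  return (
    c.inStateNexus &&
    (c.consumersProcessed >= 100_000 ||
      (c.consumersProcessed >= 25_000 && c.revenueShareFromSales > 0.5))
  );
}

// CPA: no revenue floor; selling personal data plus 25,000 consumers is enough.
function cpaApplies(c: CompanyProfile): boolean {
  return (
    c.inStateNexus &&
    (c.consumersProcessed >= 100_000 ||
      (c.sellsPersonalData && c.consumersProcessed >= 25_000))
  );
}

// CCPA: any one of the revenue, volume (consumers or households), or revenue-share prongs.
function ccpaApplies(c: CompanyProfile): boolean {
  return (
    c.inStateNexus &&
    (c.annualRevenueUSD >= 25_000_000 ||
      c.consumersProcessed > 100_000 ||
      c.revenueShareFromSales > 0.5)
  );
}

// Example: a $30M retailer selling data on 40,000 residents, 60% of revenue from sales.
const example: CompanyProfile = {
  inStateNexus: true,
  annualRevenueUSD: 30_000_000,
  consumersProcessed: 40_000,
  sellsPersonalData: true,
  revenueShareFromSales: 0.6,
};
console.log({
  ucpa: ucpaApplies(example),   // true
  vcdpa: vcdpaApplies(example), // true
  cpa: cpaApplies(example),     // true
  ccpa: ccpaApplies(example),   // true
});
```

Modeling each statute as a simple predicate makes it easy to see, for example, that Utah's revenue floor is conjunctive while California's prongs are disjunctive; an actual applicability analysis would also turn on statutory definitions (such as "consumer" and "sale") and entity- and data-level exemptions not modeled here.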

Lina Khan’s Privacy Priorities – Time for a Recap https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/lina-khans-privacy-priorities-time-for-a-recap https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/lina-khans-privacy-priorities-time-for-a-recap Wed, 16 Mar 2022 11:47:07 -0400

Rumors suggest that Senator Schumer is maneuvering to confirm Alvaro Bedoya as FTC Commissioner sooner rather than later, which would give FTC Chair Khan the majority she needs to move forward on multiple fronts. One of those fronts is consumer privacy, for which Khan has announced ambitious plans (discussed here and here) that have stalled for lack of Commissioner votes. With Bedoya potentially on deck, now seems like a good time to recap those plans, as they might provide clues about what’s in the pipeline awaiting Bedoya’s vote. We focus here on three priorities Khan has emphasized in statements and interviews since becoming Chair.

Privacy Rulemakings

At the top of the list are privacy rulemakings, which could create baseline standards for the entire marketplace and enable the FTC to obtain monetary relief in its cases. (Recall that the FTC has limited authority to obtain money in its cases, especially post AMG, but that it can seek penalties or redress when it’s enforcing a rule.) Last December, Khan issued a Statement of Regulatory Priorities detailing the privacy rulemakings she wants to initiate or complete, including:

  • New rules to halt “abuses stemming from surveillance-based business models,” which could curb “lax security practices” and “intrusive surveillance,” “ensur[e] that algorithmic decision-making does not result in unlawful discrimination,” and potentially limit the use of “dark patterns” to manipulate consumers. (Yes, this is an ambitious one.)
  • Possible amendments to existing privacy rules – including the Children’s Online Privacy Protection (COPPA) Rule, the Health Breach Notification Rule, the Safeguards Rule (breach notification requirements), and the FACTA Identity Theft Rules (including the Red Flags Rule).
  • Possibly other new rules to “define with specificity unfair or deceptive acts or practices.”

Of note, absent Congressional legislation, any new privacy rules would need to follow the arduous process detailed in Section 18 of the FTC Act (referred to as “Mag-Moss” rulemaking). With Bedoya on board, the FTC can start these rulemakings, but they could still take years to complete, as we discuss here.

By contrast, the FTC can amend its existing privacy rules under the more manageable Administrative Procedure Act. Further, it’s already in the midst of rule reviews for all of the rules listed above (including the COPPA Rule review, which started back in 2019). As a result, the FTC could act on these rules relatively quickly once Bedoya is on board.

Focus on Platforms

Khan has also made clear that she intends to focus on the tech platforms – which she has described as “gatekeepers” that use their critical market position to “dictate terms,” “protect and extend their market power,” and “degrade privacy without ramifications.” In a statement and accompanying staff report last September, Khan stated that such efforts would include:

  • Additional compliance reviews of the platforms currently subject to privacy orders (Facebook, Google, Microsoft, Twitter and Uber), followed by order modifications and/or enforcement as necessary.
  • As resources permit, examining the privacy implications of mergers, as well as potential COPPA violations by platforms and other online services – COPPA being of special importance as children have increasingly relied on online services during the pandemic. (Relatedly, report language accompanying the omnibus budget just signed into law directs the FTC to prioritize COPPA enforcement.)
  • Completion of the pending Section 6(b) study of the data practices of the social media companies and video streaming services, which was initiated in December 2020.

So far, we’ve seen limited action from the FTC on platforms (at least on the consumer protection side). Last October, the FTC issued a 6(b) report on the privacy practices of ISPs, but largely concluded that the topic should be addressed by the FCC. Then, in December, the FTC announced a settlement with online ad platform OpenX for COPPA violations. Given Khan’s bold plans in this area, it seems likely that there are matters in the pipeline awaiting Bedoya’s vote.

Stronger Remedies

The third major area that Khan has highlighted is obtaining stronger remedies in privacy cases – that is, considering “substantive limits”, not just procedural protections that “sidestep[] more fundamental questions about whether certain types of data collection and processing should be permitted in the first place.” By this, Khan is referring to deletion of data and algorithms, bans on conduct, notices to consumers, stricter consent requirements, individual liability, and monetary remedies based on a range of theories post AMG.

As to this priority, the FTC has moved ahead where it can (even prior to Khan’s tenure), often using strategies that have been able to garner unanimous votes. For example, its settlements with photo app Everalbum (for alleged deception) and WW International (for alleged COPPA violations) required deletion of consumer data and algorithms alleged to have been obtained illegally. Its settlement with fertility app Flo Health (for alleged deception about data sharing) required the company to notify affected consumers and instruct third parties that received their data to destroy it. The FTC also has alleged rule violations where possible, and partnered with other agencies to shore up its ability to obtain monetary relief.

But we’ve also seen signs of a more combative approach that could increase when Khan has the votes to push it forward. Of note, last September, the FTC issued an aggressive interpretation of the Health Breach Notification Rule, purporting to extend the rule’s reach (and thus its penalties) to virtually all health apps, even though a rule review was already underway. Further, FTC staff are making strong, often unprecedented demands for penalties, bans, and individual liability in consent negotiations. It’s even possible, based on an article written by former Commissioner Chopra and now-BCP Director Sam Levine, that the agency could attempt to use penalty offense notice letters (explained here) to lay the groundwork for penalties in privacy cases under Section 5(m)(1)(B). However, given the paucity of administratively litigated privacy cases (a key requirement under 5(m)(1)(B)), this would be very aggressive indeed.

* * *

For more on Khan’s privacy plans, you can read our earlier blogposts (here and here), as well as the various FTC statements and reports cited in this post. Or, if you like surprises, you can simply wait for Bedoya to be confirmed and see what happens. Needless to say, things should speed up at the FTC when he arrives.

Privacy Priorities for 2022: Tracking State Law Developments Thursday, March 24, 2022 at 4:00pm ET/ 1:00pm PT Register Here

In the absence of a federal privacy law, privacy has been at the forefront of many states’ legislative sessions this year:

  • Utah is poised to be the fourth state to enact comprehensive privacy legislation
  • Florida came close to passing legislation when the State House advanced privacy legislation by a significant margin
  • Other state legislatures have privacy bills on their calendars

Against this backdrop, state attorneys general continue to initiate investigations into companies’ privacy practices, and state agencies continue to advance privacy rulemakings under existing law.

Please join us on Thursday, March 24 at 4:00 pm ET for this webinar to learn about the latest developments in state privacy law, make sense of these developments and understand their practical impact.

Day in the Life of a Chief Privacy Officer https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/day-in-the-life-of-a-chief-privacy-officer https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/day-in-the-life-of-a-chief-privacy-officer Thu, 17 Feb 2022 00:30:48 -0500 On this special episode, Privacy and Information Security practice chair Alysa Hutnik chats with Shana Gillers, TransUnion’s Chief Privacy Officer. Alysa and Shana discuss the journey to becoming a chief privacy officer, hot topics, and what it takes to stay on top of your game in privacy today.

Watch a video version here or the audio version here.

Shana Gillers

Shoshana Gillers has served as TransUnion’s Chief Privacy Officer since September 2019. In this role Ms. Gillers oversees compliance with privacy laws across TransUnion’s global footprint and promotes a culture of responsible data stewardship.

Prior to joining TransUnion, Ms. Gillers spent four years at JPMorgan Chase, ultimately serving as Vice President and Assistant General Counsel, Responsible Banking, Data and Privacy. Previously, she served as a federal prosecutor for eight years at the U.S. Attorney’s Office in Chicago, and as a litigator for four years at WilmerHale in New York. Ms. Gillers clerked for the Hon. Robert D. Sack on the U.S. Court of Appeals for the Second Circuit and for the Hon. Aharon Barak on the Supreme Court of Israel.

Ms. Gillers received a B.A. from Columbia University, summa cum laude, and a J.D. from Yale Law School.

Alysa Z. Hutnik

Alysa chairs Kelley Drye’s Privacy and Information Security practice and delivers comprehensive expertise in all areas of privacy, data security and advertising law. Her experience ranges from strategic consumer protection oriented due diligence and compliance counseling to defending clients in FTC and state attorneys general investigations and competitor disputes.

Prior to joining the firm, Alysa was a federal clerk for the Honorable Joseph R. Goodwin, United States District Judge, Southern District of West Virginia.

Alysa received a B.A. from Haverford College, and a J.D. from the University of Maryland Carey School of Law.

Upcoming webinar on recent FTC privacy developments and predictions for 2022 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/upcoming-webinar-on-recent-ftc-privacy-developments-and-predictions-for-2022 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/upcoming-webinar-on-recent-ftc-privacy-developments-and-predictions-for-2022 Fri, 11 Feb 2022 02:56:35 -0500

Please join us for a webinar on February 24, 2022 at 4 p.m. on recent and upcoming FTC developments. The webinar will feature Kelley Drye’s Jessica Rich and Aaron Burstein, both former FTC officials. Here’s a taste of what we’ll be discussing, building on the commentary we have posted in this blog over the past few months:

All eyes are on the FTC this year, given its recent actions, setbacks, and ambitious plans for 2022.

As we’ve reported here, Chair Lina Khan has announced an aggressive privacy agenda that includes new regulations; an emphasis on the large platforms and other “gatekeepers” in the marketplace; stringent enforcement remedies (such as data deletion, bans on conduct, strict consent requirements, and individual liability); and significant monetary relief based on a range of creative theories.

Khan has already taken steps in this direction, including by issuing a policy statement and guidance reinterpreting the Health Breach Notification Rule; announcing a ramp-up against subscription services that use “dark patterns” to trick consumers into signing up; tightening requirements under the Gramm-Leach-Bliley Safeguards Rule; and making strong demands in consent negotiations. In addition, she has announced plans to initiate privacy rulemakings under the FTC’s so-called “Magnuson-Moss” authority, including a rulemaking to limit “surveillance” in the commercial marketplace.

All of this takes place against the backdrop of recent setbacks and ongoing challenges faced by the agency. Last year, the Supreme Court ruled in AMG that the FTC cannot obtain monetary relief under Section 13(b) of the FTC Act, its chief law enforcement tool. For years, Congress has declined to pass a federal privacy law to strengthen the FTC’s authority in this area. The FTC has limited resources to fulfill its broad mission. And it cannot obtain civil penalties for most first-time law violations.

We will dive into these issues and more in our upcoming webinar, focusing on the practical impact for companies subject to FTC’s jurisdiction. Please join us on Thursday, February 24 at 4:00 pm EST for this second installment of Kelley Drye's 2022 practical privacy series. Register here.

New Mexico Attorney General Settles Google Children’s Privacy Cases: A Unique Settlement Adds to a Complicated Landscape https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-mexico-attorney-general-settles-google-childrens-privacy-cases-a-unique-settlement-adds-to-a-complicated-landscape https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-mexico-attorney-general-settles-google-childrens-privacy-cases-a-unique-settlement-adds-to-a-complicated-landscape Thu, 16 Dec 2021 15:38:52 -0500 On December 13, the New Mexico Attorney General announced a settlement with Google to resolve claims regarding children’s privacy, including in the burgeoning EdTech space. The federal lawsuits Balderas v. Tiny Lab Productions, et al. and Balderas v. Google LLC, respectively, alleged COPPA and privacy violations related to collection of children’s information on game developer Tiny Lab’s apps and on Google’s G Suite for Education products. There are many features of this settlement that are worth discussing further as either potential future trends, or novel provisions.

Privacy Compliance Provisions

New Mexico’s injunction related to the Tiny Lab case includes changes to Google Play which will take effect after 120 days. Some of the specific measures include:

  • revising Google Play Families policies and including additional help pages to assist app developers in compliance;
  • requiring all developers to complete a form to indicate the targeted age group of apps;
  • using a rubric to evaluate app submissions to help determine whether an app appeals to kids and to check for consistency with the age group form;
  • requiring Families apps to certify they will comply with COPPA;
  • requiring all apps to only use SDKs that certify compliance with Google’s policies including COPPA;
  • requiring developers of Families apps to disclose collection of any children’s data including through third parties;
  • requiring a link to the app’s privacy policy on the Google Play store page; and
  • communicating to AdMob whether an app is Child Directed, so that AdMob will then follow COPPA with respect to that data.
The help pages the injunction requires do not just contain answers to frequently asked questions; they also prescribe certain decisions by, and limitations on, third parties using the Google Play store. For example, Exhibit 3 to the injunction provides that “if you serve ads in your app and your target audience only includes children, then you must use Google Play certified SDKs.”

In addition to these injunctive provisions, Google agreed to a set of voluntary enhancements to the Google Education platform intended to promote safety for students. New Mexico’s enforcement of these provisions is limited to its ability to confirm that Google has made the changes, or inquire as to the status of changes not made.

These injunctions demonstrate continued state Attorney General scrutiny regarding children’s information. And they come at a time that the Federal Trade Commission, which is responsible for issuing the COPPA Rule, is redoubling its COPPA efforts. The FTC’s ongoing COPPA Rule Review includes a number of questions regarding the intersection of COPPA and education technology. The FTC’s Statement of Regulatory Priorities, which we wrote about here, identifies COPPA as a top priority. And just this week, the FTC released its first COPPA settlement in almost 18 months.

Additional Settlement Terms Depart from Historical State Settlements

Several other provisions of the settlement also have unique aspects that are noteworthy. Google has agreed to pay New Mexico $5.5 million – with $1.65 million of that going to outside counsel for the state. The remaining payment will be used to fund the “Google New Mexico Kids Initiative” – a program jointly run by Google and New Mexico to award grants to schools, educational institutions, charitable organizations, or governmental entities. This unique allocation of the payment could invite the scrutiny that other State Attorney General settlements have met in the past when they attempted to designate funds for specific third-party recipients. Some state legislatures may see it as an effort to appropriate funds without their involvement.

While New Mexico reserves its rights under the agreement regarding public statements, it has agreed to provide Google 24-hour notice before making any written public statement. Moreover, New Mexico agrees to consider in good faith any suggestions or input Google has, and any statement will reference the parties’ shared commitment to innovation and education. States routinely resist efforts to negotiate press in this manner, and it is unclear how enforceable such a provision really is. That said, it certainly reflects the cooperative nature of the agreement, and it is fair to assume the State would have issued press reflecting that cooperation in any event.

Google and New Mexico have also agreed to an ADR provision, requiring the state to pursue any disputes relating to the agreement in mediation prior to pursuing relief. This again is fairly unique for a State AG settlement, as is the overall form of the document (a “Settlement Agreement and Release”) – normally states will only settle matters through a consent judgment or a statutorily authorized Assurance of Compliance or Discontinuance. But just like some of the other unique provisions, agreeing to ADR may be more of a reflection of the cooperative nature of the agreement, and certainly presents opportunity for a more streamlined enforcement mechanism in the future.

It remains to be seen if these provisions will serve as a template for future state agreements with other companies, but given that state Attorneys General continue to pursue Google on a variety of fronts[1], New Mexico’s settlement will certainly be relevant in any future settlement efforts.

[1] Google Search Manipulation, Google Ad Tech, Google DOJ Search Monopoly, State of Arizona v. Google LLC geolocation privacy

Jessica Rich and Laura Riposo VanDruff, Two Former Senior FTC Officials, Join Kelley Drye’s Privacy and Advertising Practices https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/jessica-l-rich-and-laura-riposo-vandruff-two-former-senior-ftc-officials-further-bolstering-kelley-dryes-privacy-and-advertising-practices https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/jessica-l-rich-and-laura-riposo-vandruff-two-former-senior-ftc-officials-further-bolstering-kelley-dryes-privacy-and-advertising-practices Wed, 08 Sep 2021 13:33:07 -0400 We are thrilled that Jessica Rich and Laura Riposo VanDruff have joined the firm’s Privacy and Advertising practice groups. Both attorneys are former top officials at the Federal Trade Commission (FTC), with Rich having served as Director of the Bureau of Consumer Protection (BCP) and VanDruff as an Assistant Director in BCP’s Division of Privacy and Identity Protection (DPIP).

Jessica and Laura join our impressive list of former FTC officials, including the firm’s managing partner, Dana Rosenfeld, who served as Assistant Director of BCP and attorney advisor to FTC Chairman Robert Pitofsky; former Bureau Directors Bill MacLeod and Jodie Bernstein; and Aaron Burstein, who served as senior legal advisor to FTC Commissioner Julie Brill.

Jessica served at the FTC for 26 years and led major initiatives on privacy, data security, and financial consumer protection. She is credited with expanding the FTC’s expertise in technology and was the driver behind FTC policy reports relating to mobile apps, data brokers and Big Data, the Internet of Things, and federal privacy legislation. She also directed the agency’s development of significant privacy rules, including the Children’s Online Privacy Protection Rule and Gramm-Leach-Bliley Safeguards Rule. She is a recipient of the FTC Chairman’s Award, the agency’s highest award for meritorious service and the first-ever recipient of the Future of Privacy Forum’s Leadership Award. Jessica is also a fellow at Georgetown University’s Institute for Technology Law & Policy. Prior to joining Georgetown, she was an Independent Consultant with Privacy for America, a business coalition focused on developing a framework for federal privacy legislation.

Laura also brings significant experience to Kelley Drye. As Assistant Director for the FTC’s Division of Privacy & Identity Protection, Laura led the investigation and prosecution of matters relating to consumer privacy, credit reporting, identity theft, and information security. Her work included investigation initiation, pre-trial resolution, trial preparation, and trial practice relating to unreasonable software security, mobile operating system security update practices, and many other information privacy and identity protection issues. She joins the firm from AT&T where she served as an Assistant Vice President – Senior Legal Counsel advising business clients on consumer protection risks, developing and executing strategies in response to regulatory inquiries, and participating in policy initiatives within the company and across industry.

Jessica and Laura are an impressive duo and are sure to be an asset to our clients as they prepare for the future of privacy and evolving consumer protection law.

* * *

Subscribe here to Kelley Drye’s Ad Law News and Views newsletter to see another side of Jessica, Laura and others in our second annual Back to School issue. Subscribe to our Ad Law Access blog here.

Colorado Passes Privacy Bill: How Does it Stack Up Against California and Virginia? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/colorado-passes-privacy-bill-how-does-it-stack-up-against-california-and-virginia https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/colorado-passes-privacy-bill-how-does-it-stack-up-against-california-and-virginia Wed, 09 Jun 2021 14:03:30 -0400 Update: Governor Polis signed SB 21-190 into law on July 7, 2021; see our updated blog post here.

The Colorado Legislature recently passed the Colorado Privacy Act (“ColoPA”), joining Virginia and California as states with comprehensive privacy legislation. Assuming Colorado Governor Jared Polis signs the bill (SB 21-190) into law, ColoPA will go into effect on July 1, 2023.

How does the measure stack up against the VCDPA and the CCPA (as amended by CPRA)? The good news is that, in broad terms, ColoPA generally does not impose significant new requirements that aren’t addressed under the CCPA or VCDPA. Below, we compare key provisions of ColoPA against California’s and Virginia’s laws and call attention to a few areas where Colorado has struck out on its own.

  • Establishing consumer rights. As with the VCDPA and the CCPA, ColoPA provides rights for access, deletion, correction, portability, and opt out for targeted advertising, sales, and certain profiling decisions that have legal or similar effects. Unlike CCPA, Colorado consumers can only use an authorized agent for sale opt-out requests.
  • Universal opt-out requests. ColoPA also requires the Attorney General to establish technical specifications for a universal targeted advertising and sale opt-out (e.g., a global privacy control) by July 1, 2023, which controllers must honor starting July 1, 2024. Note that there will also be CPRA regulations on this point, with compliance likely due by January 1, 2023. Unlike the CPRA, which makes the global privacy control optional, ColoPA requires controllers to comply with the universal opt-out. (A short sketch of what honoring such a signal can look like appears after this list.)
  • Appealing consumer rights decisions. Like Virginia, ColoPA requires controllers to set up mechanisms permitting consumers to appeal a controller’s decision not to comply with a consumer’s request. The controller must then inform the consumer of its reasons for rejecting the request and also inform the consumer of his or her ability to contact the Attorney General “if the consumer has concerns about the result of the appeal.”
  • Requiring data protection assessments. Similar to GDPR, and consistent with the VCDPA, ColoPA requires data protection assessments (“DPAs”) for certain processing activities, namely, targeted advertising, sales, certain profiling, and processing of sensitive personal data. As with Virginia, the Colorado Attorney General has the right to request copies of a controller’s DPAs.
  • Consent for certain processing. Again following Virginia’s lead, ColoPA requires opt-in consent for the processing of sensitive personal information, which covers categories such as racial or ethnic origin, religious beliefs, citizenship, or genetic or biometric data used for uniquely identifying an individual. ColoPA also requires consent for processing children’s data, with a “child” being any individual under the age of 13. Unlike the VCDPA, ColoPA does not require COPPA-compliant consent for such processing, but ColoPA does exempt from the law personal data that is processed consistent with COPPA requirements.
  • Right to cure. ColoPA allows controllers to cure violations and is unique both in establishing the longest cure period (60 days) and in sunsetting that provision on January 1, 2025. Thus, while the Attorney General initially must give a controller notice and an opportunity to cure any violation before taking enforcement action, the Attorney General will be able to act without such notice from January 1, 2025 onward.
  • Establishing controller duties. ColoPA establishes certain duties for controllers, including the duties of transparency, purpose specification, data minimization, avoiding secondary use, care, avoiding unlawful discrimination, and duties regarding sensitive data. These duties create related obligations, such as providing a privacy policy, establishing security practices to secure personal data, and obtaining consent prior to processing sensitive data or children’s data.
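Returning to the universal opt-out bullet above: on the technical side, a browser-based signal such as the Global Privacy Control (GPC) reaches a controller as an HTTP request header. The sketch below is a minimal illustration, assuming the “Sec-GPC: 1” header described in the public GPC proposal; the type and function names are our own assumptions, nothing here reflects what the Colorado Attorney General’s forthcoming technical specifications will actually require, and this is not legal advice.

```typescript
// A minimal sketch, assuming the "Sec-GPC: 1" request header from the public
// Global Privacy Control proposal. The ConsentState shape and function names
// are illustrative only; they are not prescribed by ColoPA or any regulator.

interface ConsentState {
  saleOptOut: boolean;
  targetedAdvertisingOptOut: boolean;
}

// Returns true when the browser sent Sec-GPC: 1 with the request.
function hasGlobalPrivacyControl(headers: Record<string, string | undefined>): boolean {
  const value = headers["sec-gpc"] ?? headers["Sec-GPC"];
  return value !== undefined && value.trim() === "1";
}

// Layers the universal opt-out on top of any choices the consumer already made.
function applyUniversalOptOut(
  headers: Record<string, string | undefined>,
  existing: ConsentState
): ConsentState {
  if (!hasGlobalPrivacyControl(headers)) {
    return existing;
  }
  return { ...existing, saleOptOut: true, targetedAdvertisingOptOut: true };
}

// Example: a request carrying the GPC signal flips both opt-outs on.
const updated = applyUniversalOptOut(
  { "sec-gpc": "1" },
  { saleOptOut: false, targetedAdvertisingOptOut: false }
);
console.log(updated); // { saleOptOut: true, targetedAdvertisingOptOut: true }
```

The table below compares ColoPA with the VCDPA and CCPA across these and other key provisions.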
| | ColoPA | VCDPA | CCPA |
| --- | --- | --- | --- |
| Thresholds to Applicability | Conducts business in CO or produces products or services targeted to CO residents and (a) controls or processes personal data of at least 100,000 consumers; or (b) derives revenue or receives a discount on the price of goods or services from selling personal data and controls or processes personal data of at least 25,000 consumers | Conducts business in or produces products or services targeted to VA and (a) controls or processes personal data of at least 100,000 consumers; or (b) derives over 50% of gross revenue from the sale of personal data and processes or controls personal data of at least 25,000 consumers | Conducts business in CA and collects personal information of CA residents and (a) has $25 million or more in annual revenue for the preceding calendar year as of Jan. 1 of the calendar year; (b) annually buys, sells, or shares personal information of more than 100,000 consumers or households; or (c) earns more than 50% of its annual revenue from selling or sharing consumer personal information |
| Consent | Requires opt-in consent for processing sensitive personal data, including children’s data | Requires opt-in consent for processing sensitive personal data, and COPPA-compliant consent for processing children’s data | Requires opt-in consent for sharing PI for cross-context behavioral advertising for children under 16, including parental consent for children under 13 |
| Opt-Out | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for profiling, cross-contextual advertising, and sale; right to limit use and disclosure of sensitive personal information |
| Other Consumer Rights | Access, Deletion, Correction, Portability | Access, Deletion, Correction, Portability | Access, Deletion, Correction, Portability |
| Authorized Agents | Permitted for opt-out requests | N/A | Permitted for all requests |
| Appeals | Must create process for consumers to appeal refusal to act on consumer rights | Must create process for consumers to appeal refusal to act on consumer rights | N/A |
| Private Cause of Action | No | No | Yes, related to security breaches |
| Cure Period | 60 days, until the provision expires on Jan. 1, 2025 | 30 days | No |
| Data Protection Assessments | Required for targeted advertising, sale, sensitive data, certain profiling | Required for targeted advertising, sale, sensitive data, certain profiling | Annual cybersecurity audit and risk assessment requirements to be determined through regulations |
Given the significant overlap among the three privacy laws, companies subject to ColoPA should be able to leverage VCDPA and CCPA implementation efforts for ColoPA compliance. If ColoPA is any example, other state privacy efforts may not veer too far from the paths VCDPA and CCPA have forged. The key will be to closely monitor how CalPPA and the Colorado Attorney General address forthcoming regulations and whether they add new distinct approaches for each state. Check back on our blog for more privacy law updates.

* * *

Subscribe here to our Ad Law News and Views newsletter and visit the Advertising and Privacy Law Resource Center for update information on key legal topics relevant to advertising and marketing, privacy, data security, and consumer product safety and labeling.

Kelley Drye attorneys and industry experts provide timely insights on legal and regulatory issues that impact your business. Our thought leaders keep you updated through advisories and articles, blogs, newsletters, podcasts and resource centers. Sign up here to receive our email communications tailored to your interests.

Follow us on LinkedIn and Twitter for the latest updates.

Ad Law Access Podcast – State Privacy Laws: How We Got Here and Where We Are Headed https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ad-law-access-podcast-state-privacy-laws-how-we-got-here-and-where-we-are-headed https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ad-law-access-podcast-state-privacy-laws-how-we-got-here-and-where-we-are-headed Thu, 01 Apr 2021 07:35:36 -0400 Many states are considering comprehensive privacy legislation in the absence of a federal law. On another much anticipated episode of the Ad Law Access podcast, Alysa Hutnik and Aaron Burstein discuss pending state privacy legislation, how we got here, and some expected future legislation. Find the episode here or wherever you get your podcasts.

Contact:

Alysa Z. Hutnik [email protected]

Aaron Burstein [email protected]

For additional information, please visit:

Further Amendments to CCPA Regulations Are Approved and in Effect https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/californias-office-of-administrative-law-approved-further-revisions-to-the-attorney-generals-ccpa-regulations-on-march-15-2021 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/californias-office-of-administrative-law-approved-further-revisions-to-the-attorney-generals-ccpa-regulations-on-march-15-2021 Wed, 17 Mar 2021 10:22:46 -0400 California’s Office of Administrative Law approved further revisions to the Attorney General’s CCPA regulations on March 15, 2021. The revisions went into effect upon approval. In substance, the revisions are identical to the fourth set of modifications the Attorney General proposed on December 10, 2020, and make the following changes:

(1) Notice for Sale of PI Collected Offline: Businesses that sell personal information collected offline must provide an offline notice by means such as providing paper copies or posting signs in a store, or giving an oral notice if collecting personal information over the phone.

(2) Opt-Out Icon: The revised regulations provide that businesses may use an opt-out icon in addition to, but not in lieu of, notice of a right to opt out or a “Do Not Sell My Personal Information” link.

(3) Do Not Sell Requests: A “Do Not Sell” request must “be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out.” The change prohibits businesses from using any method that is designed to or would have the effect of preventing a consumer from opting out. The revised regulation offers examples of prohibited opt-out practices, which include requiring a consumer to: (A) complete more steps to opt out than to re-opt in after a consumer had previously opted out; (B) provide personal information that is not necessary to implement the opt-out request; and (C) read through a list of reasons why he or she shouldn’t opt out before confirming the request.

(4) Consumer Requests from Authorized Agents: A business may now require an authorized agent who submits a request to know or delete to provide proof that the consumer gave the agent signed permission to submit a request. The regulations also preserve the options business previously had of requiring the consumer to verify their identity directly to the business or directly confirming that they provided the authorized agent permission to submit the request.

(5) Children’s Information: The addition of the word “or” in section 999.332 requires businesses that sell personal information of children under the age of 13 “and/or” between the ages of 13 and 15 to describe in their privacy policies how to make an opt-in to sale requests.

We will continue to monitor closely further developments in CCPA regulations.


Kelley Drye attorneys and industry experts provide timely insights on legal and regulatory issues that impact your business. Our thought leaders keep you updated through advisories and articles, blogs, newsletters, podcasts and resource centers. Sign up here to receive our email communications tailored to your interests.

Follow us on LinkedIn and Twitter for the latest updates.

Two’s Company: Virginia Has a Comprehensive Data Privacy Law https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/virginia-has-a-comprehensive-data-privacy-law https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/virginia-has-a-comprehensive-data-privacy-law Wed, 03 Mar 2021 12:38:22 -0500

On March 2, Governor Ralph Northam signed the Virginia Consumer Data Protection Act (VCDPA) into law, making Virginia the second state to enact comprehensive privacy legislation.

With the VCDPA on the books, companies have the next 22 months to prepare for the VCDPA and the California Privacy Rights Act (CPRA) to go into effect. This post takes a look at the VCDPA provisions that are novel and require close attention during the transition period to the law’s January 1, 2023 effective date.

  • Sensitive Data: The VCDPA breaks new ground in U.S. privacy law by requiring consent to process “sensitive data” – a term that includes precise geolocation data; genetic or biometric data used to identify a person; and data revealing race or ethnicity, religious beliefs, health diagnosis, sexual orientation, or citizenship or immigration status. The definition of “consent,” in turn, tracks the GDPR definition: “freely given, specific, informed, and unambiguous” and conveyed by a “clear affirmative act.” (A short illustrative sketch of gating processing on this kind of consent appears after this list.)
  • Opt-Outs: Controllers will need to offer opt-outs under three distinct circumstances. In addition to an opt-out of sale (which is limited to exchanging personal data for monetary consideration), controllers must allow consumers to opt out of (1) “targeted advertising” and (2) “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” Although the definition of “targeted advertising” excludes ads “based on activities within a controller’s own websites or online applications” (which also appears to include affiliates’ websites), it is unclear whether this exclusion encompasses, for example, a controller’s use of third-party data sources combined with its first party data to inform its targeting decisions. As a result, VCDPA opt-outs could have a significant impact on first-party data use as well as third-party data sharing. Notably though, the law clearly excludes from the targeted advertising definition the processing of personal data solely for measuring or reporting advertising performance, reach, or frequency.
  • Principles for Data Controllers: Section 59.1-574 articulates several broad, principles-based obligations of data controllers, including reasonable security and a duty to limit personal data collection to what is “adequate, relevant, and reasonably necessary” to fulfill purposes that have been disclosed to consumers. Companies have gained experience with similar principles under the GDPR and federal and state reasonable security requirements, but their inclusion in comprehensive privacy legislation that provides civil penalties of up to $7,500 per violation counsels in favor of taking a close look at how to demonstrate a thoughtful, well-reasoned approach to data strategies.
  • Data Protection Assessments: Controllers will need to conduct data protection assessments not only for high-risk activities but also for targeted advertising, profiling, personal data sales, and sensitive personal data processing. These assessments will be fair game for the Attorney General in any investigation of a controller’s compliance with the data protection principles and transparency requirements of section 59.1-574, though the VCDPA purports to preserve attorney-client privilege and work product protection for assessments submitted in response to a civil investigative demand. The affirmative obligation to conduct such assessments does not begin until January 2023.
  • In and Out of Scope: The Virginia law focuses on Virginia residents in their capacity as consumers, and expressly excludes a person acting in an employment or commercial (B2B) capacity. The law also excludes GLBA-covered financial institutions and financial personal information, FCRA-covered information, HIPAA covered entities and their business associates, non-profits, and higher education. Publicly available information is also outside the scope of regulated personal information; this exclusion extends to data from publicly available government records and data lawfully made available to the general public.
  • No Private Right of Action: The Virginia law gives the Attorney General exclusive authority to enforce the law and provides that it may not be used as a basis to bring a private suit under the act or any other law. However, as we’ve seen with the CCPA, that type of restriction has not stopped parties from pursuing creative ways to bring private actions for privacy violations, including under other provisions of state law, such as unfair and deceptive trade practice statutes.
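On the sensitive data point in the first bullet above, one common way to operationalize an opt-in consent requirement is to gate processing on a recorded, category-level consent. The sketch below is a minimal illustration only; the category names and the ConsentRecord shape are assumptions made for the example rather than the VCDPA’s statutory definitions, and this is not legal advice.

```typescript
// A minimal sketch of gating processing on recorded consent for sensitive data
// categories like those listed above. Category names and the ConsentRecord
// shape are illustrative assumptions, not statutory definitions.

type DataCategory =
  | "precise_geolocation"
  | "genetic_or_biometric_identifiers"
  | "racial_or_ethnic_origin"
  | "religious_beliefs"
  | "health_diagnosis"
  | "sexual_orientation"
  | "citizenship_or_immigration_status"
  | "contact_info"; // non-sensitive example

const SENSITIVE: ReadonlySet<DataCategory> = new Set<DataCategory>([
  "precise_geolocation",
  "genetic_or_biometric_identifiers",
  "racial_or_ethnic_origin",
  "religious_beliefs",
  "health_diagnosis",
  "sexual_orientation",
  "citizenship_or_immigration_status",
]);

interface ConsentRecord {
  // Tracks a "clear affirmative act" per category, with a timestamp for audit.
  categories: Partial<Record<DataCategory, { grantedAt: string }>>;
}

// Returns the sensitive categories that still need consent before processing proceeds.
function missingConsent(requested: DataCategory[], consent: ConsentRecord): DataCategory[] {
  return requested.filter(
    (c) => SENSITIVE.has(c) && consent.categories[c] === undefined
  );
}

// Example: health data requires consent; contact info does not.
const gaps = missingConsent(
  ["contact_info", "health_diagnosis"],
  { categories: { precise_geolocation: { grantedAt: "2022-11-01T12:00:00Z" } } }
);
console.log(gaps); // ["health_diagnosis"]
```

In practice, the same gate would sit in front of any pipeline that touches the sensitive categories, with the consent record retained to support the data protection assessments discussed above.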

For better or worse, companies will need to prepare for the VCDPA without an obvious prospect of additional regulatory guidance. Unlike the regulatory structure the CCPA established (and the CPRA significantly expands), Virginia’s privacy law does not provide any state agency or official with rulemaking authority. However, the VCDPA could be just a first step. Governor Northam reportedly “will have an ongoing work group to continue to strengthen the law’s consumer protections,” and Virginia Delegate Cliff Hayes, who introduced the House version of the law, signaled that legislators are open to making such changes. It remains to be seen whether this group will recommend allocating additional funding to the Attorney General’s office to enforce the law, and what type of enforcement we may see. Historically, the office has not been as active as other state attorneys general on consumer protection related matters outside of a fraud context.

We will watch closely for changes in Virginia and progress in other state privacy bills.

Webinar Replay: Futureproofing Privacy Programs https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/webinar-replay-futureproofing-privacy-programs https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/webinar-replay-futureproofing-privacy-programs Wed, 14 Oct 2020 09:13:24 -0400 The replay for our October 13, 2020 Futureproofing Privacy Programs webinar is available here.

Building a successful privacy program requires much more than compliance with data protection laws. To thrive in today’s global, data-driven environment, companies also need to understand the political environment and public attitudes surrounding privacy in the countries in which they operate. Of course, companies must anticipate and adapt to changing privacy regulations as well. This webinar presented strategies to help meet these challenges, with a focus on setting up structures to join local awareness with global compliance approaches.

The webinar featured Kelley Drye attorney Aaron Burstein, along with Abigail Dubiniecki and Kris Klein of nNovation LLP.

To view the webinar recording, click here.

Subscribe to the Ad Law Access blog to receive realtime updates on privacy and other related matters.

The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.

Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.


Webinar Replay: California Consumer Privacy Act (CCPA) for Procrastinators https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/the-replay-for-our-july-30-2020-ccpa-for-procrastinators-webinar-is-available-here https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/the-replay-for-our-july-30-2020-ccpa-for-procrastinators-webinar-is-available-here Fri, 31 Jul 2020 16:33:17 -0400 The replay for our July 30, 2020 California Consumer Privacy Act (CCPA) for Procrastinators: What You Need To Do Now If You Haven’t Done Anything Yet webinar is available here.

The coronavirus pandemic has put many things on hold, but CCPA enforcement is not one of them. The California Attorney General’s enforcement authority kicked in on July 1, 2020, and companies reportedly have begun to receive notices of alleged violation. In addition, several class actions have brought CCPA claims. Although final regulations to implement the CCPA have yet to be approved, compliance cannot wait. If you’re not yet on the road to CCPA compliance (or would like a refresher), this webinar is for you. We covered:
  • Latest CCPA developments
  • Compliance strategies
  • Potential changes to the CCPA if the California Privacy Rights Act (CPRA) ballot initiative passes
Anyone who has not begun their CCPA compliance efforts or thinks they need a refresher should watch this webinar. To view the presentation slides, click here. To view the webinar recording, click here. Subscribe to our Ad Law News and Views newsletter to receive information on our next round of webinars and to stay current on advertising and privacy matters. Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.