While the Connecticut Unfair Trade Practices Act (CUTPA, Connecticut’s UDAP law) is broad and robust, in the privacy and cybersecurity space the AG has additional authority derived from specific state laws such as the Data Breach Notification law and Connecticut’s Data Privacy Act (CTDPA). General Tong noted that Connecticut’s dedication to enforcing consumer protection as it relates to privacy traces back to at least 2011, when it became the first state to create a Privacy Task Force, followed by a standalone Privacy Section in 2015.
Enforcing the CTDPA
AG Tong noted that the CTDPA reflects a “philosophical judgment of Connecticut to return rights and power of authority to consumers regarding their Personal Information.” As we have previously reported, the CTDPA provides for several rights such as the right to access, right to portability, right to correct mistakes, right to deletion, and the right to opt out of targeted advertising, sale, and profiling of personal data.
The CTDPA also creates obligations for “controllers” which are entities that alone or jointly determine the purpose and means of processing of personal data. Some of these obligations include: minimizing data collection and storage, providing transparency about the types of data collected and why, ensuring that data is secure, and obtaining consent to process sensitive data. Notably, the CTDPA also provides heightened protections for data related to teenagers, a hot topic for State AGs. Controllers must obtain consent to sell teens’ data or conduct targeted advertising to teens.
The Connecticut AG has exclusive authority to enforce the CTDPA’s provisions, making the office’s insights all the more valuable. However, the law provides for a cure period: if the AG’s office becomes aware of a potential violation and determines that a cure is possible, the office will reach out to the entity and issue a notice of violation. If the controller fails to cure within sixty (60) days, the AG may bring an action against the entity. As with the data breach notification law discussed below, a violation of the CTDPA is a per se violation of CUTPA.
Connecticut AG’s Advice: How to Prepare for Compliance with the CTDPA
With the CTDPA’s effective date quickly arriving on July 1, 2023, the Connecticut AG’s office offered its own recommendations on how to prepare for compliance with the new law:
Safeguards and Data Breach Notice Laws
The Connecticut Safeguards Law, referred to by the office as the basic building blocks of Connecticut’s privacy infrastructure, requires any person in possession of Personal Information (PI) to safeguard the data against misuse by third parties and to destroy, erase, or make the data unreadable prior to disposal. Penalties under the Safeguards Law can be significant: up to $500 per intentional violation and up to $500,000 for a single event.
Connecticut defines PI as information capable of being associated with a particular individual through one or more identifiers. The AG’s office noted that PI is broadly defined. For instance, PI includes a person’s name, but also covers other identifiers including social security numbers, driver’s license numbers, credit/debit card numbers, passport numbers, biometric information, online account credentials, and certain medical information.
Connecticut’s Breach Notification Law requires an entity that experiences a data breach to provide notice to the Connecticut AG without “unreasonable delay,” and in any event within 60 days. The law also requires the entity to provide two years of identity theft prevention services if Social Security numbers or taxpayer identification numbers (ITINs) are compromised. A violation of this law is a per se violation of CUTPA. Last year, Connecticut received over 1,500 data breach notifications, and the office is experienced in reviewing all types of data breaches and determining which ones warrant further attention.
Our Take
Connecticut has consistently been a leader on data security and privacy issues over the last decade, and with the passage of the CTDPA we expect to see the office double down on enforcement efforts. Businesses should pay particular attention to the compliance tips highlighted above by Ms. Lucan and General Tong, as there is little doubt the office will be actively looking for targets right out of the gate on July 1. In General Tong’s words, “data privacy and the law of data privacy are here. Its obligations are here, present, and they are demanding.” Privacy laws cannot be treated as “optional” or “too cumbersome” as an excuse to forgo precautions and the management of risks that come with collecting data. “Law enforcement will take action where we believe people have failed to meet their obligations under the law,” as that is what people in the state of Connecticut “expect and demand.”
Given Connecticut’s leadership in the multistate Attorney General community, we would not be surprised to see other states joining Connecticut in enforcement efforts, even without a comprehensive privacy law (relying on their UDAP authority as states have done for decades). Understanding your data collection and security practices is more important than ever.
***
Be sure to look out for Part II of this blogpost, where we will discuss Connecticut’s UDAP law in more detail, as well as the priorities and additional tools the Connecticut AG’s office uses to enforce consumer protection laws. We also have an exciting blogpost recapping our conversation with the Nebraska Attorney General just around the bend. Stay tuned.
Of note, Senators Blumenthal and Blackburn discussed the Kids Online Safety Act (KOSA) (their bill from last year, just re-introduced), which would impose a “duty of care” on tech companies and shield young people from harmful content. Senator Hawley, in turn, talked up his Making Age-Verification Technology Uniform, Robust, and Effective Act (MATURE Act), which would enforce a minimum age requirement of 16 for users of social media platforms. (As noted below, panelists were quite skeptical that this would work.)
The event highlighted, once again, the bipartisan interest in tackling the harms that minors face online. Here’s more detail on what happened:
First up, opening remarks from Chairman Durbin (D-Ill.), Ranking Member Graham (R-S.C.), and Senators Blumenthal (D-Conn.) and Blackburn (R-Tenn.)
Chairman Durbin kicked off the hearing by explaining that the internet and social media have become a threat to young people. He noted that while the Internet offers tremendous benefits, cyberbullies can hurt kids online via platforms like Facebook and Snapchat. Durbin stated that “we don’t have to take” the lucrative business that the platforms (which were not in attendance) have created to keep kids’ eyes glued to the screens. He said that the addictive nature of the platforms has created a mental health crisis – causing anxiety, stress, and body image issues, for example – which can lead to tragic results.
Sen. Graham announced that he and Sen. Warren (D-Mass.) are working on a bipartisan bill to create a Digital Regulatory Commission with the power to shut down websites that don’t engage in “best business practices” to protect children from sexual exploitation online. He also expressed concern about the lack of regulatory oversight for the abuses of social media.
Sen. Blumenthal promoted KOSA, and also committed to major reform of Section 230. Sen. Blackburn echoed these sentiments, describing social media as the Wild West: the kids are the product, there are very few rules, and data is taken and sold to advertisers. Blackburn added that she wants social media to be safer by default, not something that becomes safer after a consumer takes additional steps. She also said that she supports transparent audits of company practices.
Next, the Statements from Witnesses
Kristin Bride is a Survivor Parent and Social Media Reform Advocate whose son took his own life after being cyberbullied via anonymous messaging apps on Snapchat. She explained how she was ignored when she reached out to the apps for help in learning the bullies’ identities. Then, when she filed a class action against Snap, it was dismissed due to Section 230 immunity. While Snap did remove the anonymous messaging apps after she filed the class action, she has seen new apps pop up that charge children to reveal the identities of those sending harmful messages.
Emma Lembke, a sophomore in college and the founder of the Log Off movement, expressed frustration with being a passive victim of big tech, saying that social media led her to disordered eating. She also stressed the importance of including young people in efforts to effect change.
Michelle DeLaune is President and CEO of the National Center for Missing & Exploited Children. One of her chief concerns is companies’ use of end-to-end encryption. Encryption allows people to send messages that platforms cannot read or flag for harmful content. She described this as “turning off the lights” on content that exploits children.
John Pizzuro is the CEO of Raven and a former Commander of the Internet Crimes Against Children Department of the New Jersey State Police. He explained that police are overburdened with cybercrime reports, forcing them to become far more reactive and less proactive in protecting children online.
Dr. Mitch Prinstein, Chief Science Officer at the American Psychological Association, testified that many social media apps are directed to children. He said that the average teen picks up their phone over 100 times per day and spends over eight hours per day on social media, developing a clinical dependency. According to Prinstein, social media stunts kids’ ability to develop healthy relationships; increases loneliness, stress, anxiety, and exposure to hateful content; and causes lack of sleep. He supports more federal funding for research on the effects of social media, and believes that manipulating children and using their data should be illegal.
Finally, Josh Golin, Executive Director of Fairplay, said he supports policies to make the internet safe, non-exploitative, and free of Big Tech. He said that digital platforms are designed to maximize engagement because companies make more money the longer kids spend online. They use manipulative design and relentless pressure to entice kids to use the platforms as often as possible, thereby profiting from targeted ads. Golin supports limits on data collection; banning surveillance advertising based on teens’ vulnerabilities; holding platforms liable for design choices that affect young people; and requiring transparency for algorithms.
Questions from the Committee
Committee members asked an assortment of questions, some related to kids’ privacy and others related to tech issues more broadly.
Sen. Durbin highlighted Section 230 immunity, which last year’s EARN IT Act would amend. Sen. Whitehouse (D-R.I.) also said he supported Section 230 reform and that he wants to see class actions like Ms. Bride’s be allowed to continue, rather than dismissed on immunity grounds. Sen. Hirono (D-Hawaii) cautioned against a wholesale repeal of Section 230 and stressed that any reform should be done carefully. Sen. Graham reiterated his support for a Digital Regulatory Commission, while also noting that repeal or reform of Section 230 would be a step in the right direction.
Others, such as Sens. Lee (R-Utah) and Coons (D-Del.), asked questions about the complaints received by the National Center for Missing & Exploited Children, and said they support better research on the design choices of social media. Sen. Coons added that he supports new mandates for platforms, including a duty of care; limits on data collection from kids; and disclosures regarding how they manage content. (As to the latter, see Coons’ bill from last year here.)
Sen. Blumenthal echoed the call for further research on social media, including as to the role that it plays in, on the one hand, harming members of the LGBTQ+ community, and, on the other, providing this community with access to important information and connections. Meanwhile, Sens. Blackburn and Grassley (R-Iowa) mentioned the link between social media and drug-overdose deaths, and Sen. Ossoff (D-Ga.) mentioned his legislation, with Grassley, to address child exploitation.
Rounding out the discussion, Sen. Cornyn (R-Texas) described the online landscape as “designed” to “hoover up” children’s data and stated that he supports legislation to “attack the business model.” Sen. Klobuchar (D-Minn.) said she supports imposing a duty of care on companies, as well as requirements to stop companies from pushing harmful content in response to innocent queries (for example, serving content related to disordered eating in response to a search for “healthy food”). Sen. Welch (D-Vt.) said he wants to focus on companies that target more clicks to obtain more ad revenue.
Finally, Sens. Kennedy (R-La.) and Hawley both pushed for legislation that would ban children under 16 from using social media. The witnesses generally agreed that this is unrealistic.
* * *
That’s our snapshot of the hearing. The question now is whether Congress will move forward on kids’ privacy legislation and succeed in 2023 where it fell short in 2022. Some related questions are:
Stay tuned as we continue to track these and other developments related to kids’ privacy.
As workforces become increasingly mobile and remote work is more the norm, employers face the challenge of balancing the protection of their employees’ personal data and privacy against the need to collect and process personal data to recruit, support and monitor their workforces. Mounting regulations attempt to curb employers’ ability to gather and utilize employee data—from its historical use in processing employee benefits and leave requests to employers’ collection, use or retention of employees’ biometric data to ensure the security of the organization’s financial or other sensitive information systems. Learn what employers can do now to protect employee data and prepare for the growing wave of data privacy laws impacting the collection and use of employee personal data.
Avoiding Price Gouging Claims
Wednesday, August 3
Recently, State Attorneys General, the House Judiciary Committee, and many others have weighed in on rising prices in an attempt to weed out price gouging and other forms of what they deem “corporate profiteering.” State and federal regulators are looking carefully at pricing as consumers and constituents become more sensitive to the latest changes, and price gouging enforcement is an avenue states may be able to use to appease the public. Unlike past emergencies, the current state of supply chain and labor shortages, along with skyrocketing costs for businesses, makes it unrealistic for companies to simply put a freeze on any price increases. This webinar will cover:
• The basics of price gouging laws and related state emergency declarations and how to comply
• The differences and varied complexities in state laws
• General best practice tips
• How AGs prioritize enforcement
* * *
Find more upcoming sessions, links to replays and more here
As discussed in State Attorneys General 101, State Attorneys General are the primary enforcers of consumer protection laws within their state and hold sweeping powers to protect the public they serve by launching investigations and litigation alone or in multi-state actions involving numerous states and territories across the country.
As requested by many, please join Kelley Drye’s State Attorneys General practice Co-Chair Paul Singer and Senior Associate Beth Chun for State Attorneys General 102. This short 30-minute webinar picks up where we left off and answers a number of questions regarding:
Find more upcoming sessions, links to replays and more here
But let’s unpack the surprises in the draft regulations. The 66-page draft proposed CCPA regulations (and they are referred to within the document as CCPA regulations) take a prescriptive approach to privacy obligations. In concept, that is not too surprising. More concerning, in some areas they depart sharply from the approaches set forth in other state privacy laws. The quiet release of dramatic new obligations, at a time when bipartisan Senators reportedly may be reaching consensus on federal privacy legislation that could preempt state law obligations, puts companies doing business in California in a difficult position. Do they scramble to operationalize new programs to comply with the CPPA’s new requirements, if finalized? Do they wait on Congress? Do they choose a third path? For now, while these draft rules are certain to change in some respects before they are finalized, they directionally outline a new privacy baseline for the United States. We highlight certain aspects of the draft rules below, with a particular focus on accountability and risk exposure, how data can be shared with other businesses for digital advertising or other functions, and what business agreements must include to lawfully support such relationships and comply with the amended CCPA.
Quick and Costly Potential CPPA Enforcement
Consumers, the CPPA, and the California Attorney General’s Office are all empowered to take businesses (and contractors, service providers, and third parties) to task for perceived non-compliance with privacy obligations. Among all of the proposed changes in the draft regulations, the enforcement provisions should cause many companies, regardless of their role, to pause and evaluate whether they have allocated sufficient resources to privacy compliance. While the CCPA/CPRA does not create a privacy private right of action, the draft rules set forth a new, expanded, and fast-tracked form of compliance monitoring and enforcement that could surprise many companies and prove costly.
First, while there are provisions requiring consumers to file sworn complaints, the CPPA provides that it can also accept, and initiate investigations based on, unsworn and anonymous complaints. For every sworn complaint, the CPPA must notify the complainant in writing of the actions the Agency has taken or plans to take and the reasons for action or inaction. Because the Agency must respond to every complaint, this could become a routinized process in which a high volume of complaints is forwarded to businesses with tight timeframes to respond in writing, on pain of violations and administrative fines.
The rules provide that there is “probable cause” of a privacy violation if “the evidence supports a reasonable belief that the CCPA has been violated.” Under the statute, the CPPA can find a violation through a probable cause hearing if it provides notice by service of process or registered mail with return receipt to the company “at least 30 days prior to the Agency's consideration of the alleged violation.” The notice must contain a summary of the evidence and inform the company of its right to be present “in person and represented by counsel.” The “notice” clock starts as of the date of service, the date the registered mail receipt is signed, or, if the registered mail receipt is not signed, the date returned by the post office. There is no mention of extensions of time for good faith reasons, and it is possible this process occurs through the forwarding of unverified consumer complaints.
Under the draft rules, a company can request the proceeding be made public if they make a written request at least 10 business days before the proceeding. A company has a right to an in-person proceeding only if it requests the proceeding be made public. Otherwise, the proceeding may be conducted in whole or in part by telephone or video closed to the public. Participants are limited to the company representative, legal counsel, and CPPA enforcement staff. The CPPA serves as prosecutor and arbiter, and the draft rules do not define how the agency preserves its neutrality in its latter role.
The CPPA makes a determination of probable cause at such proceeding “based on the probable cause notice and any information or arguments presented at the probable cause proceeding by the parties.” If a company does not participate or appear, it waives “the right to further probable cause proceedings” (it is not clear in the draft rules whether that waiver is limited to the facts of that matter or extends to future alleged violations), and a decision can be made on the information provided to the CPPA (such as through a complainant).
The CPPA then issues a written decision and notifies the company electronically or by mail. Of concern, the draft rules provide that this determination “is final and not subject to appeal.” Under the statute, violations can result in an administrative fine of up to $2,500 for each violation, and up to $7,500 for each intentional violation or violation involving minors. Multiple parties involved can be held jointly and severally liable. Violations may conceivably be calculated on any number of factors that could add up substantially, and, as contemplated by these draft rules, there is no process to challenge such judgments, even where there are factual or legal disputes. One can imagine future legal proceedings challenging a variety of the legal bases for such a structure if these rules are finalized as drafted.
Service Provider Requirements and Restrictions
Data Privacy Addendums Get a Further Tune-Up, and an Open Question on Whether They Need to Be Bespoke. One aspect of state privacy law compliance that has consumed significant resources and time is service provider contracts. Who is a service provider? What must the contract say? What restrictions apply to service providers (or contractors)? The draft rules continue to add more obligations.
An entity must have a written contract in place that meets all of the requirements outlined below to even qualify as a service provider or contractor. The contract requirements are very granular, go beyond what most current privacy addendums (or technology provider terms and conditions) look like today, and include:
The Limitations on Internal Use of Customer Data by a Service Provider/Contractor. The draft rules provide that a service provider/contractor is restricted from using customer personal data for its own purposes, except for internal use to build or improve the quality of its services, provided that the service provider/contractor does not use the personal information to perform services on behalf of another person in a manner not permitted under the CCPA. This language is notably different from the governing CCPA rules. The examples outlined below, together with the admonition above that the service provider cannot combine or update personal information received from another source unless permitted by the CCPA, leave it ambiguous when updating personal information crosses the line. The examples suggest that where such functions are meant to facilitate personalized advertising or data sales, they would not fit within a service provider/contractor role.
Use for Analysis/Data Hygiene (Sometimes). The draft rules set forth two examples that seem to allow some analysis and data correction under particular circumstances. The first illustration emphasizes that a service provider/contractor can analyze how a business customer’s consumers interact with company communications in order to improve its overall services; the second highlights that a service provider/contractor can use customer data to identify and fix incorrect personal information and, as a result, improve services to others. The draft rules underscore, however, that a service provider/contractor could not compile (e.g., enrich/append) personal information for the purpose of sending advertising to another business or to sell such personal information.
Data Security/Fraud Prevention. Consistent with the statute, the draft rules allow service providers/contractors to use and combine customer personal information “[t]o detect data security incidents or protect against malicious, deceptive, fraudulent or illegal activity.”
Other Legal Purposes. The draft rules acknowledge that a service provider/contractor can use customer data to comply with other laws, lawful process, to defend claims, if the data is deidentified or aggregated, or does not include California personal information.
Advertising Service Provider Functions Look Limited. The draft rules acknowledge a business can engage a service provider/contractor for advertising/marketing services if the services do not combine opted out consumer data from other sources. The draft rules also affirmatively reiterate that an entity who provides cross-contextual behavioral advertising is a third party and not a service provider/contractor.
Notice at Collection. The draft rules include new language that, in the context of “notice at collection,” provides that when more than one party controls personal information collection, such as in connection with digital advertising, all such parties must provide a very detailed “notice at collection” that accounts for all parties’ business practices. As an example:
Honoring Opt Outs. Section 7051 provides that third parties are directly obligated to honor opt outs, including as conveyed through a global privacy signal or otherwise on a first-party business’s site hosting the third party’s tag collecting personal information, unless the first-party business informs the third party that the consumer has consented to the sale/sharing, or “the third party becomes a service provider or contractor that complies with the CCPA and these regulations.”
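To make the signal-handling mechanics concrete, below is a minimal, hypothetical sketch of how a third party’s tag server might detect a global privacy signal. It assumes the Global Privacy Control (GPC) specification, under which participating browsers send a Sec-GPC: 1 request header; the Express-style handler, function names, and suppression logic are our own illustrative assumptions, not requirements drawn from the draft rules.

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Hypothetical middleware: flag any request carrying a global privacy signal.
// Under the GPC proposal, participating browsers send the header "Sec-GPC: 1".
function detectOptOutSignal(req: Request, res: Response, next: NextFunction): void {
  // Treat the signal as a request to opt out of sale/sharing unless the
  // first-party business has conveyed that this consumer consented.
  res.locals.optedOut = req.header("Sec-GPC") === "1";
  next();
}

app.use(detectOptOutSignal);

// Illustrative third-party tag endpoint embedded on a first-party site.
app.get("/pixel", (req: Request, res: Response) => {
  if (res.locals.optedOut) {
    // Suppress any sale/sharing-related processing for this request.
    res.status(204).end();
    return;
  }
  // ...normal tag logic would run here...
  res.status(200).send("ok");
});

app.listen(3000);
```

In practice, a third party would also need a contractual or technical channel through which the first-party business can convey a consumer’s consent to sale/sharing, or would need to qualify as a service provider or contractor, since those are the exceptions the draft rules describe.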
* * *
There is a lot to consider, and while all of these provisions remain subject to further changes, it is clear that the draft rules suggest a more exacting expectation of privacy compliance by companies doing business in California or otherwise with California residents, and an expansive new set of obligations to tighten such compliance within the information supply chain. We will cover in future blog posts how these draft rules address other business obligations, including obligations around obtaining consent, privacy policies, responses to consumer privacy rights, the use of sensitive personal information, and the mechanics of complying with opt-outs of sales/shares and global privacy controls. If you are interested in submitting comments in the rulemaking process or have questions about privacy compliance, please reach out to members of Kelley Drye’s privacy team.

JOIN US
Separately, join us as Kelley Drye privacy lawyers provide observations on the proposed regulations, including those that would pose the biggest challenges for businesses if implemented, and offer strategies to plan efficiently for compliance in the face of these proposals. Register here.
Protecting the privacy and safety of kids and teens online is receiving enormous attention lately from Congress, the States, the FTC, and even the White House. Further, just last month, BBB National Programs unveiled a Teenage Privacy Program Roadmap offering a comprehensive framework for companies to use in identifying and avoiding online harms impacting teens.
Amidst these developments, Kelley Drye held a webinar to discuss the unique challenges associated with teen privacy. Dona J. Fraser, Senior Vice President, Privacy Initiatives, BBB National Programs, and Claire Quinn, Chief Privacy Officer, PRIVO, along with Kelley Drye’s Laura Riposo VanDruff, provided an update on key concerns and developments related to teen privacy, as well as practical tips for companies seeking to address these issues.
To view the webinar recording, click here or view it on the new Ad Law Access App.
Subscribe to the Ad Law Access blog to receive real-time updates on privacy and other related matters.
The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.
Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.
For easy access to all of our webinars, posts and podcasts, download our new Ad Law Access App.
In the absence of a federal privacy law, privacy has been at the forefront of many states’ legislative sessions this year. Against this backdrop, state attorneys general continue to initiate investigations into companies’ privacy practices, and state agencies continue to advance privacy rulemakings under existing law. Aaron Burstein, Laura VanDruff and Paul Singer presented this webinar to help attendees learn about the latest developments in state privacy law, make sense of these developments and understand their practical impact.
To view the webinar recording, click here or view it on the new Ad Law Access App.
As we’ve discussed here, there’s bipartisan momentum in Congress to enact stronger privacy protections for kids and teens – and specifically, tools that would enable minors and their parents to limit algorithms and online content that fuel self-harm and addictive behaviors. These efforts, reflected in several federal bills (see here and here) and now in a California bill too, build on months of testimony by a social media insider and are modeled in large part on the UK’s Age Appropriate Design Code.
In his State of the Union address, the President added to this momentum, calling on Congress to enact stronger protection for kids – a move that was heralded in the media as a potential “game changer” for privacy that could “help clear the logjam on Capitol Hill.” (Relatedly, report language accompanying the recently signed budget bill directs the FTC to prioritize kids’ privacy in its enforcement efforts.)
It’s certainly understandable why U.S. policymakers would want to protect the privacy and safety of minors. It’s also notable that they are focusing on an area where bipartisan action might be possible and emphasizing the safety aspects of these bills (as if the word “privacy” would jinx the effort while “safety” might garner more support). But, looking past the good intentions to protect kids, some of the concepts and language in these bills pose real challenges as to clarity and enforceability.
Focusing on just a few:
Certainly, the concerns driving these efforts – the harmful effects of social media on minors – are serious ones. They also unite members from different political parties, which is always a welcome development. However, as policymakers and stakeholders study these bills, they will likely (or hopefully) realize just how difficult implementation would be, sending them back to the drawing board for another try. Or maybe they will ultimately conclude that comprehensive privacy legislation is still the better approach.
As companies wait to see whether the Utah Consumer Privacy Act (UCPA) becomes the fourth comprehensive state privacy law, we are providing an overview of some of the Act’s key provisions – and how they depart from comprehensive privacy laws in California, Colorado, and Virginia.
Utah’s Senate unanimously passed the UCPA on February 25. The House – also through a unanimous vote – followed on March 2. The Legislature sent the UCPA to Governor Spencer Cox on March 15. Because the Legislature adjourned on March 4, Governor Cox has 20 days from the date of adjournment – March 24 – to sign or veto the Act. If Governor Cox takes no action, the UCPA will become law, with an effective date of December 31, 2023.
In broad strokes, the UCPA is similar to the Virginia Consumer Data Protection Act (VCDPA) and Colorado Privacy Act (CPA). And, like the laws in Colorado and Virginia, the UCPA borrows some concepts from the CCPA – including a version of the right to opt out of the “sale” of personal data.
However, the UCPA pares back important features of all three of these laws. Some of the significant changes include:
|  | Colorado Privacy Act (CPA) | Virginia Consumer Data Protection Act (VCDPA) | California Consumer Privacy Act (CCPA as amended by CPRA) | Utah Consumer Privacy Act (UCPA) |
| --- | --- | --- | --- | --- |
| Thresholds to Applicability | Applies to a controller that (1) conducts business in CO or produces products or services targeted to CO residents and (2) meets either of these thresholds: (a) controls or processes personal data of at least 100,000 consumers in a calendar year; or (b) derives revenue or receives a discount on the price of goods or services from selling personal data and controls or processes personal data of at least 25,000 consumers | Applies to a person that (1) conducts business in VA or produces products or services targeted to VA residents; and (2) meets either of these thresholds: (a) controls or processes personal data of at least 100,000 consumers; or (b) controls or processes personal data of at least 25,000 consumers and derives over 50% of gross revenue from the sale of personal data | A “business” that (1) conducts business in CA and collects personal information of CA residents; and (2) (a) has $25 million or more in annual revenue for the preceding calendar year as of Jan. 1 of the calendar year; (b) annually buys, sells, or shares personal data of more than 100,000 consumers or households; or (c) earns more than 50% of its annual revenue from selling or sharing consumer personal information | A controller or processor that (1) conducts business in Utah or targets products or services to UT residents; (2) has $25 million or more in annual revenue; and (3) satisfies one of these thresholds: (a) during a calendar year, controls or processes personal data of 100,000 or more consumers; or (b) derives over 50% of gross revenue from the sale of personal data and controls or processes personal data of 25,000 or more consumers |
| Opt-in Consent | Opt-in consent required to process sensitive data | Opt-in consent required to process sensitive data | Opt-in consent required to sell or “share” personal information of minors under age 16 | Not required for sensitive data (unless the data concerns a known child, and parental consent is required under COPPA) |
| Opt-Out | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for profiling, cross-contextual advertising, and sale; right to limit use and disclosure of sensitive personal information | Required for targeted advertising and sales |
| Other Consumer Rights | Access, Portability, Deletion, Correction | Access, Portability, Deletion, Correction | Access, Deletion, Correction, Portability | Access, Portability, and Deletion |
| Authorized Agents | Permitted for opt-out requests | N/A | Permitted for all consumer rights requests | N/A |
| Appeals | Must create process for consumers to appeal refusal to act on consumer rights | Must create process for consumers to appeal refusal to act on consumer rights | N/A | N/A |
| Private Right of Action | No | No | Yes, for security breaches involving certain types of sensitive personal information | No |
| Cure Period | 60 days until provision expires on Jan. 1, 2025 | 30 days | 30-day cure period is repealed as of Jan. 1, 2023 | 30 days |
| Data Protection Assessments | Required for targeted advertising, sale, sensitive data, certain profiling | Required for targeted advertising, sale, sensitive data, certain profiling | Annual cybersecurity audit and risk assessment requirements to be determined through regulations | N/A |
Rumors suggest that Senator Schumer is maneuvering to confirm Alvaro Bedoya as FTC Commissioner sooner rather than later, which would give FTC Chair Khan the majority she needs to move forward on multiple fronts. One of those fronts is consumer privacy, for which Khan has announced ambitious plans (discussed here and here) that have stalled for lack of Commissioner votes. With Bedoya potentially on deck, now seems like a good time to recap those plans, as they might provide clues about what’s in the pipeline awaiting Bedoya’s vote. We focus here on three priorities Khan has emphasized in statements and interviews since becoming Chair.
Privacy Rulemakings
At the top of the list are privacy rulemakings, which could create baseline standards for the entire marketplace and enable the FTC to obtain monetary relief in its cases. (Recall that the FTC has limited authority to obtain money in its cases, especially post AMG, but that it can seek penalties or redress when it’s enforcing a rule.) Last December, Khan issued a Statement of Regulatory Priorities detailing the privacy rulemakings she wants to initiate or complete, including:
Of note, absent Congressional legislation, any new privacy rules would need to follow the arduous process detailed in Section 18 of the FTC Act (referred to as “Mag-Moss” rulemaking). With Bedoya on board, the FTC can start these rulemakings, but they could still take years to complete, as we discuss here.
By contrast, the FTC can amend its existing privacy rules under the more manageable Administrative Procedure Act. Further, it’s already in the midst of rule reviews for all of the rules listed above (including the COPPA Rule review, which started back in 2019). As a result, the FTC could act on these rules relatively quickly once Bedoya is on board.
Focus on Platforms
Khan has also made clear that she intends to focus on the tech platforms – which she has described as “gatekeepers” that use their critical market position to “dictate terms,” “protect and extend their market power,” and “degrade privacy without ramifications.” In a statement and accompanying staff report last September, Khan stated that such efforts would include:
So far, we’ve seen limited action from the FTC on platforms (at least on the consumer protection side). Last October, the FTC issued a 6(b) report on the privacy practices of ISPs, but largely concluded that the topic should be addressed by the FCC. Then, in December, the FTC announced a settlement with online ad platform OpenX for COPPA violations. Given Khan’s bold plans in this area, it seems likely that there are matters in the pipeline awaiting Bedoya’s vote.
Stronger Remedies
The third major area that Khan has highlighted is obtaining stronger remedies in privacy cases – that is, considering “substantive limits”, not just procedural protections that “sidestep[] more fundamental questions about whether certain types of data collection and processing should be permitted in the first place.” By this, Khan is referring to deletion of data and algorithms, bans on conduct, notices to consumers, stricter consent requirements, individual liability, and monetary remedies based on a range of theories post AMG.
As to this priority, the FTC has moved ahead where it can (even prior to Khan’s tenure), often using strategies that have been able to garner unanimous votes. For example, its settlements with photo app Everalbum (for alleged deception) and WW International (for alleged COPPA violations) required deletion of consumer data and algorithms alleged to have been obtained illegally. Its settlement with fertility app Flo Health (for alleged deception about data sharing) required the company to notify affected consumers and instruct third parties that received their data to destroy it. The FTC also has alleged rule violations where possible, and partnered with other agencies to shore up its ability to obtain monetary relief.
But we’ve also seen signs of a more combative approach that could increase when Khan has the votes to push it forward. Of note, last September, the FTC issued an aggressive interpretation of the Health Breach Notification Rule, purporting to extend the rule’s reach (and thus its penalties) to virtually all health apps, even though a rule review was already underway. Further, FTC staff are making strong, often unprecedented demands for penalties, bans, and individual liability in consent negotiations. It’s even possible, based on an article written by former Commissioner Chopra and now-BCP Director Sam Levine, that the agency could attempt to use penalty offense notice letters (explained here) to lay the groundwork for penalties in privacy cases under Section 5(m)(1)(B). However, given the paucity of administratively litigated privacy cases (a key requirement under 5(m)(1)(B)), this would be very aggressive indeed.
* * *
For more on Khan’s privacy plans, you can read our earlier blogposts (here and here), as well as the various FTC statements and reports cited in this post. Or, if you like surprises, you can simply wait for Bedoya to be confirmed and see what happens. Needless to say, things should speed up at the FTC when he arrives.
In the absence of a federal privacy law, privacy has been at the forefront of many states’ legislative sessions this year:
Against this backdrop, state attorneys general continue to initiate investigations into companies’ privacy practices, and state agencies continue to advance privacy rulemakings under existing law.
Please join us on Thursday, March 24 at 4:00 pm ET for this webinar to learn about the latest developments in state privacy law, make sense of these developments and understand their practical impact.
Watch a video version here or the audio version here.
Shoshana Gillers has served as TransUnion’s Chief Privacy Officer since September 2019. In this role Ms. Gillers oversees compliance with privacy laws across TransUnion’s global footprint and promotes a culture of responsible data stewardship.
Prior to joining TransUnion, Ms. Gillers spent four years at JPMorgan Chase, ultimately serving as Vice President and Assistant General Counsel, Responsible Banking, Data and Privacy. Previously, she served as a federal prosecutor for eight years at the U.S. Attorney’s Office in Chicago, and as a litigator for four years at WilmerHale in New York. Ms. Gillers clerked for the Hon. Robert D. Sack on the U.S. Court of Appeals for the Second Circuit and for the Hon. Aharon Barak on the Supreme Court of Israel.
Ms. Gillers received a B.A. from Columbia University, summa cum laude, and a J.D. from Yale Law School.
Alysa chairs Kelley Drye’s Privacy and Information Security practice and delivers comprehensive expertise in all areas of privacy, data security and advertising law. Her experience ranges from strategic, consumer protection-oriented due diligence and compliance counseling to defending clients in FTC and state attorneys general investigations and competitor disputes.
Prior to joining the firm, Alysa was a federal clerk for the Honorable Joseph R. Goodwin, United States District Judge, Southern District of West Virginia.
Alysa received a B.A. from Haverford College, and a J.D. from the University of Maryland Carey School of Law.
Please join us for a webinar on February 24, 2022 at 4 p.m. on recent and upcoming FTC developments. The webinar will feature Kelley Drye’s Jessica Rich and Aaron Burstein, both former FTC officials. Here’s a taste of what we’ll be discussing, building on the commentary we have posted in this blog over the past few months:
All eyes are on the FTC this year, given its recent actions, setbacks, and ambitious plans for 2022.
As we’ve reported here, Chair Lina Khan has announced an aggressive privacy agenda that includes new regulations; emphasis on the large platforms and other “gatekeepers” in the marketplace; stringent enforcement remedies (such as data deletion, bans on conduct, strict consent requirements, and individual liability); and significant monetary relief based on a range of creative theories.
Khan has already taken steps in this direction, including by issuing a policy statement and guidance reinterpreting the Health Breach Notification Rule; announcing a ramp-up against subscription services that use “dark patterns” to trick consumers into signing up; tightening requirements under the Gramm-Leach-Bliley Safeguards Rule; and making strong demands in consent negotiations. In addition, she has announced plans to initiate privacy rulemakings under the FTC’s so-called “Magnuson-Moss” authority, including a rulemaking to limit “surveillance” in the commercial marketplace.
All of this takes place against the backdrop of recent setbacks and ongoing challenges faced by the agency. Last year, the Supreme Court ruled in AMG that the FTC cannot obtain monetary relief under Section 13(b) of the FTC Act, its chief law enforcement tool. For years, Congress has declined to pass a federal privacy law to strengthen the FTC’s authority in this area. The FTC has limited resources to fulfill its broad mission. And it cannot obtain civil penalties for most first-time law violations.
We will dive into these issues and more in our upcoming webinar, focusing on the practical impact for companies subject to FTC’s jurisdiction. Please join us on Thursday, February 24 at 4:00 pm EST for this second installment of Kelley Drye's 2022 practical privacy series. Register here.
Privacy Compliance Provisions
New Mexico’s injunction related to the Tiny Lab case includes changes to Google Play which will take effect after 120 days. Some of the specific measures include:
In addition to these injunctive provisions, Google agreed to a set of voluntary enhancements to the Google Education platform intended to promote safety for students. New Mexico’s enforcement of these provisions is limited to its ability to confirm that Google has made the changes, or inquire as to the status of changes not made.
These injunctions demonstrate continued state Attorney General scrutiny regarding children’s information. And they come at a time when the Federal Trade Commission, which is responsible for issuing the COPPA Rule, is redoubling its COPPA efforts. The FTC’s ongoing COPPA Rule Review includes a number of questions regarding the intersection of COPPA and education technology. The FTC’s Statement of Regulatory Priorities, which we wrote about here, identifies COPPA as a top priority. And just this week, the FTC released its first COPPA settlement in almost 18 months.
Additional Settlement Terms Depart from Historical State Settlements
Not to be ignored, several other provisions of the settlement have unique aspects that are extremely noteworthy. Google has agreed to pay New Mexico $5.5 million – with $1.65 million of that going to outside counsel for the state. The remaining payment will be used to fund the “Google New Mexico Kids Initiative” – a program jointly run by Google and New Mexico to award grants to schools, educational institutions, charitable organizations, or governmental entities. This unique allocation of the payment could draw the kind of scrutiny other State Attorney General settlements have met in the past when they attempted to designate funds to specific third-party recipients. Some state legislatures may see it as an effort to appropriate funds without their involvement.
While New Mexico reserves its rights under the agreement regarding public statements, it has agreed to provide Google 24-hour notice before making any written public statement. Moreover, New Mexico agrees to consider in good faith any suggestions or input Google has, and any statement will reference the parties’ shared commitment to innovation and education. States routinely resist any efforts to negotiate press in this manner, and it is unclear how enforceable a provision like this could really be anyway. That said, this certainly reflects the cooperative nature of the agreement, in which case it’s fair to assume the State would issue press reflecting such cooperation anyway.
Google and New Mexico have also agreed to an ADR provision, requiring the state to pursue any disputes relating to the agreement in mediation prior to pursuing relief. This again is unusual for a State AG settlement, as is the overall form of the document (a “Settlement Agreement and Release”) – normally states will only settle matters through a consent judgment or a statutorily authorized Assurance of Compliance or Discontinuance. But just like some of the other unique provisions, agreeing to ADR may be more a reflection of the cooperative nature of the agreement, and it certainly presents an opportunity for a more streamlined enforcement mechanism in the future.
It remains to be seen if these provisions will serve as a template for future state agreements with other companies, but given that state Attorneys General continue to pursue Google on a variety of fronts[1], New Mexico’s settlement will certainly be relevant in any future settlement efforts.
[1] Google Search Manipulation, Google Ad Tech, Google DOJ Search Monopoly, State of Arizona v. Google LLC geolocation privacy
Jessica and Laura join our impressive list of former FTC officials, including the firm’s managing partner, Dana Rosenfeld, who served as Assistant Director of BCP and attorney advisor to FTC Chairman Robert Pitofsky, former Bureau Directors Bill MacLeod and Jodie Bernstein, as well as Aaron Burstein, having served as senior legal advisor to FTC Commissioner Julie Brill.
Jessica served at the FTC for 26 years and led major initiatives on privacy, data security, and financial consumer protection. She is credited with expanding the FTC’s expertise in technology and was the driver behind FTC policy reports relating to mobile apps, data brokers and Big Data, the Internet of Things, and federal privacy legislation. She also directed the agency’s development of significant privacy rules, including the Children’s Online Privacy Protection Rule and Gramm-Leach-Bliley Safeguards Rule. She is a recipient of the FTC Chairman’s Award, the agency’s highest award for meritorious service, and the first-ever recipient of the Future of Privacy Forum’s Leadership Award. Jessica is also a fellow at Georgetown University’s Institute for Technology Law & Policy. Prior to joining Georgetown, she was an Independent Consultant with Privacy for America, a business coalition focused on developing a framework for federal privacy legislation.
Laura also brings significant experience to Kelley Drye. As Assistant Director for the FTC’s Division of Privacy & Identity Protection, Laura led the investigation and prosecution of matters relating to consumer privacy, credit reporting, identity theft, and information security. Her work included investigation initiation, pre-trial resolution, trial preparation, and trial practice relating to unreasonable software security, mobile operating system security update practices, and many other information privacy and identity protection issues. She joins the firm from AT&T where she served as an Assistant Vice President – Senior Legal Counsel advising business clients on consumer protection risks, developing and executing strategies in response to regulatory inquiries, and participating in policy initiatives within the company and across industry.
Jessica and Laura are an impressive duo and are sure to be an asset to our clients as they prepare for the future of privacy and evolving consumer protection law.
* * *
Subscribe here to Kelley Drye’s Ad Law News and Views newsletter to see another side of Jessica, Laura and others in our second annual Back to School issue. Subscribe to our Ad Law Access blog here.

The Colorado Legislature recently passed the Colorado Privacy Act (“ColoPA”), joining Virginia and California as states with comprehensive privacy legislation. Assuming Colorado Governor Jared Polis signs the bill (SB 21-190) into law, ColoPA will go into effect on July 1, 2023.
How does the measure stack up against the VCDPA and the CCPA (as amended by CPRA)? The good news is that, in broad terms, ColoPA generally does not impose significant new requirements that aren’t addressed under the CCPA or VCDPA. Below, we compare key provisions of ColoPA against California’s and Virginia’s laws and call attention to a few areas where Colorado has struck out on its own.
|  | ColoPA | VCDPA | CCPA |
| --- | --- | --- | --- |
| Thresholds to Applicability | Conduct business in CO or produce products or services targeted to CO and (a) control or process personal data of at least 100,000 consumers; or (b) derive revenue or receive a discount on the price of goods or services from selling personal data and control or process personal data of at least 25,000 consumers | Conduct business in or produce products or services targeted to VA and (a) control or process personal data of at least 100,000 consumers; or (b) derive over 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers | Conduct business in CA and collect personal information of CA residents and: (a) have $25 million or more in annual revenue for the preceding calendar year as of Jan. 1 of the calendar year; (b) annually buy, sell, or share personal data of more than 100,000 consumers or households; or (c) earn more than 50% of annual revenue from selling or sharing consumer personal information |
| Consent | Requires opt-in consent for processing sensitive personal data, including children’s data | Requires opt-in consent for processing sensitive personal data, and COPPA-compliant consent for processing children’s data | Requires opt-in consent for sharing PI for cross-context behavioral advertising for children under 16, including parental consent for children under 13 |
| Opt-Out | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for profiling, cross-contextual advertising, and sale; right to limit use and disclosure of sensitive personal information |
| Other Consumer Rights | Access, Deletion, Correction, Portability | Access, Deletion, Correction, Portability | Access, Deletion, Correction, Portability |
| Authorized Agents | Permitted for opt-out requests | N/A | Permitted for all requests |
| Appeals | Must create process for consumers to appeal refusal to act on consumer rights | Must create process for consumers to appeal refusal to act on consumer rights | N/A |
| Private Cause of Action | No | No | Yes, related to security breaches |
| Cure Period | 60 days until provision expires on Jan. 1, 2025 | 30 days | No |
| Data Protection Assessments | Required for targeted advertising, sale, sensitive data, certain profiling | Required for targeted advertising, sale, sensitive data, certain profiling | Annual cybersecurity audit and risk assessment requirements to be determined through regulations |
* * *
Subscribe here to our Ad Law News and Views newsletter and visit the Advertising and Privacy Law Resource Center for update information on key legal topics relevant to advertising and marketing, privacy, data security, and consumer product safety and labeling.
Kelley Drye attorneys and industry experts provide timely insights on legal and regulatory issues that impact your business. Our thought leaders keep you updated through advisories and articles, blogs, newsletters, podcasts and resource centers. Sign up here to receive our email communications tailored to your interests.
Contact:
Alysa Z. Hutnik [email protected]
Aaron Burstein [email protected]
For additional information, please visit:
(1) Notice for Sale of PI Collected Offline: Businesses that sell personal information collected offline must provide an offline notice by means such as providing paper copies or posting signs in a store, or giving an oral notice if collecting personal information over the phone.
(2) Opt-Out Icon: The revised regulations provide that businesses may use an opt-out icon in addition to, but not in lieu of, notice of a right to opt out or a “Do Not Sell My Personal Information” link.
(3) Do Not Sell Requests: A “Do Not Sell” request must “be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out.” The change prohibits businesses from using any method that is designed to or would have the effect of preventing a consumer from opting out. The revised regulation offers examples of prohibited opt-out practices, which include requiring a consumer to: (A) complete more steps to opt out than to re-opt in after a consumer had previously opted out; (B) provide personal information that is not necessary to implement the opt-out request; and (C) read through a list of reasons why he or she shouldn’t opt out before confirming the request.
(4) Consumer Requests from Authorized Agents: A business may now require an authorized agent who submits a request to know or delete to provide proof that the consumer gave the agent signed permission to submit the request. The regulations also preserve the options businesses previously had of requiring the consumer to verify their identity directly with the business or to confirm directly that they provided the authorized agent permission to submit the request.
(5) Children’s Information: The addition of the word “or” in section 999.332 requires businesses that sell personal information of children under the age of 13 “and/or” between the ages of 13 and 15 to describe in their privacy policies how to make opt-in to sale requests.
We will continue to monitor closely further developments in CCPA regulations.
On March 2, Governor Ralph Northam signed the Virginia Consumer Data Protection Act (VCDPA) into law, making Virginia the second state to enact comprehensive privacy legislation.
With the VCDPA on the books, companies have the next 22 months to prepare before the VCDPA and the California Privacy Rights Act (CPRA) go into effect. This post takes a look at the VCDPA provisions that are novel and require close attention during the transition period to the law’s January 1, 2023 effective date.
For better or worse, companies will need to prepare for the VCDPA without an obvious prospect of additional regulatory guidance. Unlike the regulatory structure the CCPA established (and the CPRA significantly expands), Virginia’s privacy law does not provide any state agency or official with rulemaking authority. However, the VCDPA could be just a first step. Governor Northam reportedly “will have an ongoing work group to continue to strengthen the law’s consumer protections,” and Virginia Delegate Cliff Hayes, who introduced the House version of the law, signaled that legislators are open to making such changes. It remains to be seen to what extent this group will recommend allocating additional funding to the Attorney General’s office to enforce the law, and what type of enforcement we may see. Historically, the office has not been as active as other state attorneys general on consumer protection matters outside of a fraud context.
We will watch closely for changes in Virginia and progress in other state privacy bills.
Building a successful privacy program requires much more than compliance with data protection laws. To thrive in today’s global, data-driven environment, companies also need to understand the political environment and public attitudes surrounding privacy in the countries in which they operate. Of course, companies must anticipate and adapt to changing privacy regulations as well. This webinar presented strategies to help meet these challenges, with a focus on setting up structures to join local awareness with global compliance approaches.
The webinar featured Kelley Drye attorney Aaron Burstein, along with Abigail Dubiniecki and Kris Klein of nNovation LLP.
To view the webinar recording, click here.