Ad Law Access
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access
Updates on advertising law and privacy law trends, issues, and developments

Health Data Privacy: What We’re Hearing
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/health-data-privacy-what-were-hearing
Tue, 12 Mar 2024 09:00:00 -0400

U.S. privacy developments are moving quickly, but health data privacy is racing forward. Companies that come into contact with consumers’ health data need to track and respond to a variety of developments. Most notably, these include Washington’s My Health My Data (MHMD) Act, a similar law in Nevada, “sensitive data” and “sensitive personal information” requirements under comprehensive state privacy laws, and FTC enforcement actions and guidance asserting that a broad range of health data is sensitive. How a company responds to these developments is likely to be iterative, given the lack of clarity and harmonization among these requirements and the substantial resources required to implement changes.

At the visible end of compliance, regulators expect companies to make detailed, specific disclosures and obtain opt-in consent for most health data collection, use, and sharing. Getting these disclosures and consents right, however, requires substantial preparatory work, starting with identifying the health data a company controls.

A few steps can be helpful in managing this uncertainty. First, adopting a framework to classify health data will lead to greater consistency and efficiency in this cornerstone compliance activity. Second, taking a clear-eyed view of the difficulty of obtaining consent will help set realistic business expectations for the use of health data in this challenging regulatory environment. Third, documenting a health data privacy program will help to maintain the program over time and demonstrate compliance to regulators and commercial partners.

Framework for Data Classification: Is It Health Data?

For many companies, determining whether they process health data, and which elements of their data inventories constitute health data, is a daunting task. The exercise can involve reviewing thousands of variables, segments, or personal data elements.

At present, there is little regulatory guidance and no common taxonomy of health data definitions, making it difficult to benchmark. In addition, health data definitions vary across state and federal regulators, and adopting a national approach based on the broadest definition might be infeasible from a business perspective.

A few strategies can help manage the uncertainty:

  • Work from explainable factors. Using factors that capture the overlap among different health data definitions will help establish consistent classifications and educate business stakeholders about when they’re encountering health data. Factors based on the current range of health data definitions include (a rough triage sketch follows this list):
    • Does the data reveal a specific health condition?
    • Does the data reveal a past or present health condition?
    • Does the data relate to a specific consumer?
    • Does the data relate to a sensitive health matter?
    • What kinds of harm (if any) could use or disclosure of the data reasonably cause to an individual?
    • Particularly important for Washington and Nevada: Does the data relate to a consumer’s past, present, or future health status?
  • Think holistically about classification. Classifying health data in a vacuum can lead to trouble. Regulatory definitions are broad, and it may be insufficient to analyze a data element on its own. Rather, it may be necessary to consider how the purpose of using or disclosing a specific data element bears on whether it is health data. It’s also possible that a data element is “health data” when under the control of one entity but not another. Understanding the business processes, contractual commitments, data sources, and other factors surrounding potential health data is therefore critical. In many cases, there won’t be a clear answer to whether a data element is health data. Being able to identify what’s clearly in, and out, of this category allows businesses to devote more time to genuinely debatable cases.
  • Think about scalability and sustainability. A one-time classification effort, even if it encompasses thousands of variables, might be feasible for many companies. Maintaining these classifications over time is another story. For companies with relatively static data inventories, maintenance over time is likely less challenging. When inventories change quickly, however, a case-by-case review of data elements might be impractical. Consider setting a cadence for review and designating privacy champions within the business to apply the framework on an ongoing basis, in coordination with legal support.
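To make the factor framework concrete, here is a minimal sketch of how the factors above could drive a first-pass triage of data elements. It is illustrative only: the factor names, routing logic, and example element are our own assumptions, not a legal test.

    from dataclasses import dataclass

    @dataclass
    class HealthDataFactors:
        reveals_specific_condition: bool        # reveals a specific health condition?
        reveals_past_or_present_condition: bool
        relates_to_specific_consumer: bool
        relates_to_sensitive_matter: bool
        relates_to_health_status: bool          # key factor for Washington (MHMD) and Nevada

    def triage(element: str, f: HealthDataFactors) -> str:
        """First-pass classification: clearly in, clearly out, or escalate."""
        if f.relates_to_health_status or f.reveals_specific_condition:
            return f"{element}: likely health data - route to consent/notice workflow"
        if not any([f.reveals_past_or_present_condition,
                    f.relates_to_specific_consumer,
                    f.relates_to_sensitive_matter]):
            return f"{element}: likely not health data"
        return f"{element}: debatable - escalate for case-by-case legal review"

    # Hypothetical data element from a marketing segment inventory
    print(triage("sleep_tracker_segment",
                 HealthDataFactors(False, True, True, False, True)))

A structure like this also supports the scalability point above: privacy champions can apply the same factors consistently, and only the escalation bucket consumes legal review time.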

There’s Consent, and Then There’s MHMD Consent

While the FTC and states with comprehensive privacy laws are moving toward requiring opt-in consent for most health data processing, MHMD creates particularly stringent consent requirements. The difference between MHMD and other health data regulations lies not in the action required for consent – it must be voluntary and unambiguous – but in the narrow scope of consent that is permissible and the details that must be disclosed to make the consent informed. (Although other regulators have not been as explicitly restrictive, there is a clear trend in this direction, as we discussed in our recent posts on the FCC’s one-to-one consent order.)

Specifically, a business must disclose the following to obtain consent to collect or share consumer health data (a simple completeness check follows the list):

  1. The categories of consumer health data collected or shared;
  2. The purpose of the collection, including the specific ways in which it will be used;
  3. The categories of entities with whom the consumer health data is shared; and
  4. How the consumer can withdraw consent from future collection or sharing of consumer health data.
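As a rough illustration of operationalizing these four elements, the sketch below models an MHMD consent disclosure as a record that can be checked for completeness before a consent prompt is shown. The field names and example values are hypothetical, and completeness here is a structural check, not a judgment of legal sufficiency.

    from dataclasses import dataclass

    @dataclass
    class MHMDConsentDisclosure:
        data_categories: list[str]       # 1. categories of consumer health data collected/shared
        purposes: list[str]              # 2. purposes, including specific ways data will be used
        recipient_categories: list[str]  # 3. categories of entities receiving the data
        withdrawal_instructions: str     # 4. how the consumer can withdraw consent

        def is_complete(self) -> bool:
            # Every required element must be present and non-empty
            return all([self.data_categories, self.purposes,
                        self.recipient_categories, self.withdrawal_instructions])

    disclosure = MHMDConsentDisclosure(
        data_categories=["heart-rate readings"],
        purposes=["personalized coaching within the app"],
        recipient_categories=["analytics vendors"],
        withdrawal_instructions="Settings > Privacy > Withdraw consent",
    )
    assert disclosure.is_complete()  # gate the consent prompt on a complete disclosure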

Meeting these standards might be infeasible for many businesses, particularly those that do not have direct relationships with consumers.

MHMD’s requirements to sell consumer health data are even more stringent. The law requires a valid authorization, which must include the name and contact information of the purchaser, be signed by the consumer, and expire within one year of signature, among other requirements. Obtaining an authorization outside of limited circumstances is unlikely to be practical for most companies.

The main alternative to consent or authorization is to restrict collection of health data under MHMD to what fits under the necessity exception. Washington has not provided further guidance on the scope of this exception, but we expect regulators to interpret this exception narrowly.

Documentation is Key

We get it: companies are reluctant to create discoverable documents that might be used to prove that they misinterpreted health data regulations. The alternative, however, is far worse and could be used to support the argument that a company systemically failed to govern health data in a reasonable fashion. It can also lead to inconsistent practices within a company and time-consuming back-and-forth between business and legal teams.

Key documents include health data definitions, consent requirements, partner diligence processes, data subject request procedures, and model contract terms. Many of the consumer health data practices that should be documented are likely extensions of current privacy programs and processes, such as data protection assessments.

Of course, some discussions warrant protection under the attorney-client privilege. Maintaining clear lines between discussions that provide legal advice and those that provide operational guidance to business teams will help draw defensible boundaries around privilege.

* * *

The acceleration in health data privacy regulation is adding to the demands on privacy teams that are already stretched thin. Confronting the breadth of “health data” definitions and the impact of these regulations on business operations in the absence of regulatory guidance is especially challenging. For better or worse, the boundaries of health data privacy regulations will be clarified through enforcement and MHMD’s private right of action. In the meantime, understanding these laws’ core purposes and keeping a close watch on statements from regulators will be helpful starting points for setting compliance priorities.

CFPB Previews Proposals that Could Fundamentally Shift Data Broker Business
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/cfpb-previews-proposals-that-could-fundamentally-shift-data-broker-business
Mon, 25 Sep 2023 12:00:00 -0400

In connection with its convening of a panel of small businesses to provide input on potential regulatory actions, the CFPB released an outline of its proposals to use its rulemaking authority under the Fair Credit Reporting Act (FCRA) to cover data brokers and prohibit the use of medical debt collection data in making credit decisions. While the outline does not include any specific language, it evidences the Bureau’s desire to fundamentally alter the data broker business model by expanding the definition of “consumer reporting agency” (CRA) to cover more data brokers and limiting their ability to share consumer information without a permissible purpose. The CFPB also seeks to prevent CRAs from providing credit header data to third parties for purposes beyond the scope of the FCRA. In effect, the Bureau intends to significantly curtail the sale of certain personal data for marketing purposes.

This is just the latest development showing an increased, nationwide focus on the practices of data brokers, which we have detailed in this blog, and which recently led to a strict new data broker regulation in the state of California. Depending on how the CFPB’s proposals play out, they could transform how data brokers are regulated in this country.

Background on the FCRA

The FCRA covers “consumer reports” and imposes restrictions on CRAs that create and sell these reports, furnishers that provide data to CRAs, and users that consider consumer reports when making eligibility determinations about consumers. The famously circular statute (“famous” being an admittedly relative term when discussing a federal statute) defines a consumer report to be the communication of any information by a CRA bearing on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer’s eligibility for (A) credit or insurance to be used primarily for personal, family, or household purposes; (B) employment purposes; or (C) any other permissible purpose authorized under FCRA section 604.

Meanwhile, CRA is defined as any person that regularly engages in whole or in part in the practice of assembling or evaluating consumer credit information or other information on consumers for the purpose of furnishing consumer reports to third parties.

At the risk of oversimplifying things, in general, CRAs are those entities that assemble information about consumers for the purpose of providing reports to third parties for use in making determinations about consumers’ eligibility for credit, employment, or housing. Data brokers, on the other hand, are entities that collect information about consumers to be provided to third parties for non-FCRA purposes such as fraud prevention and marketing. Sometimes enforcers have alleged that companies purporting to be data brokers were, in fact, CRAs because they were selling consumers’ information to third parties such as background screeners and employers (see, e.g., the FTC’s cases against Spokeo and TruthFinder). More often, though, data brokers sell consumer information for marketing purposes without triggering the FCRA. There has been widespread concern that these data brokers can amass incredibly sensitive information about consumers without their knowledge, and that consumers have no control over how the data is shared.

The CFPB’s Proposal

The CFPB’s proposal would classify any report that includes data such as payment history, income, or criminal records as a consumer report. That would mean that any data broker selling this information would be a CRA and would only be able to share it for a permissible purpose – that is, for use in eligibility determinations. [The outline does not include a proposed definition of “sell” but, depending on how it is defined, the scope of the provision’s reach could be quite expansive.] So, for example, a data broker could no longer provide information about a consumer that includes her individual or household income (more on this later) to a retailer for marketing purposes. Data brokers could no longer sell criminal records to individuals that want to vet their dates.

The CFPB is also considering whether it should define “assembling and evaluating” to cover intermediaries or vendors that facilitate the transfer of consumer report information. Traditionally, companies that were mere conduits of information have not been considered to be assembling and evaluating information — and, hence, were not viewed as CRAs (see FTC’s 40 Years Report at 29). It is unclear if the Bureau intends to include “dumb pipes” in its definition of CRA, or just those vendors that clean or organize data before providing it to their clients.

While CRAs are prohibited from providing consumer reports without a permissible purpose, there has been a longstanding exception for the provision of credit header information. In particular, reports limited to identifying information such as name, address, previous address, SSN, and phone number have been considered exempt from the definition of consumer report if they do not bear on one of the seven factors and are not used to make an eligibility determination (see 40 Years Report at 21). Relying on this exemption, CRAs have provided credit header data to purchasers for marketing and fraud detection purposes. The CFPB’s proposal would consider credit header data to be a consumer report and would eliminate a CRA’s ability to provide this information for fraud prevention or marketing.

The outline also includes discussion of the following topics:

Target Marketing

The Bureau is considering clarifying that CRAs cannot use any consumer report information for targeted marketing. The CFPB is concerned that CRAs may be using consumer report data to help customers target marketing, in violation of the FCRA. Per the Bureau, these CRAs may incorrectly believe that this use of data is outside the scope of the FCRA if they do not furnish the information directly to clients, but rather provide the marketing to the consumers themselves.

Aggregated and Household Data

Significantly, the CFPB is also contemplating whether aggregated and household level data should be considered a consumer report. This would be a major change. A prohibition on the use of aggregated and household level data, such as the average income in a geographic area, for marketing purposes would reverberate across the marketplace.

Consumer Consent

Consumers can permit CRAs to share their consumer reports by providing written consent. The CFPB’s outline notes that it is considering placing limitations on how (and by whom) the consent may be collected, as well as on the scope of the consent, presumably to ensure that the consent is informed and meaningful. It is also mulling mechanisms through which consent may be revoked.

Legitimate Business Need

Another aspect of a potential proposal would be to limit the scope of the permissible purpose allowing a user to obtain a consumer report when it has a legitimate business need in connection with a business transaction initiated by the consumer. The CFPB may specify that this permissible business purpose must be for a personal, family, or household purpose. A legitimate business purpose related to account review would require that the consumer report be necessary to make a determination about a consumer’s continued eligibility for the account.

Data Security

Regulators have long made clear that they see the privacy provisions of the FCRA (limiting the use of consumer reports to certain permissible purposes) as requiring CRAs to take reasonable measures to protect those reports (see, e.g., the FTC’s case against SettlementOne and the statement of Commissioners Brill, Leibowitz, Rosch, and Ramirez). The Bureau’s outline notes that its proposal may address CRAs’ data security obligations under the FCRA. In addition, the CFPB is considering whether it should hold CRAs strictly liable for data breaches by considering the unauthorized release of any consumer report to be a violation of Section 604, which prohibits furnishing a consumer report to anyone without a permissible purpose.

Disputes

Under the FCRA, consumers have the ability to dispute inaccurate information contained in their consumer reports with the CRA or directly with the furnisher of the information. Some private litigation has focused on whether CRAs and furnishers have a duty to investigate so-called legal disputes. The CFPB’s proposal would make clear that the FCRA requires investigation of both legal and factual disputes. Simply put, a legal dispute is a dispute that hinges on an interpretation of a law. The Bureau’s outline uses the example of a state foreclosure law. If a consumer disputes the accuracy of a report that lists him as having mortgage debt, the CFPB would require that the CRA investigate whether the state’s anti-deficiency statute required the debt to be extinguished.

In addition, the CFPB says it wants to tackle what it considers to be systemic issues that affect the completeness and accuracy of consumer reports – for example, outdated software or deficiencies in a furnisher’s policies and procedures to assure data accuracy. The outline notes that the CFPB is thinking about ways that CRAs and furnishers could be notified of potential systemic issues, which they would have to investigate and, if necessary, address. Among the proposals under consideration is a mechanism through which consumers could report suspected systemic issues. The CFPB is also considering whether consumers should be notified of any systemic issues that affected their reports, even if the issue was identified in response to a complaint from another consumer. This could potentially result in consumers receiving notices about issues at CRAs that may have affected their reports, but did not have a negative impact on them because the inaccurate reports were not shared.

Medical Debt Collection Information

Finally, the CFPB is considering revising Reg V which, among other things, covers medical debt collection information. The potential revisions would prohibit creditors from using this information to make credit eligibility determinations, and prohibit CRAs from including this information on consumer reports for credit eligibility. Medical debt collection information has long been a source of concern for consumers, legislators, and regulators, since it can prevent consumers from obtaining credit following a medical emergency or be used to coerce consumers into paying spurious or false unpaid medical bills. In addition, the CFPB believes there is compelling evidence that this information does not have predictive value for credit decisions. The Big 3 CRAs ceased reporting paid medical collection debt, medical collection debt under $500, and any medical collection debt that is less than one year past due. The Bureau’s proposal would further limit the ability of medical debt collection tradelines to affect a consumer’s ability to obtain credit.

Next Steps

The CFPB is accepting comments on this outline until October 30, 2023, and is especially interested in feedback from small businesses that would be affected by the rule. Once the Bureau completes this process, which is required under the Small Business Regulatory Enforcement Fairness Act of 1996 (SBREFA), it can issue a more formal rulemaking proposal, which will be put out for public comment. It seems unlikely that any proposal would be announced before 2024. However, the CFPB is clear that it envisions a sea change in the scope of the FCRA, and businesses should be ready to provide input and comment.

State AGs and Consumer Protection: What We Learned from . . . Connecticut Part I
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/state-ags-and-consumer-protection-what-we-learned-from-connecticut-part-i
Thu, 11 May 2023 10:34:28 -0400

Our State AG webinar series continues with Connecticut Attorney General William Tong and Michele Lucan, Chief of the Privacy Consumer Protection Section. During our webinar, the Connecticut AG’s office described its structure and the tools available to it to enforce the state’s consumer protection laws. In particular, as the fifth state to pass comprehensive privacy legislation, Connecticut has an active privacy agenda, and AG Tong highlighted the office’s privacy priorities, which we focus on here in Part I. We will explore the more general consumer protection topics in Part II. In case you missed it, here is a recording of the webinar.

While the Connecticut Unfair Trade Practices Act (CUTPA, Connecticut’s UDAP law) is broad and robust, in the privacy and cybersecurity space the AG has additional authority derived from specific state laws, such as the Data Breach Notification law and Connecticut’s Data Privacy Act (CTDPA). General Tong noted that Connecticut’s dedication to consumer protection as it relates to privacy traces back to at least 2011, when it became the first state to create a Privacy Task Force, followed by a standalone Privacy Section in 2015.

Enforcing the CTDPA

AG Tong noted that the CTDPA reflects a “philosophical judgment of Connecticut to return rights and power of authority to consumers regarding their Personal Information.” As we have previously reported, the CTDPA provides for several rights such as the right to access, right to portability, right to correct mistakes, right to deletion, and the right to opt out of targeted advertising, sale, and profiling of personal data.

The CTDPA also creates obligations for “controllers” which are entities that alone or jointly determine the purpose and means of processing of personal data. Some of these obligations include: minimizing data collection and storage, providing transparency about the types of data collected and why, ensuring that data is secure, and obtaining consent to process sensitive data. Notably, the CTDPA also provides heightened protections for data related to teenagers, a hot topic for State AGs. Controllers must obtain consent to sell teens’ data or conduct targeted advertising to teens.

The Connecticut AG has the exclusive authority to enforce the CTDPA’s provisions, making their insights all the more valuable. However, the law provides for a cure period. This means that if the AG’s office is aware of a potential violation, the office will reach out to the entity and issue a notice of violation if the AG determines that a cure is possible. If the controller fails to cure within sixty (60) days, then the AG may bring an action against the entity. Similar to the data breach notification law discussed below, a violation is a per se violation of CUTPA.

Connecticut AG’s Advice: How to Prepare for Compliance with the CTDPA

With the CTDPA’s effective date quickly arriving on July 1, 2023, the Connecticut AG’s office offered its own recommendations on how to prepare for compliance with the new law:

  • Applicability. Entities should determine whether they meet the thresholds to trigger CTDPA obligations.
  • Data Inventory. Entities should understand what data they are collecting and where it lives, while also thinking about how to minimize data collection if possible.
  • Consumer Facing Updates. Entities should review their privacy policies to ensure they are up to date, and that entities are prepared to operationalize and effectuate the mechanisms for consumers to take advantage of their privacy rights (i.e. ensure links work).
  • Internal Updates. Entities should review and update their vendor contracts to address CTDPA requirements and conduct employee training to minimize data security risks.

Safeguards and Data Breach Notice Laws

The Connecticut Safeguards Law, referred to by the office as the basic building blocks of Connecticut’s privacy infrastructure, requires any person in possession of Personal Information (PI) to safeguard the data against misuse by third parties and to destroy, erase, or make the data unreadable prior to disposal. Penalties under the Safeguards Law can be significant—up to $500 per intentional violation and up to $500,000 for a single event.

Connecticut defines PI as information capable of being associated with a particular individual through one or more identifiers. The AG’s office noted that PI is broadly defined. For instance, PI includes a person’s name, but also covers other identifiers including social security numbers, driver’s license numbers, credit/debit card numbers, passport numbers, biometric information, online account credentials, and certain medical information.

Connecticut’s Breach Notification Law requires an entity that experiences a data breach to notify the Connecticut AG without “unreasonable delay,” and in no event more than 60 days after discovery. The law also requires the entity to provide two years of identity theft prevention services if Social Security numbers or taxpayer identification numbers (ITINs) are compromised. A violation of this law is a per se violation of CUTPA. Last year, Connecticut received over 1,500 data breach notifications, and the office is experienced in reviewing all types of data breaches and determining which ones warrant attention.
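For illustration only, the outer notification deadline is simple date arithmetic from the date of discovery (the example date below is hypothetical):

    from datetime import date, timedelta

    discovery = date(2023, 5, 1)  # hypothetical date the breach was discovered
    outer_deadline = discovery + timedelta(days=60)
    print(f"Notify the Connecticut AG without unreasonable delay, "
          f"and no later than {outer_deadline}")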

Our Take

Connecticut has consistently been a leader on data security and privacy issues over the last decade, and with the passage of the CTDPA we expect to see the office double down on enforcement efforts. Businesses should pay particular attention to the compliance tips highlighted above by Ms. Lucan and General Tong, as there is little doubt the office will be actively looking for targets right out of the gate on July 1. In General Tong’s words, “data privacy and the law of data privacy are here. Its obligations are here, present, and they are demanding.” Privacy laws can’t be approached as “optional” or “too cumbersome” to take precautions and manage the risks of collecting data. “Law enforcement will take action where we believe people have failed to meet their obligations under the law,” as that is what people in the state of Connecticut “expect and demand.”

Given Connecticut’s leadership in the multistate Attorney General community, we would not be surprised to see other states joining Connecticut in enforcement efforts, even without a comprehensive privacy law (relying on their UDAP authority as states have done for decades). Understanding your data collection and security practices is more important than ever.

***

Be sure to look out for Part II of this blog post, where we will discuss Connecticut’s UDAP law in more detail, as well as the priorities and additional tools the Connecticut AG’s office uses to enforce consumer protection laws. We also have an exciting blog post recapping our conversation with the Nebraska Attorney General just around the corner. Stay tuned.

What’s in the Indiana Consumer Data Protection Act?
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/whats-in-the-indiana-consumer-data-protection-act
Wed, 19 Apr 2023 11:13:52 -0400

Indiana’s Consumer Data Protection Act advanced in the state legislature last week and now heads to Governor Eric J. Holcomb’s desk. The bill mirrors comprehensive privacy legislation enacted in Virginia, Utah, and Iowa, further extending the reach of privacy protections in the United States but without the complex mandates found in the California, Colorado, and Connecticut laws. Following on the heels of Iowa’s Act Relating to Consumer Data Protection, Indiana’s law is expected to be the second state privacy law enacted this year, and the seventh comprehensive state privacy law overall.

The following are highlights of the pending Indiana bill:

  • Effective Date. If signed into law, the Indiana law would take effect January 1, 2026.
  • Applicability. Indiana’s privacy law applies to companies that do business in Indiana and meet certain thresholds: processing personal data of more than 100,000 Indiana consumers, or processing personal data of more than 25,000 Indiana consumers while deriving more than 50 percent of gross revenue from the “sale” of personal data (see the threshold sketch after this list). The law does not apply to government entities (including third parties while doing business with those entities), nonprofits, public utilities, or institutions of higher education. The law also does not apply to Covered Entities or Business Associates subject to HIPAA, or to financial institutions or data subject to the Gramm-Leach-Bliley Act. Certain activities of consumer reporting agencies and furnishers (and users) of consumer reports, where regulated by the Fair Credit Reporting Act, are exempt.
  • Employee and B2B Exceptions. The Indiana law does not apply to personal data of employees or individuals acting in a commercial context.
  • Opt-Out of Sale and Targeted Advertising. The Indiana law provides a right to opt-out of the sale of personal data, defined as “the exchange of personal data for monetary consideration by a controller to a third party.” The law also creates a right to opt-out of targeted advertising, defined as “displaying of an advertisement to a consumer in which the advertisement is selected based on personal data obtained from that consumer’s activities over time and across nonaffiliated websites or online applications to predict the consumer’s preferences or interests.” These definitions mirror the Virginia law now in effect.
  • Consent to Process Sensitive Data. The Indiana law requires consent to process sensitive data, similar to the Virginia, Colorado, and Connecticut laws. Sensitive data is defined to include personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis made by a health care provider, sexual orientation, citizenship and immigration status; genetic and biometric data that identifies an individual; precise geolocation data; and personal data collected from a known child. A unique element of this definition is that sensitive data only includes health information to the extent a diagnosis has been made by a health care provider.
  • Consumer Rights. The Indiana law includes the now common rights found in other state privacy laws, such as to: access personal data in a portable format, delete personal data, and correct inaccurate personal data.
  • Contract Terms. The Indiana law requires a contract between controllers and processors to include specific contractual provisions relating to the processor’s handling of personal data and the controller’s audit rights. These contract terms mirror requirements in the Virginia and Colorado laws.
  • Enforcement and Regulation. The Indiana law provides for a 30-day right to cure violations. If a business fails to cure a violation, the Attorney General may initiate an action for injunctive relief and civil penalties of up to $7,500 per violation. There is no private right of action in the law.
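As a minimal sketch of the applicability thresholds described above (ignoring the entity-level and data-level exemptions, which require separate analysis), the check reduces to two numbers. The function name and inputs are our own illustration:

    def indiana_cdpa_applies(in_consumers: int, pct_revenue_from_sale: float) -> bool:
        """Illustrative threshold check for a company doing business in Indiana.

        Thresholds: >100K Indiana consumers, or >25K consumers plus >50% of
        gross revenue derived from the "sale" of personal data.
        """
        return in_consumers > 100_000 or (
            in_consumers > 25_000 and pct_revenue_from_sale > 50.0
        )

    print(indiana_cdpa_applies(30_000, 60.0))  # True: >25K consumers and >50% sale revenue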

The following chart summarizes and compares requirements of current U.S. state privacy laws (subject to exceptions stated in each law):

  • California (CA) – California Privacy Rights Act (Effective Jan. 1, 2023)
  • Virginia (VA) – Virginia Consumer Data Protection Act (Effective Jan. 1, 2023)
  • Colorado (CO) – Colorado Privacy Act (Effective July 1, 2023)
  • Connecticut (CT) – Connecticut Act Concerning Personal Data Privacy (Effective July 1, 2023)
  • Utah (UT) – Utah Consumer Privacy Act (Effective Dec. 31, 2023)
  • Iowa (IA) – Act Relating to Consumer Data Protection (Effective Jan. 1, 2025)
  • Indiana (IN) – Indiana Consumer Data Protection Act (Effective Jan. 1, 2026)

Thresholds to Applicability

  • CA: Conducts business in CA, determines the purposes and means of processing personal info. of CA residents, and meets one of the following thresholds: >$25 million in annual revenue in the preceding year; buys/sells personal info. of >100K consumers or households; or earns >50% of annual revenue from selling or sharing personal info.
  • CO: Conducts business in CO or targets products or services to CO residents, and meets either of these thresholds: processes personal data of >100K consumers in a year; or earns revenue or receives a discount from selling personal data and processes personal data of >25K consumers.
  • VA: Conducts business in VA or targets products or services to VA residents, and meets either of these thresholds: processes personal data of >100K consumers; or processes personal data of >25K consumers and derives >50% of gross revenue from the sale of personal data.
  • UT: Conducts business in UT or targets products or services to UT residents, has more than $25 million in annual revenue, and either processes personal data of >100K consumers during a calendar year, or processes personal data of >25K consumers and derives >50% of revenue from the sale of personal data.
  • CT: Produces products or services that are targeted to CT residents, and in the preceding year either processed personal data of >100K consumers (excluding payment transaction data), or processed personal data of >25K consumers and derived >25% of revenue from the sale of personal data.
  • IA: Conducts business in IA or targets products or services to IA residents, and during a calendar year processes personal data of >100K consumers, or processes personal data of >25K consumers and derives >50% of revenue from the sale of personal data.
  • IN: Conducts business in IN or targets products or services to IN residents, and during a calendar year processes personal data of >100K consumers, or processes personal data of >25K consumers and derives >50% of revenue from the sale of personal data.

Sales

  • CA: Right to opt-out of the sale of personal information. Opt-in consent required to “sell” personal information of minors under age 16.
  • CO: Right to opt-out of the sale of personal data.
  • VA: Right to opt-out of the sale of personal data. The definition of a “sale” requires monetary consideration.
  • UT: Right to opt-out of the sale of personal data. The definition of a “sale” requires monetary consideration.
  • CT: Right to opt-out of the sale of personal data. Opt-in consent required to “sell” personal data of minors 13 to 16.
  • IA: Right to opt-out of the sale of personal data. The definition of a “sale” requires monetary consideration.
  • IN: Right to opt-out of the sale of personal data. The definition of a “sale” requires monetary consideration.

Targeted Advertising

  • CA: Right to opt-out of the “sharing” of personal information for purposes of cross-context behavioral advertising. Opt-in consent required to “share” personal information of minors under age 16.
  • CO: Right to opt-out of targeted advertising.
  • VA: Right to opt-out of targeted advertising.
  • UT: Right to opt-out of targeted advertising.
  • CT: Right to opt-out of targeted advertising. Opt-in consent required for processing personal data of minors 13 to 16 for targeted advertising.
  • IA: No explicit right to opt-out of targeted advertising, but a controller must still disclose how a consumer can opt out of targeted advertising.
  • IN: Right to opt-out of targeted advertising.

Global Privacy Controls

  • CA: Yes (optional, subject to regulatory process)
  • CO: Yes, required by July 1, 2024
  • VA: No
  • UT: No
  • CT: Yes, required by Jan. 1, 2025
  • IA: No
  • IN: No
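Where a universal opt-out signal must be honored, the Global Privacy Control specification has browsers send the signal as an HTTP request header, Sec-GPC: 1. Below is a minimal server-side sketch, assuming a Flask app; the opt-out hook is a hypothetical application function, not a library API:

    from flask import Flask, request

    app = Flask(__name__)

    def mark_opted_out(req):
        """Hypothetical hook: record an opt-out of sale/targeted ads for this visitor."""
        ...

    @app.route("/page")
    def page():
        # Browsers with Global Privacy Control enabled send "Sec-GPC: 1"
        if request.headers.get("Sec-GPC") == "1":
            # Treat as an opt-out of sale/targeted advertising where the
            # applicable state law requires honoring the signal (e.g., CO, CT)
            mark_opted_out(request)
        return "ok"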

Sensitive Data

  • CA: Right to limit the use and disclosure of sensitive personal information.
  • CO: Consent to process sensitive data.
  • VA: Consent to process sensitive data.
  • UT: Provide notice and an opportunity to opt out of processing of sensitive data.
  • CT: Consent to process sensitive data.
  • IA: Provide notice and an opportunity to opt out of processing of sensitive data.
  • IN: Consent to process sensitive data.

Profiling

  • CA: Pending regulations.
  • CO: Right to opt-out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer.
  • VA: Right to opt-out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.
  • UT: N/A
  • CT: Right to opt-out of profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer.
  • IA: N/A
  • IN: Right to opt-out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.

Minor & Children’s Data

  • CA: Opt-in consent required to “sell” or “share” personal information of minors under age 16.
  • CO: COPPA exception; obtain parental consent to process personal data concerning a known child.
  • VA: Process sensitive data of a known child in accordance with COPPA.
  • UT: Process personal data of a known child in accordance with COPPA.
  • CT: Process sensitive data of a known child in accordance with COPPA. Consent required to sell personal data of minors 13 to 16 or process their personal data for targeted advertising.
  • IA: Process sensitive data concerning a known child in accordance with COPPA.
  • IN: Process sensitive data of a known child in accordance with COPPA.

Consumer Rights

  • CA: Access, Deletion, Correction, Portability
  • CO: Access, Portability, Deletion, Correction
  • VA: Access, Portability, Deletion, Correction
  • UT: Access, Portability, Deletion
  • CT: Access, Deletion, Correction, Portability
  • IA: Access, Portability, Deletion
  • IN: Access, Deletion, Correction, Portability

Authorized Agent

  • CA: Permitted for all consumer rights requests
  • CO: Permitted for opt-out requests
  • VA: N/A
  • UT: N/A
  • CT: Permitted for opt-out requests
  • IA: N/A
  • IN: N/A

Appeals

  • CA: N/A
  • CO: Must create a process for consumers to appeal a refusal to act on consumer rights.
  • VA: Must create a process for consumers to appeal a refusal to act on consumer rights.
  • UT: N/A
  • CT: Must create a process for consumers to appeal a refusal to act on consumer rights.
  • IA: Must create a process for consumers to appeal a refusal to act on consumer rights.
  • IN: Must create a process for consumers to appeal a refusal to act on consumer rights.

Private Right of Action

  • CA: Yes, for security breaches involving certain types of sensitive personal information.
  • CO: No
  • VA: No
  • UT: No
  • CT: No
  • IA: No
  • IN: No

Cure Period

  • CA: 30-day cure period is repealed as of Jan. 1, 2023.
  • CO: 60 days, until the provision expires on Jan. 1, 2025.
  • VA: 30 days
  • UT: 30 days
  • CT: 60 days, until the provision expires on Dec. 31, 2024. Starting Jan. 1, 2025, the AG may grant the opportunity to cure.
  • IA: 90 days
  • IN: 30 days

Data Protection Assessments

  • CA: Annual cybersecurity audit and risk assessment requirements to be determined through regulations.
  • CO: Required for targeted advertising, sale, sensitive data, and certain profiling.
  • VA: Required for targeted advertising, sale, sensitive data, and certain profiling.
  • UT: N/A
  • CT: Required for targeted advertising, sale, sensitive data, and certain profiling.
  • IA: N/A
  • IN: Required for targeted advertising, sale, sensitive data, certain profiling, and activities that present a heightened risk of harm to consumers.
DNA Diagnostics Center Settles Data Breach with Ohio and Pennsylvania Attorneys General
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/dna-diagnostics-center-settles-data-breach-with-ohio-and-pennsylvania-attorneys-general
Tue, 21 Mar 2023 09:35:45 -0400

On February 16, 2023, the Attorneys General of Ohio and Pennsylvania announced a settlement with Ohio-based DNA Diagnostics Center (“DDC”) over a 2021 data breach that involved 2.1 million residents nationwide, including the Social Security numbers of over 45,000 Ohio and Pennsylvania residents. As part of the settlement, which resolves alleged violations of Ohio and Pennsylvania consumer protection laws, DDC will pay $400,000 in fines and will be required to implement improved security practices.

DDC, one of the world’s largest private DNA testing companies, suffered the breach in November 2021. The breach involved databases that were not used for any active business purpose, but had been acquired by DDC as part of its 2012 acquisition of Orchid Cellmark. These databases contained the personal information of over 2 million individuals who received DNA testing services between 2004 and 2012, including names, payment information, and Social Security numbers. DDC claims it was unaware that this data was transferred as part of its acquisition of Orchid.

DDC allegedly received indications of suspicious activity in the database from a security vendor as early as May 2021, but did not activate its incident response plan until August 2021, after the vendor identified signs of malware. Threat actors had loaded malware onto DDC’s network that ultimately facilitated the extraction of patient data, which was subsequently used to extort a payment from DDC in exchange for its promised deletion. In its internal investigation of the incident, DDC found that an unauthorized third party had logged in via VPN on May 24 using a DDC account, having harvested credentials from a domain controller that provided password information for each account in the network. The Assurance of Voluntary Compliance (“AOC”) noted that at the time the hacker accessed the VPN, DDC had recently migrated to a different VPN, meaning no one should have been using the VPN the hackers used. Furthermore, the AOC notes that the threat actor used a decommissioned server to exfiltrate the data.

Prior to the breach, DDC conducted an inventory assessment and penetration test on its systems; however, the legacy databases that stored sensitive personal information in plain text were not identified because the assessments focused solely on active customer data.

The Ohio and Pennsylvania Attorneys General alleged that DDC engaged in deceptive or unfair business practices by making material misrepresentations in its customer-facing privacy policy concerning its safeguarding of its customers’ personal information. The policy represented that the company implemented “reasonable measures to detect and prevent unauthorized access to [DDC’s] computer network.”

The settlement requires DDC to develop, implement, and maintain a comprehensive information security program that is reasonably designed to safeguard the security, integrity, and confidentiality of the company’s collected, stored, transmitted, and/or maintained personal information. Additionally, DDC’s information security program must include documented methods and criteria for handling information security risks to such personal information. On an annual basis, the company must also conduct comprehensive risk assessments, provide security awareness training to appropriate personnel, and evaluate the overall effectiveness of its information security program.

The settlement specifically requires DDC to implement the following safeguards:

  • The assessment of risks associated with acquired technical assets (e.g., systems applications, or devices) containing personal information and the subsequent removal of such information serving no legitimate business purpose or utility to consumers;
  • Personal information must be transmitted and stored so that it is only accessible to people and systems that need such information for a legitimate business purpose;
  • The maintenance of an updated data/asset inventory of DDC’s entire network and the disabling and/or removal of any unnecessary assets;
  • The implementation of an incident response plan that mandates DDC employees to respond to any alerts generated from the company’s security monitoring systems, along with the documentation of actions to such alerts;
  • The detection, investigation, containment, response to, eradication, and recovery from security incidents within reasonable time periods;
  • Authentication protocols to ensure that people and systems utilizing credentials are who they purport to be, including through multi-factor authentication, one-time passcodes, and location-specific requirements (see the one-time passcode sketch after this list);
  • The implementation and maintenance of logging and log monitoring policies and procedures designed to collect, manage, and analyze security logs and monitor where the company stores personal information (to identify, understand, or recover from an attack);
  • The maintenance and support of up-to-date software on DDC’s network; and
  • Technical measures for the detection and response to suspicious network activity within DDC’s network, which may include log correlation and alerting, file integrity monitoring, data integrity monitoring, SIEM systems, intrusion detection and prevention systems, and threat management systems.
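By way of example, the one-time passcode requirement in the authentication bullet above is commonly implemented with time-based one-time passwords (TOTP). Here is a minimal sketch using the third-party pyotp library (an assumption, since the settlement does not prescribe a particular implementation):

    import pyotp

    # Enrollment: generate a per-user secret and store it server-side, encrypted at rest
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    def second_factor_ok(user_code: str) -> bool:
        # valid_window=1 tolerates one 30-second step of clock drift
        return totp.verify(user_code, valid_window=1)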

Takeaways

With this settlement, we get insight into the corrective actions that two different Attorneys General believe are appropriate to prevent future incidents. Data breaches are often caused by a string of security failures, as is alleged to have happened here: ignored security alerts; failure to monitor network activity, such as the VPN used by the threat actor that should no longer have been in use; data mapping that was not thorough enough to flag the Orchid systems as in scope for penetration testing; and failure to properly decommission the server that allowed the hackers to exfiltrate data. With such specific injunctive terms, this settlement provides businesses with a good example of how Attorneys General interpret the requirement to implement “reasonable” safeguards for personal information. It’s also a reminder that companies should review their privacy policies and other public-facing representations regarding security, and ensure they aren’t making promises they aren’t living up to.

FCC Seeks Comments on Updates to CPNI Breach Reporting Rule
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/fcc-seeks-comments-on-updates-to-cpni-breach-reporting-rule
Tue, 24 Jan 2023 15:04:28 -0500

The Federal Communications Commission (“FCC” or “Commission”) is seeking comments on a Notice of Proposed Rulemaking (NPRM) to refresh its customer proprietary network information (“CPNI”) data breach reporting requirements (the “Rule”). Adopted earlier this month by a unanimous 4-0 vote of the Commission, the NPRM solicits comments on rule revisions that would expand the scope of notification obligations and accelerate the timeframe for notifying customers after a data breach involving telephone call detail records and other CPNI. The FCC cites “an increasing number of security breaches of customer information” in the telecommunications industry in recent years and the need to “keep pace with today’s challenges” and with best practices that have emerged under other federal and state notification standards as reasons to update the Rule.

According to the current Rule, a “breach” means that a person “without authorization or exceeding authorization, has intentionally gained access to, used, or disclosed CPNI.” As summarized in the NPRM, CPNI includes “phone numbers called by a consumer, the frequency, duration, and timing of such calls, the location of a mobile device when it is in active mode (i.e., able to signal its location to nearby network facilities), and any services purchased by the consumer, such as call waiting.” (The NPRM does not propose any changes to the definition of CPNI.)

Initially adopted in 2007 as part of a broader effort to combat pretexting – the practice of pretending to be a customer in order to obtain that customer’s telephone records – the current data breach notification rule (47 CFR § 64.2011) requires telecommunications carriers, including interconnected VoIP providers, to provide notice of a data breach involving CPNI to the Secret Service, the FBI, and affected customers. Carriers must notify law enforcement within seven business days through the reporting facility at http://www.fcc.gov/eb/cpni, and then must wait an additional seven business days before notifying customers about a breach (barring any objection from law enforcement officials).

The NPRM solicits comments on a series of potential changes to the Rule, including:

  • Removing the intent standard: Under the current Rule, a breach is reportable when a person intentionally, and without authorization or exceeding authorization, gains access to, uses, or discloses CPNI. The NPRM proposes removing the intent standard, explaining that “inadvertent” disclosures of CPNI can still impact individuals, and that intent may not be immediately apparent “which may lead to legal ambiguity or under-reporting.” The FCC seeks comments on the benefits and burdens of this proposal and whether other data breach laws should influence the policy it adopts.
  • Adding a harm-based reporting trigger: The FCC proposes to include a harm-related reporting trigger, in an effort to avoid notifying customers about breaches that are not likely to cause harm – what the FCC terms “notice fatigue.” As an example, many data breach laws do not require notification about a data breach involving encrypted information based in part on a harm calculation. The FCC also solicits comments on how to determine and quantify “harm” in the context of CPNI.
  • Expanding the notice requirement: The NPRM asks whether the FCC has authority to include in its Rule – and should include – information that is not considered CPNI, such as Social Security numbers or other financial records.
  • Notice to the FCC: The FCC proposes that carriers should notify the FCC, in addition to the FBI and Secret Service, about CPNI breaches. It seeks comment on the costs and benefits of requiring such notification.
  • Notice Timeline: The FCC proposes removing the seven-business day waiting period to notify customers about a CPNI data breach, instead requiring notification “without unreasonable delay” after discovery of a breach, unless a law enforcement agency requests that the carrier delay notification. The FCC tentatively concludes this approach is consistent with other laws and better serves the public interest than the current requirement.
  • Minimum Requirements for Notice Content: The current rule does not address the content of notifications, and the NPRM solicits comment on whether to adopt a floor for information that must be included in data breach notices to consumers. The FCC notes that many state data breach notification laws impose minimum content requirements, requiring notices to describe what information was subject to the breach, the date(s) of the breach, how the breach occurred, and what steps were taken to remedy the situation.

Finally, the NPRM raises the question of the FCC’s legal authority to adopt its proposed changes to its Rule, particularly in light of the fact that Congress nullified the 2016 revisions to its Rule (2016 Privacy Order) under the Congressional Review Act.

Comments on the NPRM are due on February 22, and reply comments are due on March 24.

AG Settlements Call for Stronger Data Security
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ag-settlements-call-for-stronger-data-security
Thu, 10 Nov 2022 12:09:10 -0500

Earlier this week, a coalition of 40 attorneys general obtained two multistate settlements with Experian concerning data breaches it experienced in 2012 and 2015 that compromised the personal information of millions of consumers nationwide. The 2012 breach investigation was co-led by the Massachusetts and Illinois AG offices, and the 2015 investigation was co-led by the AGs of Connecticut, DC, Illinois, and Maryland. An additional settlement was reached with T-Mobile in connection with the 2015 Experian breach, which impacted more than 15 million individuals who submitted credit applications with T-Mobile.

In an effort to change corporate behavior, both settlements require Experian and T-Mobile to enhance their data security practices and to pay a combined amount of more than $16 million. Experian has agreed to bolster its due diligence and data security practices by adhering to the following:

  • prohibition against misrepresentations to its clients regarding the extent to which Experian protects the privacy and security of personal information;
  • implementation of a comprehensive Information Security Program, incorporating zero-trust principles, regular executive-level reporting, and enhanced employee training;
  • due diligence provisions requiring the company to properly vet acquisitions and evaluate data security concerns prior to integration;
  • data minimization and disposal requirements, including specific efforts aimed at reducing use of Social Security numbers as identifiers; and
  • specific security requirements, including with respect to encryption, segmentation, patch management, intrusion detection, firewalls, access controls, logging and monitoring, penetration testing, and risk assessments.

T-Mobile has agreed to bolster its vendor oversight moving forward, including:

  • implementation of a Vendor Risk Management Program;
  • maintenance of a T-Mobile vendor contract inventory, including vendor criticality ratings based on the nature and type of information that the vendor receives or maintains;
  • imposition of contractual data security requirements on T-Mobile’s vendors and sub-vendors, including related to segmentation, passwords, encryption keys, and patching;
  • establishment of vendor assessment and monitoring mechanisms; and
  • taking appropriate action in response to vendor non-compliance, up to contract termination.

Note that the settlement with T-Mobile doesn't concern the unrelated data breach announced by T-Mobile in August 2021, which is currently under investigation by a collection of states.

Alongside the 2015 data breach settlements, Experian has agreed to pay an additional $1 million to resolve an independent multistate investigation into another Experian-owned company, Experian Data Corp. (“EDC”). That investigation concerned EDC’s failure to prevent, or provide notice of, a 2012 data breach initiated by an identity thief posing as a private investigator, who was given access to sensitive personal information stored in EDC’s commercial databases. As a result, EDC has agreed to strengthen its vetting and oversight of third parties to which it provides personal information, investigate and report data security incidents to the Attorneys General, and maintain a “Red Flags” program to detect and respond to prospective identity theft.

Although every state has a breach notification law that generally gives rise to these types of enforcement actions, companies may wonder at times what a particular State considers to be reasonable data security practices to avoid potential liability. Whether you are a CEO or privacy manager, States expect that privacy and data security awareness is interwoven into the fabric of a business’s culture. Reviewing the injunctive terms achieved in these settlements and others can be instructive in understanding AG expectations for data security practices and managing risk.

Carnival Cruise Brings Multistate Data Breach into Port
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/carnival-cruise-brings-multistate-data-breach-into-port
Tue, 28 Jun 2022 04:57:16 -0400

Even as states continue to pass comprehensive privacy laws, Attorneys General remain active in enforcing their data breach laws and using their deceptive trade practice authority in the privacy space. Just last week, 46 State AGs signed on to a settlement, in the form of an Assurance of Voluntary Compliance, with international cruise corporation Carnival over its 2019 data breach. The breach of employee email accounts purportedly exposed sensitive personal information contained in email contents, thereby affecting consumers in those states. The payment to the states totals $1.25 million.

While this settlement joins a long list of AG privacy cases, it serves as a useful roadmap for companies wishing to stay on top of AG expectations for data security, and the types of enforcement terms to expect if you suffer a breach.

Under the agreement, Carnival must comply with state laws prohibiting unfair and deceptive trade practices, as well as with state data security and breach notification laws, specifically in connection with securing Personal Information (as defined by state statutes) against Security Incidents, defined as confirmed unauthorized access to or acquisition of a consumer’s personal information owned, licensed, or maintained by Carnival. Carnival also agrees to comply with consumer protection acts in its representations regarding the privacy and security of personal information.

Within 180 days of the effective date, Carnival must maintain a comprehensive information security program appropriate to the size and complexity of its operations, the nature and scope of its activities, and the sensitivity of the personal information at issue. Carnival must employ a Chief Information Security Officer and must provide security awareness and privacy training to all personnel with access to the network or responsibility for personal information, both annually and upon hiring. Carnival also must update its written incident response and data breach notification plan to address preparation, detection and analysis, containment, eradication, and recovery workflows.

Carnival must further develop, implement, and maintain personal information retention policies; use email filtering and protection; establish encryption policies; and maintain an appropriate system to collect logs and monitor network activity, with policies to analyze security events in real time. Carnival must implement appropriate policies for account audits, password protection, multifactor authentication for remote access, firewalls, and penetration testing, and must conduct an annual risk assessment. The company also must obtain a risk assessment from a third party within 18 months of the effective date and provide a copy to the State of Washington for review.

While several of the specific provisions expire after 5 years, it should be apparent that State AGs will demand detailed compliance programs and continued oversight if they find a lapse in security practices. Ensuring you have a detailed security program now and continually seeking ways to enhance your security practices are valuable ways to minimize AG scrutiny later. Note also that some of the injunctive terms are broadly applicable even beyond the specific incident in question, which potentially can subject the company to heightened penalties should there be another, albeit unrelated, security incident.

* * * *

Join us tomorrow for State Attorneys General 102. This short 30-minute webinar picks up where State Attorneys General 101 left off and answers a number of questions regarding:
  • Pre-suit/investigation notice requirements for Attorneys General
  • Additional information on the scope of Attorneys General investigative authority and how to challenge an investigation
  • Consumer Complaints: differences among the AGs on handling and use
Register here

Day in the Life of a Chief Privacy Officer https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/day-in-the-life-of-a-chief-privacy-officer https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/day-in-the-life-of-a-chief-privacy-officer Thu, 17 Feb 2022 00:30:48 -0500 On this special episode, Privacy and Information Security practice chair Alysa Hutnik chats with Shana Gillers, TransUnion’s Chief Privacy Officer. Alysa and Shana discuss the journey to becoming a chief privacy officer, hot topics, and what it takes to stay on top of your game in privacy today.

Watch a video version here or the audio version here.

Shana Gillers

Shoshana Gillers has served as TransUnion’s Chief Privacy Officer since September 2019. In this role Ms. Gillers oversees compliance with privacy laws across TransUnion’s global footprint and promotes a culture of responsible data stewardship.

Prior to joining TransUnion, Ms. Gillers spent four years at JPMorgan Chase, ultimately serving as Vice President and Assistant General Counsel, Responsible Banking, Data and Privacy. Previously, she served as a federal prosecutor for eight years at the U.S. Attorney’s Office in Chicago, and as a litigator for four years at WilmerHale in New York. Ms. Gillers clerked for the Hon. Robert D. Sack on the U.S. Court of Appeals for the Second Circuit and for the Hon. Aharon Barak on the Supreme Court of Israel.

Ms. Gillers received a B.A. from Columbia University, summa cum laude, and a J.D. from Yale Law School.

Alysa Z. Hutnik

Alysa chairs Kelley Drye’s Privacy and Information Security practice and delivers comprehensive expertise in all areas of privacy, data security and advertising law. Her experience ranges from strategic consumer protection oriented due diligence and compliance counseling to defending clients in FTC and state attorneys general investigations and competitor disputes.

Prior to joining the firm, Alysa was a federal clerk for the Honorable Joseph R. Goodwin, United States District Judge, Southern District of West Virginia.

Alysa received a B.A. from Haverford College, and a J.D. from the University of Maryland Carey School of Law.

Upcoming Webinars https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/upcoming-webinars https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/upcoming-webinars Tue, 25 Jan 2022 08:26:08 -0500 Join Kelley Drye this week for:

Privacy Priorities for 2022: Legal and Tech Developments to Track and Tackle Wednesday, January 26 at 4:00pm ET/ 1:00pm PT

Privacy compliance is a daunting task, particularly when the legal and tech landscape keeps shifting. Many companies are still updating their privacy compliance programs to address CCPA requirements, FTC warnings on avoiding dark patterns and unauthorized data sharing, and tech platform disclosure, consent, and data sharing changes. But in the not too distant future, new privacy laws in California, Colorado, and Virginia also will go into effect. Addressing these expanded obligations requires budget, prioritizing action items, and keeping up to date on privacy technology innovations that can help make some tasks more scalable.

This joint webinar with Kelley Drye’s Privacy Team and Ketch, a data control and programmatic privacy platform, will highlight key legal and self-regulatory developments to monitor, along with practical considerations for how to tackle these changes over the course of the year. This will be the first in a series of practical privacy webinars by Kelley Drye to help you keep up with key developments, ask questions, and suggest topics that you would like to see covered in greater depth.

Register Here

State Attorney General Consumer Protection Priorities for 2022 Thursday, January 27 at 1:00pm ET

Consumer protection enforcement efforts are expected to increase dramatically this year. Recent pronouncements from State Attorneys General around the country place privacy, big tech and the misuse of algorithms, and basic advertising-related fraud under particular scrutiny.

Please join Kelley Drye State Attorneys General practice Co-Chair Paul Singer, Advertising and Marketing Partner Gonzalo Mon, Privacy Partner Laura VanDruff, and Senior Associate Beth Chun for discussion and practical information on these and other state consumer protection, advertising, and privacy enforcement trends.

Register Here
Ad Law Access Podcast and Advertising and Privacy Law Resource Center On Demand
The award-winning Ad Law Access blog and podcast will have Data Privacy Week content you can use all week long. Find the blog here and the podcast wherever you get your audio.

Subscribe to the Ad Law News and Views newsletter here and our Ad Law Access blog here.

Credential Stuffing: Cyber Best Practices from NY Attorney General’s Latest Report https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/credential-stuffing-cyber-best-practices-from-ny-attorney-generals-latest-report https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/credential-stuffing-cyber-best-practices-from-ny-attorney-generals-latest-report Thu, 13 Jan 2022 21:16:21 -0500 In guidance released last week, the New York State Office of the Attorney General urged businesses to incorporate safeguards to detect and prevent credential-stuffing attacks in their data security programs. The guidance stemmed from the AG’s finding that 1.1 million customer accounts at “well-known” companies appeared to have been compromised in credential-stuffing attacks.

Credential stuffing refers to an attack where a hacker uses stolen usernames and passwords, or “credentials,” from an online service that has suffered a data breach to access other online services, according to the AG’s report. Attackers exploit the habit of some consumers to reuse their passwords across multiple websites. Attackers may also use automated software to initiate login attempts using stolen credentials from the dark web.

“Businesses have the responsibility to take appropriate action to protect their customers’ online accounts and this guide lays out critical safeguards companies can use in the fight against credential stuffing,” New York State Attorney General Letitia James wrote in a press release accompanying the report.

Specifically, the AG report states that data security programs should incorporate safeguards against the threat of credential stuffing in four areas: (1) defending against credential-stuffing attacks, (2) detecting a credential stuffing breach, (3) preventing fraud and misuse of customer information, and (4) responding to a credential stuffing incident.

The AG recommends that businesses implement the following safeguards to mitigate the risk of successful credential-stuffing attacks. Which safeguards are appropriate to a business will depend on the size and complexity of the business, the volume and sensitivity of customer information it maintains, the risk and scale of injury, and the software and systems already in use.

Defending Against a Credential-Stuffing Attack

  • Bot Detection – Businesses can leverage bot detection software to distinguish automated login attempts from regular “human” login attempts, and to block malicious bots. The AG noted, however, that in its view CAPTCHA systems have been less effective than bot detection software.
  • Multi-Factor Authentication – Multi-factor authentication creates an additional hurdle to logging in to an account by requiring users not only to have valid credentials but also to possess a device that issues authentication codes or to complete a biometric authentication procedure.
  • Passwordless Authentication – Passwordless authentication allows a user to access their account using an authentication procedure, such as a one-time code or emailed link.
  • Web Application Firewalls (WAF) – WAFs that guard against malicious traffic can also include safeguards that protect against credential stuffing. These safeguards include rate limiting, which blocks or throttles repeated log in attempts; HTTP request analysis, which analyzes header information and other metadata for indicators of malicious traffic; and IP address blacklists, which block IP addresses known to have engaged in attacks.
  • Preventing Reuse of Compromised Passwords – Businesses can implement procedures to prevent customers from reusing passwords that have been previously compromised, using vendors that compile such credentials.
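
To make the last safeguard concrete, here is a minimal sketch (ours, not the AG’s; the report does not prescribe any particular vendor or method) of screening a candidate password against a corpus of known-compromised credentials. It uses the publicly documented Pwned Passwords “range” API, a k-anonymity scheme under which only the first five characters of the password’s SHA-1 hash ever leave your systems:

    import hashlib
    import urllib.request

    def is_compromised(password: str) -> bool:
        """Return True if the password appears in a known breach corpus.
        Only the first five hex characters of the SHA-1 digest are sent
        to the service (k-anonymity), never the password itself."""
        digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
        prefix, suffix = digest[:5], digest[5:]
        req = urllib.request.Request(
            f"https://api.pwnedpasswords.com/range/{prefix}",
            headers={"User-Agent": "password-screen-sketch"},  # identify the client
        )
        with urllib.request.urlopen(req) as resp:
            lines = resp.read().decode("utf-8").splitlines()
        # Each response line is "<35-char hash suffix>:<breach count>".
        return any(line.split(":")[0] == suffix for line in lines)

A production version would add caching, an outbound timeout and a fallback if the service is unreachable, and a user-facing prompt to choose a different password at enrollment or reset.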

Detecting a Credential Stuffing Breach

  • Monitoring Customer Activity – Businesses may monitor indicators of fraudulent activity to protect customer accounts.
  • Monitoring Customer Reports of Fraud – Businesses may also review reports from customers about unauthorized transactions or account access.
  • Notice of Account Activity – Businesses may notify customers of unusual account activity to help the customer identify unauthorized activity and report it to the business.
  • Threat Intelligence – Businesses may utilize threat intelligence firms that monitor dark web activity for discussion of stolen credentials or accounts.
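
As a simple illustration of the kind of activity monitoring the report describes (the AG does not mandate any particular technique, and the window and threshold below are hypothetical), a sliding-window counter over failed logins can surface the high-volume, automated failures that typically accompany a credential-stuffing attack:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 300   # hypothetical five-minute window
    MAX_FAILURES = 20      # hypothetical alert threshold per source IP

    _failures = defaultdict(deque)  # source IP -> recent failure timestamps

    def record_failed_login(source_ip: str) -> bool:
        """Record one failed login; return True when the source IP exceeds
        the failure threshold within the window, a common indicator of
        automated credential stuffing."""
        now = time.time()
        q = _failures[source_ip]
        q.append(now)
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()  # discard attempts that fell out of the window
        return len(q) > MAX_FAILURES

A real deployment would key on more than source IP (user agent, geography, per-account failure rates) and route alerts into the incident response process described below.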

Preventing Fraud and Misuse of Customer Information

  • Re-authentication at the Time of Purchase – To prevent attackers from leveraging stolen accounts to make a purchase, the AG states that businesses may require users to re-authenticate stored payment information. For example, the user may be required to re-enter their credit card number or CVV code, or the company might send the user an authentication code.
  • Third Party Fraud Detection – Companies may use third-party services that identify suspicious or fraudulent transactions.
  • Mitigating Social Engineering – Because some attackers may try to convince customer service personnel to authenticate their accounts, companies can develop policies that address social engineering and train relevant personnel to recognize such attacks.
  • Preventing Gift Card Theft – The AG suggests that transferring gift cards between customer accounts and transferring funds between gift cards should be restricted or require re-authentication; and that companies should only display the last four digits of a gift card number.

Incident Response

  • Investigation – Where companies suspect an attack, the new guidance states that companies should conduct a timely investigation to determine, at a minimum, “whether customer accounts were accessed without authorization, and, if so, which accounts were impacted, and how attackers were able to bypass existing safeguards.”
  • Remediation – Companies should take action to remediate credential-stuffing attacks, according to the AG’s guidance. The AG suggests blocking attackers’ continued access to the accounts, resetting passwords, and freezing relevant accounts, where appropriate.
  • Notifying Customers – The AG states that businesses should “quickly notify each customer whose account has been, or is reasonably likely to have been, accessed without authorization.” The AG’s report states that customer notice should include the following elements:
    • Whether the particular customer’s account was accessed without authorization;
    • The timing of the attack;
    • What customer information was accessed; and
    • What actions have been taken to protect the customer.

Finally, given the evolving nature of credential stuffing-related threats, the AG warns that businesses should continually evaluate the effectiveness of applicable controls and implement new procedures where appropriate.

* * *

State AGs don’t typically issue guidance like this, so it may be a sign that New York plans to continue targeting businesses that fail to follow the guidance and thus allegedly fail to adequately protect against credential stuffing. While other states aren’t bound by this New York-specific guidance, other State AG offices are likely to take notice and discuss this unusual measure through their standing working groups. As a result, some states may follow suit and launch their own credential stuffing investigations.

State and federal regulators are active in this space, investigating companies’ compliance with state UDAP laws, the FTC Act, and the FCRA Red Flags Rule. Including the possibility of credential stuffing in your data security risk assessment and policy review may reduce your regulatory exposure.

Please join us for Privacy Priorities for 2022: Legal and Tech Developments to Track and Tackle, a joint webinar between Kelley Drye’s Privacy Team and Ketch, a data control and programmatic privacy platform. This Data Privacy Week webinar will highlight key legal and self-regulatory developments to monitor, along with practical considerations for how to tackle these changes over the course of the year. This will be the first in a series of practical privacy webinars by Kelley Drye to help you keep up with key developments, ask questions, and suggest topics that you would like to see covered in greater depth. Register here.

Also please join us for State Attorney General Consumer Protection Priorities for 2022. This webinar will provide discussion and practical information on the topics mentioned above and other state consumer protection, advertising, and privacy enforcement trends. Register here.

FTC Advises Companies to Remediate Log4j Vulnerability https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-advises-companies-to-remediate-log4j-vulnerability https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-advises-companies-to-remediate-log4j-vulnerability Wed, 12 Jan 2022 14:19:10 -0500 In an unusual warning to companies running Java applications with Log4j in their environments, the Federal Trade Commission (FTC) recently cautioned that it “intends to use its full legal authority to pursue companies that fail to take reasonable steps to protect consumer data from exposure as a result of Log4j[] or similar known vulnerabilities in the future.” All companies with consumer information should take heed, assessing information security risks on their systems and devices and implementing policies to guard against foreseeable risks.

What prompted the FTC’s action?

The Apache Log4j software library is a ubiquitous Java-based logging utility. In December, the Cybersecurity and Infrastructure Security Agency (CISA) cautioned that a critical vulnerability in this popular open-source software rendered “hundreds of millions” of internet-connected devices vulnerable to attack. CISA’s Director advised that the software’s ubiquity makes the scale and potential impact of the vulnerability significant. CISA gave federal agencies until December 24, 2021, to patch the vulnerability or implement other mitigating measures.

A variety of executive branch agencies, including CISA and the White House’s National Cyber Director, promoted the FTC’s warning on social media. The FTC’s warning can be viewed as reiterating the FTC’s longstanding approach to data security (that companies must implement reasonable steps to protect consumer information from unauthorized disclosure or misuse) while simultaneously suggesting that a failure to protect against the Log4j vulnerability is per se unreasonable. The warning references the FTC’s $700 million 2019 settlement with Equifax Inc., in which the FTC alleged among other things that the company’s failure to patch a known vulnerability contributed to exposure of millions of consumers’ personal information. The FTC also notes that it is critical for companies and their vendors who rely on Log4j to act now, “in order to reduce the likelihood of harm to consumers, and to avoid FTC legal action.”
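
By way of illustration only (the FTC’s warning does not prescribe a method), the first remediation step for most teams was simply finding where Log4j lives. The sketch below inventories log4j-core jars by filename for the flaw tracked as CVE-2021-44228 and commonly called “Log4Shell.” The patched-version floor shown is illustrative and should be checked against current Apache and CISA advisories, and filename matching misses “shaded” copies bundled inside other jars, so it complements rather than replaces a software bill of materials or a dedicated scanner:

    import re
    from pathlib import Path

    PATCHED = (2, 17, 1)  # illustrative floor; confirm against current advisories
    JAR_NAME = re.compile(r"log4j-core-(\d+)\.(\d+)\.(\d+)\.jar$")

    def find_vulnerable_jars(root: str = ".") -> None:
        """Flag log4j-core jars older than PATCHED anywhere under root."""
        for jar in Path(root).rglob("log4j-core-*.jar"):
            match = JAR_NAME.search(jar.name)
            if match and tuple(map(int, match.groups())) < PATCHED:
                print(f"Vulnerable: {jar} "
                      f"(upgrade to {'.'.join(map(str, PATCHED))} or later)")

    if __name__ == "__main__":
        find_vulnerable_jars("/opt")  # hypothetical deployment root

Upgrading the library (or, per Apache’s mitigation guidance, removing the JndiLookup class where an upgrade is impossible) is the actual remediation; a scan like this only tells you where to do it.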

Legal context

As we’ve addressed here, there is no single federal data security law in the United States requiring companies across the marketplace to implement a uniform set of data security measures. Nonetheless, the FTC’s warning (which goes further than prior FTC business guidance like Start with Security or Stick with Security) asserts that existing laws, including the FTC Act and the Gramm-Leach-Bliley Act, create a duty for companies to take reasonable steps to mitigate known software vulnerabilities.

Why does this matter for companies with consumer data?

The FTC’s warning reaffirms that data security enforcement remains a priority for the current Commission’s leadership. In addition, the FTC post relays the Commission’s intent to consider the “broader set of structural issues” related to “open-source services,” which it considers to be among the “root issues that endanger user security.” This seems to be a callback to Chair Khan’s strategic vision for approaching competition and consumer protection “holistically” and focusing on what the Commission regards to be “root causes” of harm.

The FTC’s admonitions remind every company with consumer information to assess the risks to that information in their environments and in vendor environments and implement reasonable policies to guard against those risks.

* * *

Please join us for State Attorney General Consumer Protection Priorities for 2022. This webinar will provide discussion and practical information on the topics mentioned above and other state consumer protection, advertising, and privacy enforcement trends. Register here.

Also join us for Privacy Priorities for 2022: Legal and Tech Developments to Track and Tackle, a joint webinar between Kelley Drye’s Privacy Team and Ketch, a data control and programmatic privacy platform. This Data Privacy Week webinar will highlight key legal and self-regulatory developments to monitor, along with practical considerations for how to tackle these changes over the course of the year. This will be the first in a series of practical privacy webinars by Kelley Drye to help you keep up with key developments, ask questions, and suggest topics that you would like to see covered in greater depth. Register here.

Some fireworks at Bedoya’s Senate confirmation hearing, but confirmation still seems likely https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/some-fireworks-at-bedoyas-senate-confirmation-hearing-but-confirmation-still-seems-likely https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/some-fireworks-at-bedoyas-senate-confirmation-hearing-but-confirmation-still-seems-likely Thu, 18 Nov 2021 13:49:55 -0500 On November 17, the Senate Commerce Committee held its eagerly awaited hearing on the nomination of Alvaro Bedoya, a data privacy academic from Georgetown Law, to be FTC Commissioner. Bedoya is slated to replace Rohit Chopra, who departed the agency last month to become Director of the CFPB, and Bedoya’s appointment would once again give the Democrats a voting majority. In the run-up to his hearing, some have wondered: Can we expect Bedoya to provide Chair Khan with a reliable third vote for her agenda, or will he bring a more bipartisan approach to the agency? From his answers and demeanor at the hearing, the answer is probably…both.

First, a little table-setting: Bedoya’s nomination was considered along with three others – Jessica Rosenworcel for FCC Chair and two nominees for the Department of Commerce. The hearing was well-attended by Committee members, who directed the majority of their questions to Rosenworcel. (Yes, net neutrality, broadband access, and the “homework gap” all got more attention than privacy.) All four current FTC Commissioners attended the hearing in person, in a bipartisan show of support for Bedoya, though Bedoya attended remotely due to a recent exposure to COVID.

Here are some takeaways from Bedoya’s portion of the hearing.

  • He appears likely to be confirmed, even if largely along party lines. Although Senator Wicker made a reference to Bedoya’s “strident” views and Senators Lee, Cruz, and Sullivan slammed his “extremist” tweets (see below), most of the questions (from 18 Senators!) related to Bedoya’s area of expertise (privacy), where there is more alignment between the parties than in other areas. He handled the questions well, and repeatedly expressed support for collaboration and bipartisanship (e.g., specifically mentioning that he wants to work closely with Commissioner Wilson on privacy). Democrats have the votes (in the Committee and on the Senate floor), even if they ultimately have to call in V.P. Harris to break a tie.
  • He spoke about his nomination and the issues in personal and emotional terms. Bedoya highlighted that he and his family were welcomed into this country 34 years ago. He talked about his experience as a Senate staffer, learning about the terror and harm caused by stalking apps from a shelter for battered women. He realized then and believes now that “privacy is not just about data, it’s about people.” His goal as a Commissioner would be to make sure the FTC protects people, and to help both consumers and businesses manage the multiple crises facing the country – a COVID crisis, a privacy crisis, and a small business crisis.
  • He appears likely to vote with the majority on many (or most) issues. No big surprise here, but when asked his views about various issues, he consistently supported positions that Khan, Slaughter, and (his predecessor) Chopra have supported – federal privacy legislation, Magnuson-Moss privacy rulemaking if Congress doesn’t act, pushing back against the “unprecedented consolidation” that is forcing small businesses to close, streamlining the FTC’s rulemaking and subpoena processes, reducing the power of the platforms, and reining in tracking technologies like facial recognition. As to the latter, he said he would not support banning facial recognition technologies altogether, since some applications assist with benefits like public safety and healthcare. However, he would support banning facial recognition technologies that are hidden, that lack consent, or that collect, use, and share data without limits.
  • He’s a real-live privacy expert. He clearly has the credentials, starting with his work as a Senate staffer and continuing through his years at Georgetown Law as a professor and head of a privacy think tank. But he also quickly and confidently answered all questions related to privacy – from the need for privacy legislation generally, to his views on Senator Schatz’s “duty of loyalty” and Senator Markey’s proposal to amend COPPA, to the lines he would draw on facial recognition (see above).
  • He wrote some controversial tweets, and a number of Republicans seem poised to vote “no” on his confirmation. Senator Sullivan cited a tweet from Bedoya calling the 2016 Republican convention a “White Supremacist rally.” Cruz cited tweets about ICE as a “domestic surveillance agency” and a retweet involving critical race theory and white supremacy. He also called Bedoya a “left wing activist, bomb thrower, extremist, and provocateur.” Lee ran through a series of supposedly “yes or no” questions in rapid succession, and accused Bedoya of being evasive when he tried to qualify his responses. And Wicker referred to Bedoya’s “strident” views, as noted above. As to the tweets, Bedoya apologized, saying that it was “rhetoric” and that he would put aside any partisan views if he became Commissioner. However, these Senators (and perhaps other Republicans) seem poised to vote “no” on Bedoya’s confirmation, and some have said they plan to place a “hold” on the process, which could slow it down.
  • If confirmed, he could help reduce tensions at the Commission. With acrimony among the Commissioners currently at unprecedented levels (see our recent post here), adding Bedoya to the mix could help reduce the tensions (despite the tweets). He’s known to be collegial, he worked across the aisle as a Senate staffer, he repeatedly invoked bipartisanship at the hearing, and all of the sitting Commissioners (Democrats and Republicans) showed up at the hearing to support him. That augurs well for the dynamics at the Commission, even if the votes remain split along party lines.

We will continue to monitor progress on Bedoya’s nomination and post updates as they occur.


Subscribe to Kelley Drye's Ad Law Access blog here www.adlawaccess.com/subscribe

Hope Emerges at Senate Data Security Hearing – But Will Congress Grab the Brass Ring? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/hope-emerges-at-senate-data-security-hearing-but-will-congress-grab-the-brass-ring https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/hope-emerges-at-senate-data-security-hearing-but-will-congress-grab-the-brass-ring Sun, 10 Oct 2021 10:25:23 -0400 On October 6, 2021, the Senate Commerce Committee conducted its second in a series of hearings dedicated to consumer privacy and data, this time addressing Data Security. Similar to last week’s privacy hearing, the witnesses and Senators appeared to agree that federal data security standards – whether as part of privacy legislation or on their own – are urgently needed. If there were to be consensus around legislative principles, the hearing provides clues about what a compromise might look like.

Prepared Statements. In their opening statements, the witnesses emphasized the need for minimum standards governing data security.

  • James E. Lee, Chief Operating Officer of the Identity Theft Resource Center, explained that without minimum requirements, companies lack sufficient incentives to strengthen their data security practices to protect consumer data. Lee also advocated for more aggressive federal enforcement rather than the patchwork of state actions, which, he said, produce disparate impacts for the same conduct.
  • Jessica Rich, former Director of the FTC’s Bureau of Consumer Protection and counsel at Kelley Drye, emphasized that current laws do not establish clear standards for data security and accountability. She advocated for a process-based approach to prevent the law from being outpaced by evolving technologies and to ensure that it accommodates the wide range of business models and data practices across the economy. Among her recommendations, Rich suggested that Congress provide the FTC with jurisdiction over nonprofits and common carriers and authority to seek penalties for first-time violations.
  • Edward W. Felten, former Deputy U.S. Chief Technology Officer, former Chief Technologist of the FTC’s Bureau of Consumer Protection, and current Professor of Computer Science and Public Affairs at Princeton University, focused on the need to strengthen the FTC’s technological capabilities, including increasing the budget to hire more technologists. Notably, Felten advocated for more prescriptive requirements in data security legislation such as requiring companies to store and transmit sensitive consumer data in encrypted form and prohibiting companies from knowingly shipping devices with serious security vulnerabilities.
  • Kate Tummarello, Executive Director at Engine, a non-profit organization representing startups, addressed the importance of data security for most startups. Tummarello advocated for FTC standards or guidance with flexible options. Cautioning against overburdening startups, Tummarello explained that newer companies take data security seriously because they do not have the name recognition or relationships with consumers that larger companies may have, and a single breach could be extremely disruptive. Additionally, Tummarello highlighted that the patchwork of state laws provides inconsistent and unclear data security guidance and imposes high compliance costs.

Discussing a Federal Data Security Bill

  • Preemption. Witnesses agreed that a preemptive federal law does not necessarily mean a weaker law. Rich offered a middle ground, supporting preemption, but stating the law should fully empower the state AGs to enforce it.
  • Private Right of Action. Tummarello expressed concern that lawsuits across the country would contribute to the “patchwork” of laws that increase compliance costs. However, if a private right of action were necessary, she would support only a narrow private right of action with sufficient notice and guardrails, particularly to protect startups vulnerable to bad faith litigation. Lee demurred on whether a private right of action was needed but emphasized that consumers need to be protected no matter what state they live in. Rich stated that if the legislation is strong enough – with robust protections and remedies, full enforcement authority for the states, and significant resources for the FTC – it will protect consumers without the need for a private right of action. However, Rich also described “middle grounds” that could bridge the divide.
  • Sensitive Data. Although there were some questions about what constitutes sensitive data, the witnesses agreed that both biometric data and data about children should have heightened protections. Felten addressed concerns regarding artificial intelligence and facial recognition. Lee discussed the importance of protecting biometric data because it is permanent and cannot be changed – unlike a credit card number – if it is compromised.
  • Process-Based Approach. Rich emphasized the need for a “scalable” federal law that takes a process-based approach so that it does not quickly become obsolete. She added that the FTC could issue more detailed guidance on a regular basis to highlight particular technologies and safeguards that companies should consider. In contrast, Felten supported specific safeguards that the FTC would require through rulemaking, and Tummarello supported an FTC rule or guidance that would give companies a “menu” of safeguards to consider.
  • Inclusion with Data Privacy Bill. All witnesses supported including data security provisions into a federal privacy bill, but Rich stated that a data security law could prevent considerable consumer harm as a stand-alone measure.

FTC’s Role and Enforcement

  • FTC as Enforcer. Similar to last week’s hearing, all witnesses agreed that the FTC was the agency best equipped to oversee and enforce a federal data security law.
  • Resources Needed. Felten noted that the FTC only has about ten technologists on staff, but could use 50-60 people in technologist roles to supplement its enforcement efforts. Rich added that technologists need a career path at the FTC, and that the FTC should reexamine the complicated ethics rules governing what technologists may do after they leave the FTC’s employment.
  • First-Time Penalties. All witnesses agreed that the FTC should be able to seek penalties for first-time violations. Tummarello, however, said that she supports first-time penalties only if there are clear rules of the road.

Overall, the hearing made clear that there are more areas of agreement than disagreement. The key questions are: (1) Can Congress resolve differences related to a private right of action, whether by ensuring strong protections without it or by compromising on a narrow private right of action? (2) Will Congress be willing to pass federal data security legislation on its own? We will continue to monitor developments on this issue and provide updates as they occur.

“Not Outgunned, Just Outmanned” (For Now): Senate Hearing on Privacy Law Addresses Under-resourced FTC https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/not-outgunned-just-outmanned-for-now-senate-hearing-on-privacy-law-addresses-under-resourced-ftc https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/not-outgunned-just-outmanned-for-now-senate-hearing-on-privacy-law-addresses-under-resourced-ftc Fri, 01 Oct 2021 08:45:07 -0400 On September 29, 2021, the Senate Commerce Subcommittee held a hearing titled Protecting Consumer Privacy. The senators addressed the potential $1 billion earmarked to strengthen the FTC’s privacy work, the future of a federal privacy and data protection law, and a myriad of other privacy-related topics, such as children’s privacy.

Prepared Statements. In their opening testimonies, the witnesses emphasized different types of needs for the FTC.

  • David Vladeck, a former Director of the FTC Bureau of Consumer Protection, strongly advocated for a federal privacy law and additional funding for the FTC to support increased efforts on technology-centered consumer protection enforcement. In his remarks, Vladeck noted that the FTC has been wholly understaffed and underfunded for forty years, despite the agency’s ever increasing responsibilities and the complexity of issues it now faces. Additionally, Vladeck emphasized the need to increase the FTC’s enforcement powers by giving the FTC rulemaking authority under the APA and civil penalty authority.
  • Morgan Reed, the president of The App Association, focused more on the need for a federal privacy law to reduce the compliance costs for small businesses. He reiterated that the patchwork of state laws increases risk and costs for small businesses.
  • Maureen Ohlhausen, a former Acting FTC Chairman and Commissioner, shifted the conversation from funding for the FTC to the importance of a federal privacy law. She noted that “the FTC lacks explicit authority to enforce statutory privacy requirements or promulgate privacy regulations,” and that a federal privacy law should address this gap, allowing for enforcement, along with state attorneys general.
  • Ashkan Soltani, a former FTC Chief Technologist, primarily concentrated on the urgent need for expertise at the FTC. He emphasized the importance of hiring technologists and experts, and of paying them competitive rates to retain talent. He noted that the FTC is understaffed to handle litigation matters or to monitor compliance with consent orders, particularly those that require technical fluency.

Discussing the Federal Privacy Bill. The senators appeared to be in consensus that there is a need for a federal privacy law. Senator Wicker called on the Biden Administration to provide a liaison to Congress to prioritize the enactment of a law.

  • Right to Cure. Reed was adamant that a right to cure provision be written into the bill to protect small businesses from being punitively fined for unintentional mistakes such as not responding to an email within 30 days.
  • Private Right of Action. The witnesses went back and forth on the correct approach to a private right of action. While Soltani supported a private right of action as a means to “make up for the concern that there’s not enough enforcement capacity,” Ohlhausen was concerned that a private right of action would result not in consumer redress, but rather in attorney’s fees. Reed stated that he preferred injunctive relief as a type of private right of action. Similarly, Soltani noted that in his experience, core behavior changes come not from fines, but from injunctions and restrictions imposed on the business.
  • Preemption. Vladeck, Reed, and Ohlhausen supported federal preemption. Soltani agreed, but argued that a federal privacy law should be only a floor, not a ceiling. In other words, a federal privacy law should preempt less rigorous laws to set a baseline standard, but states could enact additional measures and add further protections for their constituents.
  • Carve-out. The witnesses went back and forth on whether size of business should factor into whether an entity would be covered by the bill. Vladeck emphasized that small businesses can create big harms; therefore, the legislation needs to be focused on consumer harm rather than the size of the company. Reed agreed, but reiterated the need for a right to cure for small businesses.

Funding for the FTC. Senators focused on whether the FTC needs $1 billion to achieve its goal of protecting consumers. Vladeck wholeheartedly agreed and said that an additional $100 million a year would be a good start for the FTC. For example, in the recent Google litigation, Vladeck estimated that Google had 1,000 privacy attorneys, whereas the FTC had fewer than 100. Vladeck noted that the funding would be earmarked for hiring more attorneys, engineers, and technologists, as well as for setting up a new bureau of privacy.

Children’s Privacy. The witnesses received several questions on protecting children’s privacy in the aftermath of reports on how social media affects children’s mental health. Vladeck specifically advocated for lowering the scienter standard the FTC must meet to show that a developer knew its technology was tracking children. This mirrors the EU’s “constructive knowledge” standard for children’s privacy. Additionally, Vladeck suggested eliminating COPPA’s safe harbor program and rethinking the age limit. All witnesses agreed that children are vulnerable to targeted ads. In response to Senator Markey’s concern for children’s privacy, all witnesses said they would approve of another children’s privacy bill if Congress could not enact a sweeping data protection and privacy law for adults.

Ad Law Access Podcast – Data Security 101 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/podcast-data-security-101 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/podcast-data-security-101 Mon, 11 Jan 2021 08:57:10 -0500

Our increased reliance on the Internet to conduct our daily affairs has made data security that much more important. On another 101 edition of the Ad Law Access podcast, we cover five key points businesses should keep in mind as they continue to refine their data security practices based on FTC settlements and guidance.

Listen on Apple, Spotify, Google Podcasts, SoundCloud, via your smart speaker, or wherever you get your podcasts.

For more information on data security, privacy, and other topics, visit:

Advertising and Privacy Law Resource Center

Ad Law Access Podcast – Data Breaches 101 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ad-law-access-podcast-data-breaches-101 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ad-law-access-podcast-data-breaches-101 Tue, 01 Dec 2020 16:48:18 -0500 On a timely new episode of Kelley Drye’s Ad Law Access podcast, Privacy and Data Security practice chair Alysa Hutnik and partner Aaron Burstein provide 101-level tips on how to manage the clock and begin to deal with data breaches when they happen.

For more information on data breaches, visit:

Advertising and Privacy Law Resource Center

Zoom Agrees to Settle FTC’s Data Security Charges https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/zoom-agrees-to-settle-ftcs-data-security-charges https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/zoom-agrees-to-settle-ftcs-data-security-charges Tue, 10 Nov 2020 19:24:59 -0500 Seven months after being called upon by members of Congress to investigate Zoom’s data security practices, a divided FTC announced on November 9 a settlement with the videoconferencing platform.

The FTC’s five-count administrative complaint alleges that Zoom deceived users about several of its security features and harmed users by circumventing security and privacy controls provided by their operating systems and browsers. The proposed consent order requires Zoom to make changes to its data security practices, implement a comprehensive information security program, and obtain independent assessments of its program for 20 years after entry of the order – but does not require the company to pay monetary relief. In separate dissents, Commissioners Chopra and Slaughter argue that the proposed relief does not go far enough.

Companies watching the FTC’s data security enforcement trends will want to take note of two main takeaways: claims about the strength of security protections in products and services warrant close scrutiny, and software deployments that weaken or circumvent other security controls on users’ devices will likely receive a tough reception from the FTC.

Allegations in the FTC’s Complaint

Deception. Although Zoom has grown rapidly during the coronavirus pandemic, much of the FTC’s complaint focuses on conduct that predates the massive shift to videoconferencing as a substitute for in-person family, business, social, and religious gatherings. Specifically, the FTC alleges that Zoom misrepresented several features of its service through blog posts, user documentation, and other publicly available statements:

  • End-to-end encryption: Zoom asserted that it used end-to-end encryption (i.e., encryption that only the parties to a communication can decipher) but did not disclose that, for most versions of its service, Zoom stored encryption keys that would also allow Zoom to decrypt users’ communications.
  • Level of encryption: Zoom claimed to use 256-bit encryption keys but apparently used 128-bit keys (see the short sketch following this list).
  • Unencrypted storage: Zoom stored meeting recordings in unencrypted form for 60 days before moving them to encrypted storage.
  • Disguised updates: A software update billed as providing “minor bug fixes” did not disclose that it would install a web server on users’ devices.
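
For readers less familiar with the terminology in the first two allegations, the short sketch below (ours, not the FTC’s, and no reflection of Zoom’s actual implementation) shows what the claims reduce to in practice, using the widely used Python cryptography package: a “256-bit” key is 32 bytes of secret key material and a “128-bit” key only 16, and end-to-end encryption is a statement about who holds that key, whatever its length:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

    # "256-bit" vs. "128-bit" is simply the length of the secret key.
    key_128 = AESGCM.generate_key(bit_length=128)  # 16 bytes
    key_256 = AESGCM.generate_key(bit_length=256)  # 32 bytes
    assert len(key_128) == 16 and len(key_256) == 32

    # In a true end-to-end model, only the meeting participants hold key_256.
    # A relay server that sees just the nonce and ciphertext cannot recover
    # the plaintext; a server that also stores the key can.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key_256).encrypt(nonce, b"meeting audio frame", None)
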
Unfairness. In addition, the FTC alleges that Zoom unfairly harmed users’ privacy and security interests by installing a “secret” web server as part of a 2018 update to its app for Apple Mac computers. According to the complaint, this update worked around privacy and security protections in the Safari browser and exposed Zoom users to potential phishing, denial of service, and remote code execution vulnerabilities. The complaint notes that Zoom users share health, financial, proprietary and other sensitive information but does not describe actual breaches involving such information.

Proposed Order Provisions

The Zoom order is generally consistent with recent changes in FTC data security orders, which reflect the agency’s efforts to ensure that its orders are specific enough to be enforceable, set tighter standards for security program assessments, and impose requirements for managerial oversight and order compliance. Along these lines, key requirements in the Zoom order are as follows:

  • Comprehensive Information Security Program. Zoom must implement a comprehensive information security program that, at a minimum, meets 10 families of requirements, most of which consist of multiple sub-requirements.
  • Independent Assessments. Zoom must obtain independent security assessments every other year during the order’s 20-year term. Among other requirements, the assessor must identify the evidence obtained to support its conclusions and may not rely “primarily on assertions or attestations” by the company.
  • Annual Certifications. A “senior corporate manager” must file an annual certification stating that the company has met the requirements of the order and is not aware of any “material noncompliance” that has not been corrected or disclosed to the FTC.
  • Incident Reporting. Finally, Zoom must report to the FTC instances of unauthorized access to or acquisition of recorded or livestream video or audio content within 30 days of discovering such an incident, unless the incident affects fewer than 500 users or meets other exceptions.

Dissents: A Preview of the Next FTC?

Consistent with their dissents in a string of major privacy and data security cases (e.g., YouTube and Facebook), Commissioners Chopra and Slaughter criticize the Zoom settlement for falling short in the relief provided to consumers and the changes required in Zoom’s business practices.

Perhaps most significantly in light of the potential changes in store for the FTC under a Biden-Harris administration, Commissioners Chopra and Slaughter endorse a list of seven recommendations to “restore credibility” (in Commissioner Chopra’s words) and “improve the effectiveness” of the FTC’s enforcement efforts:

  1. Strengthen orders to emphasize more help for individual consumers and small businesses, rather than more paperwork.
  2. Investigate firms comprehensively across the FTC’s mission.
  3. Diversify the FTC’s investigative teams to increase technical rigor.
  4. Restate existing legal precedent into clear rules of the road and trigger monetary remedies for violations.
  5. Demonstrate greater willingness to pursue administrative and federal court litigation.
  6. Increase cooperation with international, federal, and state partners.
  7. Determine whether third-party assessments are effective.

With respect to Zoom in particular, Commissioner Slaughter argues that the company’s practices harmed consumers’ privacy interests and that a “more effective order” would require Zoom to address privacy and security risks in its services. Despite the greater specificity in the Zoom order compared to FTC data security orders of a few years ago, Commissioner Chopra criticizes this settlement as a “status quo approach” that does not provide for direct notice or relief for Zoom’s customers.

For more information on the FTC and other topics, visit:

Advertising and Privacy Law Resource Center

Tackling the Privacy, Data Security, and Employment Issues Related to Returning to Work https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/tackling-the-privacy-data-security-and-employment-issues-related-to-returning-to-work https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/tackling-the-privacy-data-security-and-employment-issues-related-to-returning-to-work Mon, 06 Jul 2020 23:49:25 -0400 Coronavirus testing and screening procedures are central to many companies’ return-to-work plans. Because testing and screening data is often sensitive and may help to determine whether individuals are allowed to work, companies need to be aware of the privacy and security risks of collecting this data and protect it appropriately. Failing to do so may lead to a backlash in the workplace, cause reputational damage, and invite scrutiny from regulators and plaintiffs’ attorneys.

We have created a checklist of general tips to help companies navigate return-to-work privacy and data security issues. In addition to designing COVID-19 testing and screening data collection programs that fit local and state reopening conditions, companies may also wish to consult key sources of federal guidance.

For more information on returning to work, COVID-19, and other topics, please visit: Advertising and Privacy Law Resource Center

District Court Affirms Need to Turn Over Data Breach Report https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/district-court-affirms-need-to-turn-over-data-breach-report https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/district-court-affirms-need-to-turn-over-data-breach-report Fri, 26 Jun 2020 16:37:39 -0400 Earlier this month, we offered our analysis and takeaways from a Magistrate Judge’s decision that defendant Capital One was required to produce a third-party data breach assessment report as part of ongoing consumer litigation. Available here. Not surprisingly, Capital One appealed that order. On June 25, 2020, District Court Judge Anthony Trenga affirmed the decision, ordering Capital One to produce the report.

Brief Recap of the Incident and Order

In November 2015, Capital One retained FireEye, Inc. d/b/a Mandiant (“Mandiant”) to provide support in case of a data breach or security incident. When a breach occurred in March 2019, Capital One’s outside counsel called on Mandiant. While they executed a new letter agreement, the analysis requested from Mandiant was the same as that outlined in the 2015 Scope of Work.

Several putative consumer class actions were filed and a multi-district litigation is currently pending in the Eastern District of Virginia, captioned In re Capital One Consumer Data Breach Litigation, Case No. 1:19-md-2915.

There was no real dispute that the Mandiant report qualified as relevant and responsive information; Capital One argued, however, that it was shielded from discovery by the attorney work product doctrine. Plaintiffs filed a motion to compel its production. On May 26, 2020, Magistrate Judge John Anderson granted Plaintiffs’ motion, finding that Capital One failed to meet its burden of establishing a valid privilege.

District Court Affirms

Capital One objected to the Magistrate Judge’s ruling and sought relief from the District Court Judge under Federal Rule of Civil Procedure 72(a). The Magistrate Judge’s decision was subject to evaluation under a “clearly erroneous or contrary to law” standard. The Court considered whether the order failed to apply or misapplied relevant statutes, case law, or procedure.

The District Court focused on whether the report was compiled “because of the prospect of litigation.” The Court questioned whether the prospect of litigation was “the driving force behind” the preparation of the Mandiant report. Despite retention by outside counsel, the Court found that Mandiant’s investigation would have been conducted, and report compiled, in materially the same way whether or not there was litigation or counsel involved. The Court also agreed with the Magistrate Judge that Capital One’s broad distribution showed that the Mandiant report “was significant for regulatory and business reasons” and underscored that business purpose.

The Court downplayed the prospect of potential litigation. The Court agreed with the Magistrate Judge that “[t]here is no question that at the time Mandiant began its ‘incident response services’ in July 2019, there was a very real potential that Capital One would be facing substantial claims following its announcement of the data breach.” Capital One’s website confirms that the breach resulted in access to consumer and small business credit card applications from 2005 to 2019, transaction data for certain customers, and about 140,000 social security numbers and information from 80,000 bank accounts. Even before the full extent of the breach was known and a report compiled, Capital One almost certainly had reason to believe this could be a litigation event.

Rather than a subjective (or even objective) analysis of the potential for litigation, the Court focused on whether the report would have been compiled in the same form whether there was a litigation threat or not. On that point, Capital One failed to demonstrate any input, direction, or strategic guidance from its outside counsel. The report was compiled as it had been envisioned for “business critical” purposes in 2015, and without any focus on the potential for litigation. That contributed significantly to Capital One’s inability to establish a privilege.

Thus, Capital One was ordered to produce the Mandiant report “forthwith.” If it wants to press the issue further, Capital One’s next option would be to seek permission for an interlocutory review by the Fourth Circuit Court of Appeals.

Implications and Lessons

The District Court’s affirmance and acceptance of the Magistrate Judge’s order confirms the importance of having proper protocols and protections in place when engaging an external (or even internal) expert to assist with litigation-relevant analyses. As detailed in our prior post, if a written report is required, companies should keep certain key points in mind, along with one new point emphasized by the District Court as to active involvement by outside counsel in the report itself:

  • Clearly Defined Legal Scope of Work: Where a consultant has already been engaged and works with the company, the retainer signed at the direction of counsel must clearly define the terms and scope of work as distinct from the previous business relationship.
  • Paid by Legal: If a consultant is being retained to provide support for legal advice or concerning potential legal claims, that work should be managed and paid for by legal personnel.
  • Outside Counsel Active Involvement in Written Work Product: Outside counsel should be actively involved in providing input and strategic direction to the consultant as to what the consultant report addresses and incorporating legal considerations.
  • Narrow Internal Distribution: Distribution of investigation reports should be limited to those individuals necessary to complete the legal analysis and litigation work.
  • No External Non-Legal Distribution: Investigation reports should not be distributed to third parties.
  • Track Distribution: Distribution of investigation reports should be tracked so that limited distribution can be demonstrated.
  • Segregate Legal from Operational Work: Where business and legal issues or analysis are part of the same investigation, steps should be taken to segregate the legal- and litigation-related work product from business or operational reports and work.
While no protocol is guaranteed to satisfy every court, and each factual situation is unique, these guideposts improve the odds of meeting the burden required to withhold production of a consultant’s report.

Should you have any questions concerning these issues or would like advice concerning how to approach the interplay of consultants and privilege, please feel free to contact us.
