The SCOPE Act in Focus—What You Should Know About Texas’s Partially Blocked Youth Privacy Statute
As we’ve discussed previously, youth privacy continues to be an enforcement priority for many state attorneys general across the country, and Texas is no exception. In 2023, Texas passed the Securing Children Online Through Parental Empowerment (SCOPE) Act, which was designed to broadly prevent Texas children from accessing certain “harmful” content through social media platforms. But, on August 30, 2024, the Western District of Texas blocked some key aspects of the law—just days before the statute was set to go into effect on September 1, 2024. Below we describe the SCOPE Act’s requirements, which parts of the statute were affected by the court’s injunction, and what companies should be aware of moving forward.
SCOPE Act
According to the state, the SCOPE Act aims to protect children under 18 from “harmful content and data collection practices.” It applies to digital service providers (DSPs) that host platforms allowing users to create public profiles, post public content, and interact with other users. Note, however, that a number of entities are exempted from the statute, including small businesses as defined by the U.S. Small Business Administration, financial institutions subject to the Gramm-Leach-Bliley Act, and institutions of higher education.
The law (as originally intended) imposes a number of requirements on DSPs, including:
- (1) Duty to Register: DSPs must register the age of a person who signs up for an account on their platform and prevent that person from later altering their age.
- (2) Limits on Data Collection, Sharing, and Targeted Advertising: They are required to limit their collection and use of minors’ PII and must prohibit minors from making purchases or engaging in other financial transactions through the platform unless their parent or guardian has provided consent (in which case, the service provider still must “restrict” minors’ ability to effectuate purchases). DSPs also are prohibited from collecting minors’ geolocation data, displaying targeted advertisements to minors, and sharing or selling minors’ PII (unless a parent or guardian has, using the parental controls described in (4) below, permitted such data practices).
- (3) Algorithm-Related Disclosures: They must clearly disclose (either in their terms of service, privacy policy, or other similar agreement) how they use algorithms to show content to minors, how their algorithms rank or filter content, and what PII their algorithms take into account.
- (4) Parental Tools: DSPs are further required to create and publish parental tools that allow a parent or guardian to, after their identity has been verified by the service provider, control the minor’s privacy and account settings, monitor the amount of time the minor spends on the platform, and review, download, or delete the minor’s PII. The parental controls also must give parents the ability to alter the DSP’s duties relating to data collection, data sharing, targeted advertising, and child purchases.
- (5) Advertising and Marketing: They must work to prevent advertisers on their platforms from targeting minors with advertisements that promote products or services that are unlawful for minors in Texas.
- (6) Harm Prevention: As originally intended, the SCOPE Act requires DSPs to develop and implement a strategy to prevent minors’ exposure to “harmful” material—i.e., material that “promotes, glorifies, or facilitates suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child pornography, or other sexual exploitation or abuse.” (These terms are not defined in the statute.) Under these “harm prevention” provisions, DSPs must apply filtering technology to, and create a database of, such “harmful” material, and are required to use hash-sharing technology and other protocols to identify recurring harmful content. In addition, a DSP that knowingly publishes or distributes content, more than one-third of which is “harmful” to minors, must use a “commercially reasonable age verification method” to verify the age of users seeking access to that content. (A simplified, hypothetical sketch of the hash-matching and one-third-threshold concepts follows this list.)
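For readers on the engineering side, the sketch below illustrates, at a very high level, how the hash-matching and one-third-threshold concepts referenced in (6) could be modeled. This is a minimal, hypothetical Python illustration only: the statute does not prescribe any particular technology, and the hash scheme, function names, and example figures here are our own assumptions, not requirements of the law.

```python
import hashlib

# Hypothetical database of hashes of content previously flagged as "harmful."
# In practice, a DSP might rely on shared industry hash lists (often perceptual
# hashes); a simple SHA-256 set keeps this sketch self-contained.
KNOWN_HARMFUL_HASHES = {
    hashlib.sha256(b"example of previously flagged content").hexdigest(),
}


def is_known_harmful(content: bytes) -> bool:
    """Check whether a piece of content matches the flagged-content hash database."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HARMFUL_HASHES


def requires_age_verification(total_items: int, harmful_items: int) -> bool:
    """Apply the statute's one-third threshold: if more than one-third of the
    content a DSP publishes is 'harmful' to minors, age verification is triggered."""
    if total_items == 0:
        return False
    return harmful_items / total_items > 1 / 3


if __name__ == "__main__":
    # Illustrative usage with made-up numbers.
    print(is_known_harmful(b"example of previously flagged content"))  # True
    print(requires_age_verification(total_items=90, harmful_items=40))  # True (40/90 > 1/3)
```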
Harm Prevention Requirements Blocked by Injunction
On July 30, 2024, free speech advocacy groups filed suit in the Western District of Texas seeking to block the SCOPE Act in its entirety, alleging that it violates the First Amendment, is void for vagueness, imposes unconstitutional prior restraints, and is preempted under Section 230 of the Communications Decency Act.
In a 38-page order, Judge Robert Pitman sided partly with the plaintiffs, preliminarily enjoining what we described above as the statute’s “harm prevention” requirements (i.e., (6), above), but leaving intact the rest of the statute (i.e., (1)-(5), above). In granting a preliminary injunction as to the harm prevention requirements, Judge Pitman expressed doubt that Texas has a compelling interest in regulating the types of content the statute describes as “harmful” and explained that, in any event, the regulated topics are too vague and overbroad as written. In his opinion, Judge Pitman wrote: “Under these indefinite meanings, it is easy to see how an attorney general could arbitrarily discriminate in his enforcement of the law.”
What You Should Know Moving Forward
On September 1, the majority of the SCOPE Act’s requirements (i.e., (1)-(5), above) went into effect—meaning social media platforms and similar companies should ensure they are compliant. Of particular note are the statute’s requirements that service providers, unless a parent or guardian has permitted otherwise through parental controls, (i) prohibit minors from making purchases or engaging in other financial transactions and (ii) forgo the collection and sale of minors’ geolocation data and PII.
The SCOPE Act is just one of a number of state youth privacy statutes that have been blocked by injunction this year. Mississippi, Ohio, and California, for example, enacted similar legislation that has since been blocked, in whole or in part, in lawsuits brought by advocacy groups. While we will continue to monitor for updates, it is clear that youth privacy remains a hot topic for regulators and advocacy groups alike.