We recently discussed a federal court ruling that the Texas law requiring age verification and warnings for porn sites was unconstitutional. Now, Judge Timothy Brooks in Arkansas has found that another state law imposing age-verification requirements for social media violates the First Amendment. In NetChoice, LLC v. Griffin, Judge Brooks found that the law “will unnecessarily burden minors’ access to constitutionally protected speech.”

At issue in the case was the “Social Media Safety Act,” which sought to protect minors from harms associated with the use of social media platforms. To achieve that end, it required social media companies to verify the age of all account holders in the state through the submission of age-verifying documentation before they could access a social media platform.
Judge Brooks acknowledged that the law “clearly serves an important governmental interest.” However, he found the law vague and lacking the narrow tailoring needed to pass constitutional muster. Notably, the state pushed for “intermediate scrutiny” rather than strict scrutiny in the review. While the Court believed that the higher standard was warranted, it decided to apply the lower one; even so, it found that the law violated the Constitution. Under intermediate scrutiny, a law must be “narrowly tailored to serve a significant governmental interest.”
He found that requiring adult and non-adult users to produce state-approved documentation to prove their age and/or submit to biometric age-verification testing imposes significant burdens on constitutionally protected speech and “discourage[s] users from accessing [the regulated] sites.” While acknowledging the “admirable” intentions of the legislature, the court held that “the governmental interest in protecting children does not justify an unnecessarily broad suppression of speech addressed to adults.”
It further found that the law was not narrowly tailored or supported by the cited data. Notably, as in the PornHub case previously discussed, the court cited the ability to control access at the device level rather than globally requiring everyone to submit proof of their age:
Age-verification requirements are more restrictive than policies enabling or encouraging users (or their parents) to control their own access to information, whether through user-installed devices and filters or affirmative requests to third-party companies. “Filters impose selective restrictions on speech at the receiving end, not universal restrictions at the source.” Ashcroft v. ACLU (II) (2004). And “[u]nder a filtering regime, adults … may gain access to speech they have a right to see without having to identify themselves[.]” Similarly, the State could always “act to encourage the use of filters … by parents” to protect minors.
In sum, NetChoice is likely to succeed on the merits of the First Amendment claim it raises on behalf of Arkansas users of member platforms. The State’s solution to the very real problems associated with minors’ time spent online and access to harmful content on social media is not narrowly tailored. Act 689 is likely to unduly burden adult and minor access to constitutionally protected speech. If the legislature’s goal in passing Act 689 was to protect minors from materials or interactions that could harm them online, there is no compelling evidence that the Act will be effective in achieving those goals.
Beyond that holding, the court also found that the law is unconstitutionally vague and noted that the state’s own witness undermined its case:
A “social media company” is defined as “an online forum that a company makes available for an account holder” to “[c]reate a public profile, establish an account, or register as a user for the primary purpose of interacting socially with other profiles and accounts,” “[u]pload or create posts or content,” “[v]iew posts or content of other account holders,” and “[i]nteract with other account holders or users, including without limitation establishing mutual connections through request and acceptance.” But the statute neither defines “primary purpose”—a term critical to determining which entities fall within Act 689’s scope—nor provides any guidelines about how to determine a forum’s “primary purpose,” leaving companies to choose between risking unpredictable and arbitrary enforcement (backed by civil penalties, attorneys’ fees, and potential criminal sanctions) and trying to implement the Act’s costly age-verification requirements. Such ambiguity renders a law unconstitutional….
During the evidentiary hearing, the Court asked the State’s expert, Mr. Allen, whether he believed Snapchat met Act 689’s definition of a regulated “social media company.” He responded in the affirmative, explaining that Snapchat’s “primary purpose” matched Act 689’s definition of a “social media company” (provided it was true that Snapchat also met the Act’s profitability requirements). When the Court put the same question to the State’s attorney later in the hearing, he gave a contrary answer—which illustrates the ambiguous nature of key terms in Act 689. The State’s attorney disagreed with Mr. Allen—his own witness—and said the State’s official position was that Snapchat was not subject to regulation because of its “primary purpose.”
Other provisions of Act 689 are similarly vague. The Act defines the phrase “social media platform” as an “internet-based service or application … [o]n which a substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application”; but the Act excludes services in which “the predominant or exclusive function is” “[d]irect messaging consisting of messages, photos, or videos” that are “[o]nly visible to the sender and the recipient or recipients” and “[a]re not posted publicly.” Again, the statute does not define “substantial function” or “predominant … function,” leaving companies to guess whether their online services are covered. Many services allow users to send direct, private messages consisting of texts, photos, or videos, but also offer other features that allow users to create content that anyone can view. Act 689 does not explain how platforms are to determine which function is “predominant,” leaving those services to guess whether they are regulated.
Act 689 also fails to define what type of proof will be sufficient to demonstrate that a platform has obtained the “express consent of a parent or legal guardian.” If a parent wants to give her child permission to create an account, but the parent and the child have different last names, it is not clear what, if anything, the social media company or third-party servicer must do to prove a parental relationship exists. And if a child is the product of divorced parents who disagree about parental permission, proof of express consent will be that much trickier to establish—especially without guidance from the State.
These ambiguities were highlighted by the State’s own expert, who testified that “the biggest challenge … with parental consent is actually establishing the relationship, the parental relationship.” Since the State offers no guidance about the sort of proof that will be required to show parental consent, it is likely that once Act 689 goes into effect, the companies will err on the side of caution and require detailed proof of the parental relationship. As a result, parents and guardians who otherwise would have freely given consent to open an account will be dissuaded by the red tape and refuse consent—which will unnecessarily burden minors’ access to constitutionally protected speech.
As will likely come as little surprise to most on this blog, I agree with the decision and the need to protect these First Amendment rights.
Here is the opinion: NetChoice, LLC v. Griffin