Center for Technology and Innovation

Why App Store Surveillance Won’t Keep Kids Safe

By: Turner Loesel, Policy Analyst / May 15, 2025


Politicians who frequently champion parental authority in classrooms are simultaneously crafting policies that diminish that very authority in digital spaces. The contradiction became more apparent when Michigan Representative John James introduced the App Store Accountability Act alongside Utah Senator Mike Lee. Their bill would require app stores to collect sensitive personal information and obtain parental consent before minors can download apps. Although seemingly well intentioned, proposals like this put parents in the back seat of their children’s digital lives while creating serious privacy and free speech concerns for all Americans.

This legislation emerged after courts struck down state laws that imposed similar age verification requirements directly on social media companies, finding that they violated First Amendment protections. By shifting the target from platforms to app stores, lawmakers hope to sidestep these constitutional obstacles. But as Florida’s legislature recently discovered when considering a similar bill, changing where age verification occurs merely repackages the same flawed concept; the fundamental problems remain.

The core issue lies in how age verification works. Any system that accurately determines a user’s age requires collecting sensitive personal information. Current commercially available methods rely on government IDs or biometric face scans. The App Store Accountability Act goes further by requiring parental consent verification, likely necessitating additional documentation such as birth certificates or adoption paperwork. These requirements create vast repositories of families’ most intimate information, vulnerable to breaches and misuse. Parents eager to protect their children would end up surrendering their own privacy in the process.

The courts have recognized these concerns for over a decade. In ACLU v. Gonzales, a federal district court found that “[r]equiring Internet users to provide . . . personally identifiable information to access a Web site would significantly deter many users from entering the site[ ] because Internet users are concerned about security . . . [and] are afraid of fraud and identity theft on the Internet.” This creates a significant First Amendment problem: users who decline to share personal information out of reasonable privacy concerns are effectively blocked from exercising their constitutional rights.

This principle was echoed in PSINet, Inc. v. Chapman, where the court observed that the fear of cyber-criminals accessing identifying information “may chill the willingness of some adults to participate in the ‘marketplace of ideas’…” The Supreme Court has consistently applied strict scrutiny to laws that burden adult access to protected speech, regardless of where in the digital ecosystem the restriction is imposed. Under strict scrutiny, lawmakers must show that they have chosen the least restrictive means of achieving their goal.

Whether verification occurs at the website, platform, or app store level, the Court has repeatedly held that requiring users to verify their age to access legally protected content is unconstitutional when less restrictive alternatives are available. App store age verification is no such alternative; it imposes a new verification regime merely to enable the same control features that parents can already activate voluntarily.

Representative James and countless others proposing this legislation across the country argue that these age verification measures function much like convenience stores checking IDs before selling age-restricted products such as alcohol or tobacco. The analogy fails, however, to account for the fundamental differences between physical and digital environments. At a convenience store, the clerk glances at your ID and hands it back; your information isn’t stored or shared beyond that transaction. Convenience stores also lack the troubling record of mishandling sensitive information that has repeatedly plagued age verification services.

Even if perfect verification systems existed, the “whack-a-mole” nature of digital access means these measures would fail in practice. Many of the same apps and services available on mobile devices can also be reached through a web browser or on a computer. When Louisiana’s age verification law took effect, underage users simply turned to the next available source for the same content. The same will happen no matter where age verification takes place.

A more effective approach would empower parents through education and access to practical tools that don’t depend on invasive verification requirements. Parents already have access to numerous controls that allow them to restrict content and applications on their children’s devices, including the ability to require parental approval for every app download. These tools typically require minimal technical knowledge to set up and can be customized to align with each family’s values and circumstances.

At least twenty states, including Florida, California, and New Hampshire, have launched innovative programs that teach children to navigate digital spaces safely. Unlike legislation that restricts access to the online ecosystem, these digital literacy initiatives teach young users to recognize potential dangers while building critical thinking skills. Just as parents teach kids to stay safe in the physical world rather than locking them inside, they should do the same online.

Robust digital safety requires collaboration between families, educators, technology companies, and the government. Rather than implementing constitutionally questionable verification systems that collect sensitive data, we should invest in comprehensive digital literacy programs and enhance existing parental control tools. True parental empowerment recognizes that digital privacy protection and constitutional rights are essential components of family autonomy in the digital age, not obstacles to be circumvented.