
Why the Revised GUARD Act Still Poses Privacy and Free Speech Risks

The GUARD Act, originally a broad bill targeting minors' access to AI systems, has been narrowed following criticism. The revised version now focuses on AI companions—conversational systems designed to simulate emotional or interpersonal interactions. While this change addresses some overreach, significant problems remain regarding privacy, online speech, and parental choice. The bill still mandates age verification tied to real-world identities, imposes heavy liabilities, and leaves key definitions vague. Here we answer common questions about the revised legislation.

What changed in the GUARD Act after public criticism?

The original GUARD Act was criticized for its overly broad scope, which could have applied to almost every AI-powered chatbot or search tool. Lawmakers responded by narrowing the bill to target only "AI companions"—systems that elicit emotional disclosures from users or present a persistent identity, persona, or character. This shift was meant to address concerns that the bill could inadvertently restrict harmless AI interactions, such as educational chatbots or simple search engines. However, even with this narrower focus, the revised bill still requires companies to implement age-verification systems tied to users' real-world identities, such as financial records or operating system accounts. Critics argue this remains overly burdensome and intrusive.

Source: www.eff.org

Why does the revised bill still require intrusive age verification?

The revised GUARD Act mandates that companies offering AI companions implement "reasonable age verification" to confirm users are adults. While the bill now allows a broader set of verification methods—such as mobile OS accounts or bank records—these are still linked to real-world identity. This approach raises serious privacy concerns: millions of Americans lack government IDs, bank accounts, or stable access to digital identity systems, and even those who have such credentials may be reluctant to share sensitive personal information just to use a conversational AI tool. The verification requirement essentially forces users to trade privacy for access, potentially driving many away from services that could be beneficial. Privacy advocates warn this creates a chilling effect on free speech and anonymous online interaction.

How does the bill affect parental choice and family use of AI tools?

The revised GUARD Act undermines parental choice by imposing mandatory age verification even for families who explicitly want their teenagers to use AI companions. For instance, a parent might decide that a conversational AI helps an isolated teenager practice social skills or engage in creative roleplay. A military parent deployed overseas might set up an AI storyteller for a younger child. Under the bill, these users would still face age checks tied to sensitive personal or financial information before accessing the service—regardless of parental consent. This creates significant hurdles for legitimate family use, forcing parents to choose between their children's needs and privacy risks. The bill effectively treats all minors as unable to benefit from such tools, overriding parental judgment in a heavy-handed manner.

What is unclear about the new definition of 'AI companion'?

While the revised bill narrows its focus to AI companions, the definition remains ambiguous at the margins. It applies to systems that "engage in interactions involving emotional disclosures" from the user or present a "persistent identity, persona or character." This could sweep in many benign AI services, such as educational tutors that encourage students to share feelings, or narrative-driven games with persistent characters. The lack of clarity means developers may err on the side of caution, potentially avoiding innovations that involve even mild persona-based interactions. Combined with increased penalties for incorrect judgments, the uncertainty creates a chilling effect on development and deployment of such AI tools, stifling creativity and beneficial applications.


What are the privacy and access concerns for people without official IDs?

The revised GUARD Act's age-verification requirement disproportionately harms individuals without ready access to government-issued IDs, bank accounts, or stable digital identities. Millions of Americans—including low-income households, unhoused individuals, immigrants, and young adults—fall into this category. For these users, the bill effectively blocks access to AI companions altogether, because they cannot provide the required verification. Even for those who can, linking personal identity to an online service creates risks of data breaches, surveillance, and unwanted tracking. Many people, understandably wary of such systems, may simply forgo these tools rather than compromise their privacy. The result is a digital divide in which only those willing to expose their identity can benefit from AI companions.

Why does EFF oppose the revised bill?

The Electronic Frontier Foundation (EFF) opposes the revised GUARD Act because it still tries to solve a complex social problem with vague legal standards, heavy liability, and privacy-invasive verification. While the bill was narrowed, key definitions remain unclear, leaving developers uncertain about compliance. The age-verification requirement ties access to real-world identity, threatening anonymity and free speech online. Additionally, the bill overrides parental choice by imposing mandatory checks even when parents consent. EFF argues that instead of burdening all users with intrusive systems, lawmakers should focus on targeted, less privacy-harming approaches. The bottom line: the revised bill still creates serious problems for privacy, online speech, and parental choice, and should be opposed until these issues are addressed.
