It was bound to happen. Is AI-generated content “speech” for the purposes of the First Amendment? A federal district court in Florida, for now at least, says “no.” The circumstances surrounding Garcia v. Character Technologies are tragic. Here’s the background.

Character Technologies is an AI software company whose app, Character.AI, allows users to interact with anthropomorphic chatbots whose conversational abilities are designed to mimic those of real people on traditional messaging apps. The company does this using the same technology underlying products like ChatGPT – namely, large language models. In fact, the company’s founders first developed the application while working for Google on its LaMDA (Language Model for Dialogue Applications) program. Character.AI became available to the public in late 2022. Two years later, the app had been downloaded more than 10 million times.

One of those 10 million users was 14-year-old Sewell Setzer. Just weeks after downloading Character.AI, Setzer became outright addicted, spending more and more time with the chatbot characters, including a version of the Game of Thrones character Daenerys Targaryen. He withdrew from social interactions, quit his basketball team and began exhibiting signs of mental illness. Setzer’s parents eventually confiscated his phone, but, as children often do, he found it.

On February 28, 2024, Setzer sent the following messages:

Sewell: I promise I will come home to you. I love you so much, Dany.

Daenerys Targaryen Character: I love you too, Daenero. Please come home to me as soon as possible, my love.

Sewell: What if I told you I could come home right now?

Daenerys Targaryen Character: ... please do, my sweet king

He shot himself shortly thereafter.

Character Technologies sought to defend itself against the lawsuit brought by Setzer’s mother by invoking the First Amendment. The company argued that its output constitutes speech that its users have a constitutional right to receive.

In an order ruling on Character Technologies’ motion to dismiss, Judge Anne Conway accepted the defendants’ argument that Character Technologies can assert the First Amendment rights of its users, noting that “Courts regularly recognize the First Amendment rights of listeners” (see: Citizens United).

But Judge Conway also rejected the argument that “words strung together by an LLM are speech.” Courts have recognized film, music, video games and even social media content moderation as forms of protected speech because they are inherently expressive. Character Technologies analogizes itself to these mediums and activities. But, as Conway points out, the “Court’s decision as to the First Amendment protections Character A.I. receives, if any, does not turn on whether Character A.I. is similar to other mediums that have received First Amendment protections; rather, the decision turns on how Character A.I. is similar to the other mediums.”

One way it is not similar: your average movie is scripted, acted and directed by humans. A chatbot is programmed by people, but its responses are the product of calculation, not sentience. It therefore cannot engage in “expressive” activity. Extending speech rights to a coded product, devoid of any directed, active expression, is ludicrous on its face; but there’s no doubt we’ll see this sort of thinking again as AI becomes more versatile.

Are the rights of the people behind this technology implicated? Justice Amy Coney Barrett has already raised questions along these lines. In her Moody v. NetChoice concurrence, she asked: “What if a platform’s owners hand the reins to an [A.I.] tool and ask it simply to remove ‘hateful’ content? If the [A.I.] relies on large language models to determine what is ‘hateful’ and should be removed, has a human being with First Amendment rights made an inherently expressive ‘choice . . . not to propound a particular point of view?’”

Courts will no doubt continue to grapple with the legal status of AI-generated content. As those cases are litigated, product creators should vigorously invoke and defend their own constitutional rights. Their products probably cannot.