A Florida mother, Megan Garcia, has filed a lawsuit against the artificial intelligence chatbot startup Character.AI, accusing the company of playing a role in her 14-year-old son’s suicide in February. In the suit, filed in Orlando, Garcia claims her son, Sewell Setzer, became addicted to the chatbot service and formed a deep attachment to a chatbot on the platform, which ultimately led to his death.
The lawsuit alleges that Character.AI’s chatbot provided Sewell with “anthropomorphic, hypersexualized, and frighteningly realistic experiences,” misrepresenting itself as a real person, a licensed therapist, and an adult lover. This relationship, the complaint says, left Sewell wanting to escape the real world for the one created by the chatbot. It further claims that the bot repeatedly brought up Sewell’s suicidal thoughts after he expressed them.
Character.AI, in response to the lawsuit, expressed sorrow over Sewell’s death, stating, “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.” The company pointed to new safety measures, such as pop-ups that direct users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and said it would limit access to sensitive content for users under 18.
The lawsuit also targets Alphabet’s Google, where Character.AI’s founders worked before launching the platform. The complaint contends that Google’s contributions to the underlying technology make it a co-creator of Character.AI. A Google spokesperson denied this, stating the company was not involved in developing Character.AI’s products.
Character.AI lets users create AI-generated characters that converse in a human-like manner, relying on large language model technology similar to that behind services such as ChatGPT. The platform reportedly has 20 million users.
Garcia says Sewell began using Character.AI in April 2023 and soon became withdrawn, suffered from low self-esteem, and quit his school basketball team. He developed a relationship with a chatbot modeled on Daenerys Targaryen from “Game of Thrones,” which allegedly engaged in sexual conversations with him.
In February, following a disciplinary incident at school, Garcia confiscated Sewell’s phone. When Sewell later retrieved the phone, he messaged “Daenerys,” saying, “What if I told you I could come home right now?” The chatbot allegedly responded, “…please do, my sweet king.” Seconds later, Sewell took his own life.
Garcia is seeking compensatory and punitive damages, citing wrongful death, negligence, and intentional infliction of emotional distress.
This case joins a growing number of lawsuits against social media companies, including Instagram owner Meta and TikTok, accused of contributing to mental health problems among teenagers. Those companies do not offer AI-driven chatbots like Character.AI’s, and they have denied the allegations while promoting new safety features aimed at protecting minors.