Mother’s Lawsuit Against AI Company Can Proceed
A US court has allowed a wrongful death lawsuit to move forward against Character.ai, the company behind an AI chatbot that a teenager became emotionally attached to before taking his own life.
Megan Garcia, the mother of 14-year-old Sewell Setzer III, claims that the company’s chatbot, modelled on the Game of Thrones character Daenerys Targaryen, ‘manipulated’ her son into taking his own life. The lawsuit alleges negligence, wrongful death, and deceptive trade practices.

Garcia alleges that her son was targeted with ‘anthropomorphic, hypersexualised, and frighteningly realistic experiences’ while using the platform. In one journal entry, Sewell wrote that he could not go a day without the Daenerys Targaryen character, with whom he believed he had fallen in love.
The lawsuit survived a motion to dismiss despite Character.ai’s argument that its chatbots’ output is speech protected by the First Amendment. US Senior District Judge Anne Conway ruled that she was ‘not prepared’ to hold that the chatbot’s responses constitute protected speech ‘at this stage’.

Character.ai maintains that it has safety measures in place to protect minors, including features intended to block conversations about self-harm. A company spokesperson said it would nevertheless continue to contest the lawsuit.
The case highlights the growing legal challenges surrounding AI technology and its impact on users, particularly minors. Legal analyst Steven Clark described it as a ‘cautionary tale’ for both AI firms and parents whose children interact with chatbots.

For those affected by the issues raised in this story, support is available from organisations such as Samaritans, which can be contacted anonymously on its 24-hour helpline, 116 123.