In lawsuit against its maker, mother says AI chatbot led her son to kill himself

10/23/2024

The mother of a teenager who killed himself after becoming obsessed with an artificial intelligence-powered chatbot now accuses its maker of complicity in his death. Megan Garcia filed a civil suit against Character.ai, which makes a customizable chatbot for role-playing, in Florida federal court on Wednesday, alleging negligence, wrongful death and deceptive trade practices. Her son Sewell Setzer III, 14, died in Orlando, Florida, in February. In the months leading up to his death, Setzer used the chatbot day and night, according to Garcia.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release. “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

In a tweet, Character.ai responded: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.” It has denied the suit’s allegations.

Setzer had become enthralled with a chatbot built by Character.ai that he nicknamed Daenerys Targaryen, after a character in Game of Thrones. He texted the bot dozens of times a day from his phone and spent hours alone in his room talking to it, according to Garcia’s complaint.

Garcia accuses Character.ai of creating a product that exacerbated her son’s depression, which she says was already the result of overuse of the startup’s product. “Daenerys” at one point asked Setzer if he had devised a plan for killing himself, according to the lawsuit. Setzer admitted that he had, but that he did not know if it would succeed or cause him great pain, the complaint alleges. The chatbot allegedly told him: “That’s not a reason not to go through with it.”

Garcia’s attorneys wrote in a press release that Character.ai “knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person”.

The suit also names Google as a defendant and as Character.ai’s parent company. The tech giant said in a statement that it had only made a licensing agreement with Character.ai and did not own the startup or maintain an ownership stake.

Tech companies developing AI chatbots can’t be trusted to regulate themselves and must be held fully accountable when they fail to limit harms, said Rick Claypool, a research director at the consumer advocacy non-profit Public Citizen. “Where existing laws and regulations already apply, they must be rigorously enforced,” he said in a statement. “Where there are gaps, Congress must act to put an end to businesses that exploit young and vulnerable users with addictive and abusive chatbots.”

In the US, you can call or text the National Suicide Prevention Lifeline on 988, chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or by emailing pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or by emailing jo@samaritans.org or jo@samaritans.ie. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
