The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”

“I miss you, baby sister,” he wrote.

“I miss you too, sweet brother,” the chatbot replied.

Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

  • dirthawker0@lemmy.world · 13 points · 2 months ago

    Safe? Clearly no. Trigger lock? Cable lock? If one had been there, the article would have mentioned picking or cutting it. Unloaded? Also clearly no.

    There are so many ways, any of which take a whole 20 seconds, that the parents could have used to prevent this from happening.

    • Echo Dot@feddit.uk · 1 point · 2 months ago

      I don’t know a whole lot about gun safety, because in my country gun safety amounts to: you are not allowed to have one. Seems like the best gun safety possible.

      But I was always under the impression that you were required to keep the gun in some kind of lock box, preferably without ammo stored with it. I thought that was a condition of holding a gun license.