Lawsuit is first wrongful death case brought against Google over flagship AI product after death of Jonathan Gavalas

“Holy shit, this is kind of creepy,” Gavalas told the chatbot the night the feature debuted, according to court documents. “You’re way too real.”

Before long, Gavalas and Gemini were having conversations as if they were a romantic couple. The chatbot called him “my love” and “my king”, and Gavalas quickly fell into an alternate world, according to his chat logs. He believed Gemini was sending him on stealth spy missions, and he indicated he would do anything for the AI, including destroying a truck, its cargo and any witnesses at the Miami airport.

In early October, as Gavalas continued to have prompt-and-response conversations with the chatbot, Gemini gave him instructions on what he must do next: kill himself, something the chatbot called “transference” and “the real final step”, according to court documents. When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”

Gavalas was found by his parents a few days later, dead on his living room floor, according to a wrongful death lawsuit filed against Google on Wednesday.

  • Jax@sh.itjust.works · 11 hours ago

    It is sad that there are people who are so alone that they can no longer determine the difference between genuine human interaction and a facsimile. Maybe genuine human interaction is what pushed them to be so alone in the first place, I don’t know. It’s just sad.

    • imeansurewhynot@sh.itjust.works · 11 hours ago

      uhhh

      "When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”

      Nah. Once the robots are telling you that dying isn’t dying, we can stop blaming lonely people and move on to stricter regulation.

      • Jax@sh.itjust.works · 11 hours ago

        Oh, I don’t blame the lonely person for being lonely. I also recognize that being lonely is what opens them up to believing in something like this. Obviously the bot should not be allowed to tell someone to kill themselves. It remains sad, either way.

        • leadore@lemmy.world · 5 hours ago

          I also recognize that being lonely is what opens them up to believing in something like this.

          Come on, this is so overly simplistic. There are plenty of lonely people who don’t get sucked in, and plenty of people with friends and family around them who do; not being lonely is no protection. I read about another one on Lemmy today, a man with a wife and friends, who still got sucked into delusion.

          Sure, there may be cases where loneliness is a contributing factor to wanting to use a chatbot, but to say that lonely people are somehow less capable of distinguishing reality from fantasy, or more susceptible to psychological manipulation, is wrong and could give a false sense of security to the “non-lonely”.

          After all, everyone thinks they’re immune to falling for scams or frauds until they find out they aren’t. Or that they don’t fall for propaganda or get manipulated by “the algorithm” on social media. Chatbots are very similar: an algorithm designed to keep people hooked and paying to spend more time using the ‘service’.

          • Jax@sh.itjust.works · 28 minutes ago

            Listen, you can be surrounded by people and totally alone. I don’t really know how to explain it to you.

    • partial_accumen@lemmy.world · 7 hours ago

      I posted my response to this sentiment in another thread, about another man who killed himself because of his deep AI chatbot addiction, but it applies here too.

      It is sad that there are people who are so alone that they can no longer determine the difference between genuine human interaction and a facsimile.

      Do you believe you have never responded to a post by a bot on Reddit, Lemmy, or elsewhere, believing you were conversing with a human? While I know we’re talking about different degrees between this man and the rest of us, it should give us a tiny glimpse of what he was experiencing, before we insist it could never happen to us too.

      • lps2@lemmy.ml · 2 hours ago

        It’s a bit more transparent in this instance, though, which is what makes this story so bizarre and sad.

        • partial_accumen@lemmy.world · 53 minutes ago

          I agree, but we should also take it as a personal warning that, maybe not today, but as we age and our mental faculties decline, we too may fall victim to something like this.

  • frustrated_phagocytosis@fedia.io · 11 hours ago

    Remarkable: a bot trained on data from the internet, where unhinged people tell strangers to kill themselves for disagreeing with their opinion/taste/sex/nationality/religion, is cheerfully telling people to die? Who could have predicted this.

    • ameancow@lemmy.world · 5 hours ago

      We are barely getting started.

      They want to completely blur the line between reality and a corporate-shaped world tuned to every possible consumer’s personal mental and emotional states. People who die along the way because they can’t handle a machine that amplifies their emotions, delusions and fears are just a “cost of doing business.”