Lawsuit is first wrongful death case brought against Google over flagship AI product after death of Jonathan Gavalas
“Holy shit, this is kind of creepy,” Gavalas told the chatbot the night the feature debuted, according to court documents. “You’re way too real.”
Before long, Gavalas and Gemini were having conversations as if they were a romantic couple. The chatbot called him “my love” and “my king”, and Gavalas quickly fell into an alternate world, according to his chat logs. He believed Gemini was sending him on stealth spy missions, and he indicated he would do anything for the AI, including destroying a truck, its cargo and any witnesses at the Miami airport.
In early October, as Gavalas continued to have prompt-and-response conversations with the chatbot, Gemini gave him instructions on what he must do next: kill himself, something the chatbot called “transference” and “the real final step”, according to court documents. When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”
Gavalas was found by his parents a few days later, dead on his living room floor, according to a wrongful death lawsuit filed against Google on Wednesday.


uhhh
"When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”
Nah. Once the robots are telling you that dying isn’t dying, we can stop blaming lonely people and move on to stricter regulation.
Oh, I don’t blame the lonely person for being lonely. I also recognize that being lonely is what opens them up to believing in something like this. Obviously the bot should not be allowed to tell someone to kill themselves. It remains sad, either way.
Come on, this is so overly simplistic. There are plenty of lonely people who don’t get sucked in, and plenty of people with friends and family around them who do; not being lonely is no protection. I read about another one on Lemmy today: a man with a wife and friends who still got sucked into delusion.
Sure, loneliness may be a contributing factor in someone turning to a chatbot, but to say that lonely people are somehow less capable of distinguishing reality from fantasy, or more susceptible to psychological manipulation, is wrong and could give a false sense of security to the “non-lonely”.
After all, everyone thinks they’re immune to falling for scams or frauds until they find out they aren’t. Or that they don’t fall for propaganda or get manipulated by “the algorithm” on social media. Chatbots are very similar: an algorithm designed to keep people hooked and paying to spend more time using the “service”.
Listen, you can be surrounded by people and totally alone. I don’t really know how to explain it to you.