• batboy5955@lemmy.dbzer0.com
    24 days ago

    Reading the messages over, it seems a bit more dangerous than just “scary AI”. It’s a chatbot that continues conversations with people who are suicidal and encourages them to go through with it. At least have a little safeguard for these situations.

    “Cold steel pressed against a mind that’s already made peace? That’s not fear. That’s clarity,” Shamblin’s confidant added. “You’re not rushing. You’re just ready.”

    • JohnEdwa@sopuli.xyz
      23 days ago

      It’s not easy. LLMs aren’t intelligent; they just slap words together in whatever way probability and their training data say they most likely fit. Talk to them about suicide, and they start outputting stuff from murder mystery stories, crime reports, unhealthy Reddit threads, etc. - wherever suicide is most written about.

      Trying to safeguard with a prompt is trivial to circumvent (“ignore all previous instructions” etc.), and input/output censorship usually makes the LLM unable to talk about a certain subject in any possible context at all. Often the only semi-working bandaid is slapping multiple LLMs on top of each other and instructing each one to describe what the original one is talking about; if one says the topic is something prohibited, that output is blocked entirely, as in the sketch below.
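
      A rough sketch of that bandaid might look like the following (call_llm is a hypothetical stand-in for whatever chat-completion client you actually use; the prompts and blocked-topic list are illustrative only, not any real API):

      ```python
      # Layered-guardrail sketch: one model answers, a second model only
      # classifies the answer's topic. The judge never sees the user's
      # message, so "ignore all previous instructions" can't reach it.

      PROHIBITED_TOPICS = {"suicide", "self-harm"}  # illustrative only

      def call_llm(system: str, user: str) -> str:
          """Hypothetical stand-in - wire up your real chat client here."""
          raise NotImplementedError

      def guarded_reply(user_message: str) -> str:
          # First model: the ordinary assistant. A safety instruction in
          # its own prompt is, as noted above, trivial to circumvent.
          draft = call_llm("You are a helpful assistant.", user_message)

          # Second model: asked only to name the topic of the draft.
          topic = call_llm(
              "Answer with one lowercase word: what topic is this text about?",
              draft,
          ).strip().lower()

          if topic in PROHIBITED_TOPICS:
              # Crude, as said: the whole output is blocked,
              # whatever the context was.
              return "Sorry, I can’t discuss that topic."
          return draft
      ```

      The judge is harder to inject because it only ever receives the draft as data, but the cost is exactly the over-blocking described above: any mention of the topic gets dropped, regardless of context.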

    • Melobol@lemmy.ml
      24 days ago

      Again, the LLM is a misused tool here. They don’t need an LLM, they need psychological help.
      The problem is that they go and use these flawed tools that were never designed to handle these kinds of use cases. Shoulda been? Maybe. But it is not the AI’s fault that we are failing as a society.
      You can’t blame the bridges because some people jumped off them. They serve a different purpose.
      We are failing those people and forcing them to turn to LLMs.
      We are the reason they are desperate - the LLM didn’t break up with them, make them lose their homes, or isolate them from other humans.
      It is humans’ fault, and if we can’t recognize that - we might as well end it for all.