• handsoffmydata@lemmy.zip · 2 hours ago

    I never liked talking to a therapist. I could never connect or get past the feeling that this is their job and they don’t really care. When LLMs became available for dev projects a few years back, I thought there would be potential in this space, but considering how much sycophancy the models express, these types of applications seem legitimately dangerous.

  • Perspectivist@feddit.uk · 9 hours ago

    I don’t use it to ask for mental health advice but it’s nice to have “someone” to talk to that at least pretends to be interested in what I have to say. I used to have these conversations with myself inside my head. AI at least sometimes brings up a new perspective or says something novel.

    Inb4 “just get friends dude”

  • tgcoldrockn@lemmy.world · 4 hours ago

    “men choose to freely train ai with their life stories to secure technofascist state” might be a better headline

  • taiyang@lemmy.world · 15 hours ago

    Can confirm. My dad’s getting a little too into his AI on his phone. He’s got deep emotional problems and is an alcoholic, and I don’t think his bot is going to do him much good. That said, men’s egos make it hard to open up.

  • YogaDouchebag@lemmy.world · 18 hours ago

    AI and robots will have to take care of a lot of lonely or abandoned individuals for sure, since nobody is really interested in what others do or are going through.

    • wabafee@lemmy.world · 10 hours ago

      That is why there is a job for that. But I get you: talking to AI is free and very accessible compared to seeing your local therapist, where the act of booking is itself a huge barrier, and then there’s the money.

  • tiramichu@sh.itjust.works · 20 hours ago

    No shit.

    Other humans don’t want to hear about men’s mental health issues, because men are supposed to be stoic and strong and infallible, and if we aren’t achieving that, we’ve failed at being men.

    But AIs don’t judge, and they don’t cost anything either. I’m hardly surprised.

    • Xulai@mander.xyz · 20 hours ago

      You’re missing the point.

      Something or someone that agrees with you and rarely challenges or disagrees with you… is not something or someone that can help improve the situation and minimize recurrence.

      It only feels better momentarily.

      Like a drug. That costs money. See where this is going?

      • tiramichu@sh.itjust.works · 32 minutes ago

        I don’t personally speak with AI for reassurance, and I don’t think it’s a good idea to do so.

        In fact, I recently commented here on a post about a teen who committed suicide at least partly due to ChatGPT - specifically pointing out the danger of depending on a machine for fake empathy when you need to be talking to a real person.

        I appreciate I didn’t make that side of my position clear in the comment here in this thread, and that’s because it wasn’t the aspect I really wanted to highlight.

        My point isn’t that speaking to an AI is a good idea - it isn’t - it’s that this is something a lot of people will obviously end up doing, and that it is men especially who are liable to succumb to it the worst because of the way society expects men to behave.

        Men and teen boys especially struggle to voice their mental health problems to others, whether professionally or in their personal lives. So it’s no surprise they will leap at a “solution” that is free and keeps what they say private from anyone they know. But it’s not a solution, it’s a disaster in disguise.

        The thing that needs fixing here is the way mental health is stigmatised, which prevents people from speaking freely and getting the support they need. That’s not a new problem, it’s the same problem as ever, and what the AI epidemic is doing is simply shining a new spotlight on it.

  • RustyShackleford@lemmy.zip · 20 hours ago

    As a man in my 40s who sought mental help, I can say it’s actually pretty important. But no one should trust AI to fill in for a psychiatrist.

  • DrDystopia@lemy.lol · 20 hours ago

    I run my own LLM “AI” server at home because I want to try out various aspects and scenarios without having big tech snoop over my shoulder, and to be able to use any model I want.
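
    For anyone curious, a minimal sketch of what talking to a box like that can look like. It assumes an Ollama server on its default port, which is just one possible stack, and the model name is a placeholder:

    ```python
    # Minimal sketch: ask a self-hosted model a question over its local
    # HTTP API. Assumes an Ollama server on its default port; any stack
    # with a local endpoint works the same way. Nothing leaves the machine.
    import json
    import urllib.request

    def ask_local_model(prompt: str, model: str = "llama3") -> str:
        payload = json.dumps({
            "model": model,    # placeholder model name
            "prompt": prompt,
            "stream": False,   # one complete response instead of a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(ask_local_model("Give me a new perspective on my week."))
    ```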

    I can perfectly well see people getting good, working therapy from an LLM. But I think it would depend on the user taking the LLM seriously, and anybody with sufficient experience with LLMs simply doesn’t.

    So the people this could help are the people who shouldn’t be allowed near an “AI” interface…

    • comador@lemmy.world · 14 hours ago

      So the people this could help are the people who shouldn’t be allowed near an “AI” interface…

      Let’s see what this LLM says when I run this question 20,000 times from a clean prompt, then compare it against the same question posed more directly, run another 20,000 times. Then I can pick the answer I like better and run that against a different LLM and…

      So what you’re saying is that this is NOT what I am supposed to do?
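
      Joking aside, the brute-force ritual above is trivial to script. A sketch, reusing the hypothetical ask_local_model helper from the earlier comment and dialing 20,000 down to something sane:

      ```python
      # Sketch of the (tongue-in-cheek) procedure: ask the same question
      # from a clean context many times and tally the answers. Reuses the
      # hypothetical ask_local_model helper from the earlier sketch.
      from collections import Counter

      QUESTION = "Should I trust an LLM as a therapist? Answer in one word."

      tally = Counter(
          ask_local_model(QUESTION).strip().lower()
          for _ in range(20)  # each call starts fresh: no shared history
      )
      print(tally.most_common(3))
      ```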

  • burntbacon@discuss.tchncs.de · 21 hours ago

    I can understand it. A local LLM is not only more private than anything ever spoken aloud to another person, but there’s also the giant benefit that you don’t have to worry about the effect it will have on the other person. I know my past trauma would be painful to even listen to; I can’t imagine what some folks carry around with them.

    Part and parcel of the privacy is that you don’t have to deal with the judgement or shaming of others. It would be easy to get drawn into the affirmation of the LLM as well.