• tiramichu@sh.itjust.works

I don't personally speak with AI for reassurance, and I don't think it's a good idea to do so.

In fact, I recently commented here on a post about a teen who committed suicide at least partly due to ChatGPT - specifically pointing out the danger of depending on a machine for fake empathy when you need to be talking to a real person.

I appreciate that I didn't make that side of my position clear in my comment in this thread, and that's because it wasn't the aspect I really wanted to highlight.

My point isn't that speaking to an AI is a good idea - it isn't - it's that this is something a lot of people will obviously end up doing, and that men especially are liable to fall into it the hardest because of the way society expects men to behave.

Men, and teen boys especially, struggle to voice their mental health problems to others, whether professionally or in their personal lives. So it's no surprise they will leap at a "solution" that is free and keeps what they say private from anyone they know. But it's not a solution, it's a disaster in disguise.

The thing that needs fixing here is the way mental health is stigmatised, which prevents people from speaking freely and getting the support they need. That's not a new problem, it's the same problem as ever, and what the AI epidemic is doing is simply shining a new spotlight on it.