• Scolding7300@lemmy.world · 2 hours ago

        Depends on how you do it. If you’re using a 3rd-party service, then the LLM provider might not know (but the 3rd party might, depending on the ToS, retention period, and security measures).

        Of course we can all agree certain details shouldn’t be shared at all. There’s a difference between talking about your resume and leaking your email address there, versus discussing suicide, where you share the kind of information that makes you really vulnerable.

    • Halcyon@discuss.tchncs.de · 10 hours ago

      But imagine the chances for your own business! Absolutely no one will steal your ideas before you can monetize them.

      • WhatAmLemmy@lemmy.world · 17 hours ago

        Well, AI therapy is more likely to harm their mental health, up to and including encouraging suicide (as certain cases have already shown).

        • Cybersteel@lemmy.world · 16 hours ago

          Suicide is big business. There’s infrastructure readily available to reap financial rewards from the activity, at least in the US.

        • atmorous@lemmy.world · 17 hours ago

          More so from the corporate proprietary ones, no? At least I hope those are the only cases. The open-source ones suggest genuinely useful approaches that the proprietary ones don’t. I don’t rely on open-source AI, but they are definitely better.

          • SSUPII@sopuli.xyz · 10 hours ago

            The corporate models are actually much better at this, due to the heavy filtering built in. The claim that a model generally encourages self-harm is simply a lie, which you can prove right now by pretending to be suicidal on ChatGPT: it will adamantly push you to seek help.

            The filters and safety nets can be bypassed no matter how robust you make them, and that is why we got some unfortunate news stories.

        • whiwake@sh.itjust.works · 17 hours ago

          Real therapy isn’t always better. At least there you can get drugs. But neither is a guarantee of making life better, and for a lot of people, life isn’t going to get better anyway.

          • CatsPajamas@lemmy.dbzer0.com · 12 hours ago

            Real therapy is definitely better than an AI. That said, an AI will never encourage self-harm without significant gaming.

            • whiwake@sh.itjust.works · 7 hours ago

              AI “therapy” can be very effective without the gaming, but the problem is most people want it to tell them what they want to hear. Real therapy is not “fun” because a therapist will challenge you on your bullshit and not let you shape the conversation.

              I find it does a pretty good job with pro-and-con lists, laying out several options, and reframing situations. I have found it very useful, but I have learned not to manipulate it, or its advice just becomes me convincing myself of something.

            • triptrapper@lemmy.world · 9 hours ago

              I agree, and to the comment above yours: it’s not because it’s guaranteed to reduce symptoms. There are many ways in which talking with another person is good for us.

      • Scolding7300@lemmy.world · 17 hours ago

        Advertise drugs to them, perhaps, or some other way of taking advantage. If this sort of data is in the hands of an ad network, that is.