• prac@lemmy.world · 1 day ago

    This is messed up tbh. Using AI to undress people, especially kids, shouldn’t even be technically possible, let alone this easy.

    • Phoenixz@lemmy.ca · 1 day ago

      It’s technically possible because AI doesn’t exist. The LLMs we have do exist, and they have no idea what they’re doing.

      It’s a database that can parse human language and put pixels together from requests. It has no concept of child pornography; it’s just putting symbols together in a way it learned before, and those happen to form a child pornography picture.

      • IronBird@lemmy.world · 16 hours ago

        smh, back in my day we just cut out pictures of the faces of women we wanted to see naked and glued them on top of (insert goon magazine of choice)

        • Phoenixz@lemmy.ca · 16 hours ago

          Not AI as most people think of it, I guess I should have cleared that up.

          AI as we currently have it is little more than a specialized database.

      • prac@lemmy.world · 1 day ago

        This is a lot of words to basically say the developers didn’t bother to block illegal content. It doesn’t need to ‘understand’ morality for the humans running it to be responsible for what it produces.

        • Honytawk@feddit.nl · 13 hours ago

          Yeah, how hard is it to block certain keywords from being added to the prompt?

          We’ve had lists like that since the ’90s. Hardly new technology. It even prevents prompt hacking if you’re clever about it.
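
          For what it’s worth, here is a minimal sketch (in Python) of the kind of keyword blocklist described above. The list contents and the names BLOCKED_TERMS, normalize, and is_prompt_allowed are illustrative assumptions, not any real product’s filter, and a text-level check like this says nothing about paraphrases, other languages, or what the image model actually generates.

          ```python
          # Minimal sketch of a prompt keyword blocklist (illustrative only).
          # The blocked terms below are placeholders; a real deployment would
          # need a carefully maintained list and human review of edge cases.
          import re
          import unicodedata

          BLOCKED_TERMS = {"example blocked term", "another blocked term"}  # placeholders

          def normalize(text: str) -> str:
              """Lowercase, strip accents, and collapse separators so trivial
              obfuscations (dots, dashes, extra spaces) don't slip past the list."""
              text = unicodedata.normalize("NFKD", text)
              text = "".join(c for c in text if not unicodedata.combining(c))
              return re.sub(r"[\s._-]+", " ", text.lower()).strip()

          def is_prompt_allowed(prompt: str) -> bool:
              """Reject a prompt if any blocked term appears after normalization."""
              cleaned = normalize(prompt)
              return not any(term in cleaned for term in BLOCKED_TERMS)

          if __name__ == "__main__":
              print(is_prompt_allowed("an ordinary landscape painting"))   # True
              print(is_prompt_allowed("EXAMPLE   blocked-term request"))   # False
          ```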

        • Phoenixz@lemmy.ca · 16 hours ago

          Eh, no?

          It’s really, REALLY hard to know what content is, and to identify actual child porn even remotely accurately, even with AI.

        • 🔍🦘🛎@lemmy.world · 1 day ago

          Neither of you are wrong. LLMs are wild uncaged animals. You’re asking why we didn’t make a cage, and they’re saying we don’t even know how to make one yet.

          So, why are we letting the dangerous feral beast roam around unchecked?

            • HasturInYellow@lemmy.world · 23 hours ago

              We as a society have failed to implement those consequences. When the government refused, we should have taken up the mantle ourselves. It should be a mark of great virtue to have the head of a CEO mounted over your fireplace.

              • muusemuuse@sh.itjust.works · 20 hours ago

                Okay, I’ll take Zuckerberg over the TV if I can place used dildos in his mouth from time to time. Elon, on the other hand, might frighten the cat.

    • Allero@lemmy.today · 1 day ago

      I feel like our relationship to it is also quite messed up.

      AI doesn’t actually undress people; it just draws a naked body. It’s an artistic representation, not an X-ray. You’re not getting actual nudes in this process, and the AI has no clue what the person looks like naked.

      Now, such images can be used to blackmail people, because our culture hasn’t quite caught up with the fact that any nude image can absolutely be an AI-generated fake. When it does, however, I fully expect the creators of such things to be seen as odd creeps spreading their fantasies around, and any nude imagery to be seen as fake by default.

      • nullroot@lemmy.world · 1 day ago

        It’s not an artistic representation, it’s worse. It’s algorithmic and to that extent it actually has a pretty good idea of what a person looks like naked based on their picture. That’s why it’s so disturbing.

        • aesthelete@lemmy.world · 1 day ago

          Yeah, they probably fed it a bunch of legitimate on/off content, as well as stuff from people who used to make “nudes” from celebrity photos with sheer/skimpy outfits as a creepy hobby.

            • Allero@lemmy.today · 1 day ago

              Honestly, I’d love to see more research on how AI CSAM consumption affects consumption of real CSAM and rates of sexual abuse.

              Because if it does reduce them, it might make sense to intentionally use datasets already involved in previous police investigations as training data. But only if there’s a clear reduction effect with AI materials.

              (Police have already used some materials, with victims’ consent, to crack down on CSAM-sharing platforms in the past.)

      • prac@lemmy.world · 1 day ago

        Idk, calling it ‘art’ feels like a reach. At the end of the day, it’s using someone’s real face for stuff they never agreed to. Fake or not, that’s still a massive violation of privacy.