While Grok has introduced belated safeguards to prevent sexualised AI imagery, other tools have far fewer limits

“Since discovering Grok AI, regular porn doesn’t do it for me anymore, it just sounds absurd now,” one enthusiast for the Elon Musk-owned AI chatbot wrote on Reddit. Another agreed: “If I want a really specific person, yes.”

If those who have been horrified by the distribution of sexualised imagery on Grok hoped that last week’s belated safeguards could put the genie back in the bottle, there are many such posts on Reddit and elsewhere that tell a different story.

And while Grok has undoubtedly transformed public understanding of the power of artificial intelligence, it has also pointed to a much wider problem: the growing availability of tools, and means of distribution, that present worldwide regulators with what many view as an impossible task. Even as the UK announces that creating nonconsensual sexual and intimate images will soon be a criminal offence, experts say that the use of AI to harm women has only just begun.

  • arin@lemmy.world · 1 day ago

    Photoshop has existed since the ’90s, and so have scissors and glue. It’s not AI harming women, it’s the shitty retarded conservatives that don’t know how to use Photoshop or have any creativity for scissors and glue, using new technology to be the same retarded clown they were when they failed primary school.

    • TheRealKuni@piefed.social · 1 day ago

      Nonsense. AI makes the process trivial and, with extreme ease, more realistic than the photoshop/scissors and glue of yore.

      Could you imagine finding out kids at your school were passing around extremely realistic nude pictures of you? Or having any argument you make be shut down by something producing a lurid picture of you? Even if it’s fake, that’s gotta do a number on people.

      This is different.

      • hector@lemmy.today · 6 hours ago

        It is different, but why are we only worried about women being victimized? Men are not fair game to abuse either, and are no more responsible for the accumulated sins of men than a citizen is for their government.

        • TheRealKuni@piefed.social · 5 hours ago

          I didn’t say anything about women or men. I never said men are to blame for the accumulated sins of men. I never said men were fair game to abuse. I’m not sure where you’re getting this nonsense.

      • FishFace@piefed.social · 1 day ago

        Why are people concerned about having a fake (but realistic) nude photo of themselves being shared around?

        Not because people are looking at their actual naked body, obviously, because they aren’t. Rather, it’s because of what the people sharing those images are thinking and feeling while doing so; it’s because those people are sharing fake nudes as a way to sexually demean their victim. That aspect is wholly identical regardless of how exactly they are doing it. Sharing fake nudes should be treated the same regardless of the method: as sexual bullying. Maybe we didn’t recognise how serious it was when it was rare and required effort, but we also shouldn’t over-correct now.

        • MountingSuspicion@reddthat.com · 1 day ago

          Also, AI-generated images are becoming increasingly indistinguishable from real ones. If someone shares revenge porn but claims it’s AI, the victim should not have to prove it one way or the other. Currently, I think real and AI images should be treated the same, though it’s possible I’m overlooking some unintended consequences of that.

    • Kühlschrank@lemmy.world · 1 day ago

      I hear that argument a lot, but the old method required access to the software and some actual skill with it. With Grok, any smooth brain who can write at a fifth-grade level has the ability to publicly victimize the women and/or girls in their life.