A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

  • kent_eh@lemmy.ca · 8 months ago

    > People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me.

    Because now it’s faster, can be generated in bulk and requires no skill from the person doing it.

    • ArmokGoB@lemmy.dbzer0.com · 8 months ago

      I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.

    • Vespair@lemm.ee · 7 months ago

      > no skill from the person doing it.

      This feels entirely like a non sequitur, to the point of damaging any point you’re trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints a nude, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm between our artistic skill.

    • Bob Robertson IX @discuss.tchncs.de · 8 months ago

      A kid at my high school in the early ’90s would use a photocopier to literally cut and paste yearbook headshots onto porn photos. That could also be done in bulk, and it didn’t require any skills a first-grader doesn’t have.

      • ChexMax@lemmy.world · 8 months ago

        Those are easily disproven. There’s no way you think that’s the same thing. If you can pull up the source photo and it’s a clear match/copy for the fake, it’s easy to disprove. AI can alter the angle, position, and expression on your face in a believable manner, making it a lot harder to link the photo to the source material.

        • Bob Robertson IX @discuss.tchncs.de · 8 months ago

          This was before Google was a thing, much less reverse lookup with Google Images. The point I was making is that this kind of thing happened even before Photoshop; Photoshop just made it look more realistic, and AI is the next step. Even current AI capabilities are nothing compared to what they will be just six months from now. Yes, this is a problem, but it has been a problem for a long time, and anyone who wanted to create fake nudes of someone has been able to do so easily for at least a generation. We might be at the point now where the only way to make sure fake nudes aren’t created of you is to have no images of yourself published. But now that everyone has a high-quality camera in their pocket, even that won’t protect you 100%.

    • Dkarma@lemmy.world · 8 months ago

      Not relevant. Using someone’s picture never ever required consent.