Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers.

In September alone, 24 million people visited undressing websites, according to the social network analysis company Graphika.

Many of these undressing, or “nudify,” services use popular social networks for marketing, according to Graphika. For instance, since the beginning of this year, the number of links advertising undressing apps on social media, including X and Reddit, has increased more than 2,400%, the researchers said. The services use AI to recreate an image so that the person appears nude. Many of them work only on images of women.

These apps are part of a worrying trend of non-consensual pornography being developed and distributed because of advances in artificial intelligence — a type of fabricated media known as deepfake pornography. Its proliferation runs into serious legal and ethical hurdles, as the images are often taken from social media and distributed without the consent, control or knowledge of the subject.

  • Meowoem@sh.itjust.works · +18 / −2 · 9 months ago

    Another day, another puritan panic.

    Yes, we should ban open source and consumer graphics cards and everything else to stop the possibility that someone might have a sexual thought; in fact, all humans should be blinded at birth to avoid this!

    Won’t someone think of the children!!!

  • GregorGizeh@lemmy.zip · +17 / −1 · 9 months ago

    This panic over fake porn is the wrong response. In fact, encourage it: make it so ubiquitous that there is always fake porn of everyone, everywhere, and nobody gives two shits about nude leaks or revenge porn anymore.

    • JGrffn@lemmy.world · +8 / −2 · 9 months ago

      Surely you can see how this also isn’t a fitting solution. Just… go down the age brackets and see how increasingly uncomfortable it all becomes to tolerate this. There have already been cases of AI porn of high schoolers made by high schoolers. We can keep going down the victim age line, or up the perpetrator age line. It gets bad pretty fast regardless of how ubiquitous this might become in the future.

      • GregorGizeh@lemmy.zip · +8 · 9 months ago

        There isn’t another answer, though. The tech is there, and it will only get worse, to the point where we can’t recognize a fake anymore. It makes much more sense to lean into it and make sure, before that point is reached, that having that stuff circulated doesn’t ruin people’s lives anymore.

      • Radioactive Radio@lemm.ee · +3 · 9 months ago

        Even setting the age brackets aside, porn of someone you know is just weird. We’ve progressed too far in the wrong direction. Imagine if people put this much effort into curing cancer.

  • tegs_terry@feddit.uk · +9 · 9 months ago

    It’ll work in their favour eventually. There won’t be any more revenge porn because it can just be dismissed as fake immediately.

  • paddirn@lemmy.world · +9 · 9 months ago

    Combine this with AR goggles and you really will be able to see everyone else naked when you’re giving a speech.

  • 𝐘Ⓞz҉@lemmy.world · +10 / −2 · 9 months ago

    Disgusting! Where are people downloading this app from so that I can avoid the website or that app store?

    • SchizoDenji@lemm.ee · +3 · 9 months ago

      Models from CivitAi, with Automatic1111 or ComfyUI as the front end. The rest is just a learning curve for all the features and prompting.

  • guyrocket@kbin.social · +7 · 10 months ago

    This article seems to imply that this cat can go back in the bag. I’m not at all sure about that.

  • CleoTheWizard@lemmy.world · +6 · 9 months ago

    I’ve made this comment in other places; I’ll make it here too. This tech changes much less than people think. If anything, it will protect people from the leaking of real nudes, because people will assume they’re probably fake.

    This is just an advanced form of a fantasy that has existed for as long as Photoshop has. And importantly, we will deal with these nude photos the exact same way we deal with real ones.

    Meaning: if you catch people distributing fake photos of their classmates, the punishment should be the same as if the photos were real. And it needs to be severe.

    The reason this is a problem, and the reason I’m concerned for young women, is that protections against online sexual harassment are already abysmal. This will make things worse, and since we don’t protect women very well in the US, I expect major issues. Basically, the problems aren’t new, but our lack of action will make them awful. Treated correctly, this is a non-issue and these photos stay private.

  • Karlos_Cantana@kbin.social · +5 / −2 · 9 months ago

    Nowhere does the article mention what the apps are. I’d just like to know what they are so I can avoid them.

    • DominusOfMegadeus@sh.itjust.works · +2 · 9 months ago

      I actually thought about this. I even turned on the VPN and googled these sites. NGL, I thought about people I know. Then I realized that as soon as I uploaded my images, or theirs, and did this, both the originals and anything generated would be out there and could never be pulled back. Who knows where they might pop up, or what they might be turned into? And then I got kinda scared and closed that shit out.

  • SPRUNT@lemmy.world · +0 · 10 months ago

    By 2030 we’ll have AR glasses with cameras, microphones, speakers, and a built-in AI assistant that will digitally remove everyone’s clothing in real time.

  • queermunist she/her@lemmy.ml · +2 / −6 · 10 months ago

    Okay, so we need to get unique codes tattooed onto our genitals. That way, if your nudes show up, you can always know for sure whether they’re real or fake (and, importantly, who tf leaked your nudes).