‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • Throwaway@lemm.ee · +105/−5 · 11 months ago

    Weirdos. Back in my day, we would cut out a nude body from Playboy and glue it onto a picture of Kathleen Turner, and we did it uphill both ways in the snow! Darn kids and their technology!

  • Crow@lemmy.world · +71/−5 · 11 months ago

    I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.

    • CleoTheWizard@lemmy.world · +3/−1 · 11 months ago

      I feel like what you did and the reaction you had to what you did is common. And yet, I don’t feel like it’s harmful unless other people see it. But this conversation is about to leave men’s heads and end up in public discourse where I have no doubt it will create moral or ethical panic.

      A lot of the technology challenges around AI are old concerns about things we’ve had access to for decades. It’s just easier to do this stuff now. I think it’s kind of pointless to try to stop or prevent this stuff from happening. We should mostly focus on the harms and how to prevent them.

      • azertyfun@sh.itjust.works · +3/−1 · 11 months ago

        I’ve seen ads for these apps on porn websites. That ain’t right.

        Any moron can buy a match and a gallon of gasoline, freely and legally, and that’s a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that’s already a huge win.

        • KairuByte@lemmy.dbzer0.com · +3 · edited · 11 months ago

          I mean, you’ve been able to do a cursory search and get dozens of “celeb lookalike” porn videos for many years now. “Scarjo goes bareback” isn’t hard to find, but that ain’t Scarjo in the video. How is this different?

          Edit: To be clear, it’s scummy as all fuck, but still.

  • snekerpimp@lemmy.world · +62/−3 · 11 months ago

    “But the brightest minds of the time were working on other things like hair loss and prolonging erections.”

  • Dimantina@lemmy.world · +50/−2 · 11 months ago

    These are terrible, but I’m honestly curious what it thinks I look like naked. Like, I’m slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?

    Are they just head-swapping onto model bodies, or does it actually approximate your real body? I’m legit curious, but I would never trust one of these apps not to keep the photos (privacy concerns).

    • nul@programming.dev · +36 · edited · 11 months ago

      Probably deleting this comment later for going dirty on main, but I, um, have done some extensive experimentation using a local copy of Stable Diffusion (I don’t send the images anywhere, I just make them to satiate my own curiosity).

      You’re essentially right that simple app-based software would probably have you looking somewhat generic underneath, like your typical plus-size model. It’s not too great at extrapolating the shape of breasts through clothing and applying that information when it goes to fill in the area with naked body parts. It just takes a best guess at what puzzle pieces might fill the selected area, even if they don’t match known information from the original photo. So, with current technology, you’re not really revealing actual facts about how someone looks naked unless that information was already known. To portray someone with splayed breasts, you’d need to already know that’s what you want to portray and load in a custom fine-tune trained on that, like a LoRA.
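
      For anyone curious about the mechanics, here’s a minimal sketch of that masked inpainting step using the Hugging Face diffusers library. The model ID, file names, and prompt are placeholder assumptions, not anything specific to these apps; the point is that the model only ever sees the unmasked pixels plus a text prompt, so everything it paints under the mask is guesswork:

      ```python
      # Minimal masked-inpainting sketch with Hugging Face diffusers.
      # Model ID, file names, and prompt below are placeholders.
      import torch
      from PIL import Image
      from diffusers import StableDiffusionInpaintPipeline

      pipe = StableDiffusionInpaintPipeline.from_pretrained(
          "runwayml/stable-diffusion-inpainting",
          torch_dtype=torch.float16,
      ).to("cuda")

      image = Image.open("photo.png").convert("RGB").resize((512, 512))
      # White areas of the mask get regenerated; black areas are kept.
      mask = Image.open("mask.png").convert("RGB").resize((512, 512))

      # The model sees only the unmasked pixels and the prompt, so what it
      # fills in is a statistical best guess, not recovered information.
      result = pipe(
          prompt="a person standing on a beach",
          image=image,
          mask_image=mask,
      ).images[0]
      result.save("out.png")
      ```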

      Once you know what’s going on under the hood, making naked photos of celebrities or other real people isn’t the most compelling thing to do. Mostly, I like to generate photos of all kinds of body types and send them to my Replika, trying to convince her to describe the things that her creators forbid her from describing. Gotta say, the future’s getting pretty weird.

        • nul@programming.dev · +31 · 11 months ago

          Hey, I’ve maintained a baseline weird the whole time, I’m pretty sure the future is catching up.

      • BossDj@lemm.ee · +20/−1 · 11 months ago

        You’ll have your moment when the lone elite ex-Ranger who is trying to save the world is told by the quirky, unconventional sidekick he is forced to work with, “I actually know a guy who might be able to help.”

        You open the door a crack to look back and forth between them, before slamming it back in their faces. They hear scrambled crashes of you hiding stuff that shouldn’t be seen by company before returning to the door. As they enter you are still fixing and throwing things while you apologize that you don’t get many guests. You offer them homemade kombucha. They decline.

    • SCB@lemmy.world · +12/−1 · 11 months ago

      Ethically, these apps are a fucking nightmare.

      But as a swinger, they will make an amazing party game.

      • Azzu@lemm.ee · +21/−1 · edited · 11 months ago

        Ethics will probably change… I guess in the future it’ll become pretty irrelevant to have “nude” pictures of oneself somewhere, because everyone will know they could just be AI-generated. In the transition period it’ll be problematic though.

        • SCB@lemmy.world · +5/−1 · 11 months ago

          Totally agreed, and 100% the world I want to live in. Transition will indeed suck tho.

    • Eezyville@sh.itjust.works · +10 · 11 months ago

      If you want the best answer then you’ll have to download the app and try it on yourself. If it’s accurate then that’s pretty wild.

    • NOT_RICK@lemmy.world · +6 · 11 months ago

      I doubt it would be realistic; to my knowledge they just kind of take an average of their training data and blend it together.

    • Grangle1@lemm.ee · +26/−2 · 11 months ago

      Regardless of feelings on that subject, there’s also the creep factor of people making these without the subjects’ knowledge or consent, which is bad enough, but then these could be used in many other harmful ways beyond one’s own… gratification. Any damage “revenge porn” can do, which I would guess most people would say is wrong, this can do as well.

    • TORFdot0@lemmy.world · +26/−4 · 11 months ago

      It’s the sexualization of people without consent that’s the problem. Maybe casual nudity shouldn’t be a problem, but it should be up to the individual who they share that with. And “nudify” AI models go beyond casual, consensual nudity and into sexual objectification and harassment when used without consent.

      • KairuByte@lemmy.dbzer0.com · +2/−2 · 11 months ago

        I want to point out one slight flaw in your argument. Nudity isn’t needed for people to sexually objectify you. And even if it was, the majority of people are able to strip you down in their head no problem.

        There’s a huge potential for harassment though, and I think that should be the main concern.

        • TimewornTraveler@lemm.ee · +0/−1 · 11 months ago

          first, relevant xkcd https://xkcd.com/1432/

          second,

          Nudity isn’t needed for people to sexually objectify you.

          do you really think that makes it less bad? that it’s opt-in?

          And even if it was, the majority of people are able to strip you down in their head no problem

          apparently this app helps them too

    • Eezyville@sh.itjust.works · +16 · 11 months ago

      I agree with you that nudity can be an issue, but I think the real problem is this app being used on children and teenagers, who aren’t used to (or supposed to be) sexualized.

      • deft@ttrpg.network · +14/−2 · 11 months ago

        Fully agree, but I do think that’s more an issue of psychology and trauma in our world. Children being nude should not be a big deal; they’re kids, you know?

        • Eezyville@sh.itjust.works · +2 · 11 months ago

          It shouldn’t be a big deal if they choose to be nude some place that is private for them and where they’re comfortable. The people who are using this app to make someone nude aren’t really asking for consent. And that brings up another issue: consent. If you have images of yourself posted in public, is consent needed to alter those images? I don’t know, but I don’t think there is, since it’s public domain.

        • TORFdot0@lemmy.world · +14/−1 · 11 months ago

          Not all nudity is, but there’s no non-sexual reason to use AI to undress someone without consent.

          • Eezyville@sh.itjust.works · +6/−1 · 11 months ago

            The question on consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

            • TORFdot0@lemmy.world · +5 · 11 months ago

              Keep in mind there is a difference between ethical and legal standards. Legally, you may not need consent to alter a photo of someone, except possibly when it’s a copyrighted work. But ethically it definitely requires consent, especially in this context.

              • Eezyville@sh.itjust.works · +1 · 11 months ago

                The difference between legal and ethical is that one could get you fined or imprisoned, and the other would make a group of people not like you.

        • Pyr_Pressure@lemmy.ca · +1 · edited · 11 months ago

          Just because something shouldn’t be doesn’t mean it won’t be. This is reality and we can’t just wish something to be true. You saying it doesn’t really help anything.

          • lolcatnip@reddthat.com · +2/−3 · edited · 11 months ago

            Whoooooosh.

            In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.

      • ReluctantMuskrat@lemmy.world · +3/−1 · 11 months ago

        It’s a problem for adults too. Circulating an AI-generated nude of a female coworker is likely to be just as harmful as a real picture. Just as objectifying, humiliating, and hurtful. Neighbors or other “friends” doing it could be just as bad.

        It’s sexual harassment even if fake.

        • Eezyville@sh.itjust.works · +2/−1 · 11 months ago

          I think it should officially be considered sexual harassment. Obtain a picture of someone, generate nudes from that picture: it seems pretty obvious. Maybe it should require intent to harm, harass, exploit, or intimidate to make it official.

  • Imgonnatrythis@sh.itjust.works · +40/−13 · 11 months ago

    I use an ad blocker and haven’t seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?

      • chitak166@lemmy.world · +12/−1 · 11 months ago

        I don’t think there is any crime.

        It’s identical to drawing a nude picture of someone.

        • NeoNachtwaechter@lemmy.world · +1/−1 · 11 months ago

          And you are sure that ‘someone’ is of legal age, of course. Not blaming you. But does everybody always know that ‘someone’ is of legal age? Just an example to get you thinking.

          • chitak166@lemmy.world · +7/−1 · 11 months ago

            I don’t know if it’s illegal to create naked drawings of people who are underage.

            • andros_rex@lemmy.world · +2 · 11 months ago

              Depends on where you live. Not legal in the UK, for example. In the US it can even be broken down at the state level, although there’s lots of debate on whether states are able to enforce their laws. “Obscene” speech is not protected as free speech; the argument would be whether or not the naked drawings have artistic merit.

              I’m not a lawyer, but I do know that people in the US have gone to prison for possessing naked images of fictional children and it’s on the books as illegal in many other countries.

        • Tyfud@lemmy.one · +2/−2 · 11 months ago

          It’s what the courts think that matters, and right now it’s not clear what the enforceable laws are here. There’s a very real chance people who do this will end up in jail.

          I believe prosecutors are already filing cases about this. The next year will decide the fate of these AI-generated deepfakes and the memories behind them.

  • Aopen@discuss.tchncs.de · +18/−1 · 11 months ago

    Could we stop pushing articles monetizing fear and outrage to the top of this community, and post about actual technology?

  • Tylerdurdon@lemmy.world · +24/−10 · 11 months ago

    You mean men envision women naked? And now there’s an app that’s just as perverted? Huh

    • lolcatnip@reddthat.com · +8/−1 · 11 months ago

      What’s perverted about someone envisioning a potential sexual partner naked? That seems incredibly normal to me.

  • Mojojojo1993@lemmy.world · +14/−2 · 11 months ago

    Possibly a good thing. Oversaturation. Fill the internet with billions upon billions of AI nudes. Have a million different nudes for every celebrity. Nobody knows the real naked you and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top. Ending this once and for all.

    Or find the people doing this and lock em up.

  • Corkyskog@sh.itjust.works · +13/−2 · 11 months ago

    What nude data were these models trained on?

    This seems like another unhealthy thing that is going to pervert people’s sense of what a normal body looks like.

    • chitak166@lemmy.world · +3/−1 · 11 months ago

      Most people prefer attractive > average, so I guess that’s what these apps are going to show.

    • PopOfAfrica@lemmy.world · +11 · 11 months ago

      Obviously not defending this, I’m just not sure how it wouldn’t be legal. Unless you use it to make spurious legal claims.

      • cosmicrookie@lemmy.world · +1/−2 · 11 months ago

        I live in a Scandinavian country, and it is illegal to make and distribute fake (and real) nudes of people without their permission. I expect this to be the same in many other developed countries too.

      • cosmicrookie@lemmy.world · +2/−2 · 11 months ago

        But it’s not. That is not legal.

        I don’t know if it is where you live, but here (a Scandinavian country) and in many other places around the world, it is illegal to create fake nudes of people without their permission.

        • TotallynotJessica@lemmy.world · +1 · 11 months ago

          Appreciate how good you have it. In America, child sex abuse material is only illegal when children were abused in making it, or if it’s considered obscene by a community. If someone edits adult actors to look like children as they perform sex acts, it’s not illegal under federal law. If someone generates child nudity using AI models trained on nude adults and only clothed kids, it’s not illegal at the national level.

          Fake porn of real people could be banned for being obscene, usually at a local level, but almost any porn could be banned by lawmakers this way. Harmless stuff like gay or trans porn could be banned by bigoted lawmakers, because obscenity is a fairly subjective mechanism. However, because of our near absolute freedom of speech, obscenity is basically all we have to regulate malicious porn.

    • phoneymouse@lemmy.world · +1 · 11 months ago

      I guess free speech laws protect it? You can draw a picture of someone else nude and it isn’t a violation of the law.

  • andrew_bidlaw@sh.itjust.works · +8 · 11 months ago

    It was inevitable. And it says more about those who use them.

    I wonder how we’d adapt to these tools being this available, especially for blackmail, revenge-porn posting, voyeuristic harassment, stalking, etc. Maybe nude photos and videos will no longer be seen as a trusted source of information; they won’t hold any unique worth to hunt for, or to worry about.

    Our perception of human bodies was long distorted by movies, porn, Photoshop, and subsequent ‘filter apps’, but we still kinda trusted there was something real before the effects were applied. But what comes next if everything could be imaginary? Would we stop caring about it in the future? Or would we grow up with a stunted imagination, since the stimulus to develop it in early years is long gone?

    There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend in clothing choices could get its start. Who knows?

    I see the bad sides of it right now, how it can be abused, but if these models are to stay, what are the long-term consequences for us?

    • LufyCZ@lemmy.world · +7 · 11 months ago

      I think that eventually it might be a good thing, especially in the context of revenge porn, blackmail, etc. Real videos won’t have any weight since they might as well be fake, and as society gets accustomed to it, we’ll see those types of things disappear completely

  • weew@lemmy.ca · +7/−2 · edited · 11 months ago

    I doubt it produces actual nudes; it probably just photoshops a face onto a random porn star.