‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

    • cosmicrookie@lemmy.world
      10 months ago

      But it’s not. That is not legal.

      I don’t know if it is where you live, but here (a Scandinavian country) and in many other places around the world, it is illegal to create fake nudes of people without their permission.

      • TotallynotJessica@lemmy.world
        10 months ago

        Appreciate how good you have it. In America, child sex abuse material is only illegal when children were abused in making it, or if it’s considered obscene by a community. If someone edits adult actors to look like children as they perform sex acts, it’s not illegal under federal law. If someone generates child nudity using AI models trained on nude adults and only clothed kids, it’s not illegal at the national level.

        Fake porn of real people could be banned for being obscene, usually at a local level, but almost any porn could be banned by lawmakers this way. Harmless material like gay or trans porn could be banned by bigoted lawmakers, because obscenity is a fairly subjective mechanism. However, because of our near-absolute freedom of speech, obscenity is basically all we have to regulate malicious porn.