A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”


  • quindraco@lemm.ee · 9 months ago

    It’s not defamation. And the new law will likely fail to hold up to 1A scrutiny, if the description of it is accurate (it often is not, for multiple reasons, including that these bills generally change over time). This is more of a free speech issue than photoshopping someone’s head onto someone else’s nude body, because no real person’s head or body is involved — it’s just an inhumanly good artist drawing a nude — and on top of that the law punishes possession, not just creation.

    An example question any judge is going to have for the prosecutor if this goes to trial is how the image the law bans is meaningfully different from writing a lurid description of what someone looks like naked without actually knowing. Can you imagine going to jail because you have in your pocket a note someone else wrote and handed you that describes Trump as having a small penis? Or a drawn image of Trump naked? Because that’s what’s being pitched here.

    • Dr. Moose@lemmy.world · 9 months ago

      It actually proposes “possession with the intent to distribute,” which just shows what a meme law this is. How do you determine intent to distribute for an image?

      And I disagree with your take that this can’t be defamation. A quick googling suggests the general consensus is that this would fall under the defamation family of laws, which makes sense, since a deepfake is an intentional misrepresentation.

      • Sagifurius@lemm.ee · 9 months ago

        I guess if you have AI generate the Speaker of the House getting fucked in the ass in an alley full of trash while she holds money bags, it’s then political satire and protected?