A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”

  • Merlin404@lemmy.world

    Tragic that it took a celebrity going through it for them to do something. But when children or others have it happen to them, they just shrug…

      • Viking_Hippie@lemmy.world

        Probably helps that she’s super white too.

        This has been happening to AOC constantly since before she was first sworn in and it’s been crickets.

        When it happens once to the media’s favourite white billionaire, though? THAT’S when they start to take it seriously.

  • Dr. Moose@lemmy.world

    What a weird populist law tbh. There’s already an established legal framework that covers this: defamation. Not a lawyer, but it seems like that should be applied instead of writing up some new memes.

    They’ll use this as an opportunity to sneak in more government spyware/control is my guess.

    • quindraco@lemm.ee

      It’s not defamation. And the new law will likely fail to hold up to 1A scrutiny, if the description of it is accurate (it often is not, for multiple reasons that include these bills generally changing over time). This is more of a free speech issue than photoshopping someone’s head onto someone else’s nude body, because no real person’s head or body is involved, just an inhumanly good artist drawing a nude, and on top of that the law punishes possession, not just creation.

      An example question any judge is going to have for the prosecutor if this goes to trial is how the image the law bans is meaningfully different from writing a lurid description of what someone looks like naked without actually knowing. Can you imagine going to jail because you have in your pocket a note someone else wrote and handed you that describes Trump as having a small penis? Or a drawn image of Trump naked? Because that’s what’s being pitched here.

      • Dr. Moose@lemmy.world

        It actually proposes “possession with the intention to distribute”, which just shows what a meme law this is. How do you determine intent to distribute for an image?

        And I disagree with your take that this can’t be defamation. Quick googling says the general consensus is that this would fall in the defamation family of laws which makes absolute sense since a deepfake is an intentional misrepresentation.

        • Sagifurius@lemm.ee

          I guess if you have AI generate the senate house speaker fucking her in the ass in an alley full of trash while she holds money bags, it’s then political satire and protected?

    • General_Effort@lemmy.world

      Even better: Intentional infliction of emotional distress

      There are business interests behind this. There is a push to turn a likeness (and voice, etc.) into an intellectual property. This bill is not about protecting anyone from emotional distress or harm to their reputation. It is about requiring “consent”, which can obviously be acquired with money (and also commercial porn is an explicit exception). This bill would establish this new kind of IP in principle. It’s a baby step but still a step.

      You can see in this thread that proposing to expand this to all deepfakes gets a lot of upvotes. Indeed, there are bills out there that go all the way and would even make “piracy” of this IP a federal crime.

      Taylor Swift could be out there, making music or having fun, while also making money from “her consent”, i.e. by licensing her likeness. She could star in movies or make cameos by deepfaking herself onto some nobody actor. She could license all sorts of YouTube channels. Or how about a webcam chat with Taylor? She could be an avatar for ChatGPT, or she could be deepfaked onto one of those Indian or Kenyan low-wage workers who do tech support now.

      We are not quite there yet, technologically, but we will obviously get there soonish. Fakes in the past were just some pervs who were making fan art of a sort. Now the smell of money is in the air.

      • Dr. Moose@lemmy.world

        This seems like the most likely scenario tbh. I’m not sure whether personal likeness IP is a bad thing per se but one thing is sure - it’s not being done to “protect the kids”.

        • General_Effort@lemmy.world

          personal likeness IP is a bad thing

          It is. It means that famous people (or their heirs, or maybe just the rights-owner) can make even more money from their fame without having to do extra work. That should be opposed out of principle.

          The extra money for the licensing fees has to come from somewhere. The only place it can come from is working people.

          It would mean more inequality; more entrenchment of the current elite. I see no benefit to society.

          • Dr. Moose@lemmy.world

            Not necessarily. I’m optimistic that this could lead to empowering status and personality as the main resources and push money out of society.

            • General_Effort@lemmy.world

              How so? Fame is already a monetizable resource. The main changes that I see are that 1) no opportunity to show their face and make their voice heard needs to be missed for lack of time, and 2) age no longer needs to be a problem.

    • doctorcrimson@lemmy.world

      When you steal a person’s likeness for profit or defame them, then that’s a CIVIL matter.

      This bill will make AI sexualization a CRIMINAL matter.

      • Dr. Moose@lemmy.world

        Where do you see that?

        The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.

        • doctorcrimson@lemmy.world

          Here:

          A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence.

          • Dr. Moose@lemmy.world

            That doesn’t seem to be correct. More like sloppy wording, as “criminalize” =/= criminal law.

  • sphericth0r@kbin.social

    I believe libel laws already exist, but when you’re in Congress you must make laws in a reactionary way otherwise considered thought and reason might begin to permeate the law. We wouldn’t want that.

    • FuglyDuck@lemmy.world

      there is a place for deep fakes in satire (albeit they should be labeled as such).

      • AlternatePersonMan@lemmy.world

        I agree with the right to satire, but probably not as a deep fake. Comics, skits, etc., sure. Deep fakes are too convincing for an alarming number of folks.

        • FuglyDuck@lemmy.world

          so how do you feel about skilled impersonators?

          what if they’re convincing? or are we going to allow just the shitty ones? or only if they offend the subject?

          what you’re proposing is a very slippery slope.

            • FuglyDuck@lemmy.world

              Nope. Defamation requires malicious intent to be illegal. It also requires more or less blatant lies to be maintained.

              Particularly since most satire and most impersonators go to reasonable lengths to ensure that there is minimal confusion as to reality.

          • Zahille7@lemmy.world

            I’m so glad someone posted a link to Sassy Justice. I thought it was a hilarious little experiment from the South Park guys

        • MagicShel@programming.dev

          An alarming number of folks think the world is flat and the moon is made of cheese. We need a better standard than that.

              • MagicShel@programming.dev

                The folks who think it’s made of cheese also think we faked the moon landing.

                Which raises a question… Could someone press for moon landing proof to be suppressed on the grounds that they believe it is a deep fake? I guess that depends on how sexy you find moon cheese.

  • leaky_shower_thought@feddit.nl

    individuals who produced or possessed the forgery with intent to distribute it

    this is going to be a wild ride.

    there’s a scenario where the creator is not the leaker, but angry people with pitchforks won’t even care about the distinction.

    • Asafum@lemmy.world

      This is exactly what has me irritated about this whole nonsense… People have been doing that since Photoshop existed, but big scary AI is in the news now so we are going to attack it full force because people are using it in the way they’ve used everything that has similar capabilities…

      Still no action on our actual issues though, just some performative bullshit to assist the truly needy of our society, billionaires…

    • General_Effort@lemmy.world

      Only if they do it badly. The bill defines as a “digital forgery” anything made with “technological means”:

      to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual

  • Serinus@lemmy.world

    I don’t get it. Why care? It’s not her.

    Maybe if they’re making money off of her likeness. But without a money trail it just seems like chasing ghosts for not much reason.

    • shiroininja@lemmy.world

      Because it’s gross, and they do it to minors now. And all they need are pictures of your kids from your social media profile. They even use AI to undress them.

      • MagicShel@programming.dev

        Generating sexual images of minors is already illegal. And these images can be generated by anyone modestly technical on their own computer, so you can’t go after people for creating or possessing the images (except if they look too young), only for distribution.

        This is unfortunately theater and will do basically nothing. How does a person even know if they are deep fakes? Or consensual? Hell, what’s too close of a likeness? Because some of those images didn’t look that much like her, and at least one wasn’t even realistic.

        I’m not saying it’s cool people are doing this, just that enforcement of this law is going to be a mess. You wind up with weird standards like how on Instagram you can show your labia but only through sheer material. Are deep fakes fine if you run them through an oil painting filter?

        • yamanii@lemmy.world

          Are deep fakes fine if you run them through an oil painting filter?

          Probably since nobody could mistake an oil painting for the real person, it’s not a deep fake anymore.

          • MagicShel@programming.dev

            I have about a 99% success rate at identifying AI full body images of people. People need to learn to look better. They look just as fake as the oil paintings.

              • MagicShel@programming.dev

                I think that’s relevant when the defense against oil paintings is that you can tell they aren’t real. The line can’t be “you can’t tell they are fake” because… well… you can identify AI artwork 99% of the time and the other 1% is basically when the pose is exactly so to conceal the telltale signs and the background is extremely simple so as to give nothing away.

      • fishos@lemmy.world

        And here we have the real answer: prudishness. “It’s gross.” And of course “think of the children”. You don’t have a real answer, you have fear mongering.

        • MagicShel@programming.dev

          I agree the issue is one of puritan attitudes toward sex and nudity. If no one gave a fuck about nude images, they wouldn’t be humiliating, and if they weren’t humiliating then the victim wouldn’t really even be a victim.

          However we live in the world we live in and people do find it embarrassing and humiliating to have nude images of themselves made public, even fakes, and I don’t think it’s right to tell them they can’t feel that way.

          They shouldn’t ever have been made to feel their bodies are something to be embarrassed about, but they have been and it can’t be undone with wishful thinking. Societal change must come first. But that complication aside, I agree with you completely.

        • shiroininja@lemmy.world

          Not at all. Think of the consequences if someone’s nudes were leaked, or an OnlyFans account was made with images of them, and an employer sees it. They’re already firing teachers for being on there. And a lot of times they’re used in extortion. Not to mention your image is your property. It is you. And nobody else has rights to that.

            • shiroininja@lemmy.world

              You don’t have to take nudes anymore to have nudes leaked. There are AIs that strip clothes from pictures. People have been making CSAM off of pictures of people’s kids on their Instagram profiles, etc.

      • TigrisMorte@kbin.social

        And if I feel that cooking carrots is gross and cooked carrots shouldn’t be fed to minors or miners? Should that be illegal as well?

    • Selkie@lemm.ee

      It’s like having your nudes leaked but you never sent any. Pretty fucked.

    • Deceptichum@kbin.social

      Because it’s her image?

      I’d be fucking furious if someone was sharing say a fake photo of me fucking a watermelon. Doesn’t matter if it’s physically me or not, people would think it was.

      • dont_lemmee_down@lemm.ee

        Would they though? I’d argue nobody thinks those were pictures of Taylor Swift. I’d go further and say that it helps in the sense that you can always deny even real pictures arguing they were AI.

    • gila@lemm.ee

      There is a money trail when it’s legal. You get blatant advertising of services where you pay to upload your own photos to make deepfakes with them, on all kinds of sites (ahem, Pornhub). That’s a level of access that can’t be ignored, especially if it’s a US-based company providing the service, taking payment via Visa/Master etc. Relegate it to the underground where it belongs.

      • Serinus@lemmy.world

        I’d be more okay if the law were profit based, because that’s much easier to enforce.

        I don’t like laws that are near impossible to enforce unless they’re absolutely necessary. I don’t think this one is absolutely necessary.

        • gila@lemm.ee

          I don’t think general enforcement against deepfake porn consumption is a practical application of this proposed law in civil court. Practical applications are shutting down US-based deepfake porn sites and advertising. As far as possessors go, consider cases of non-celebrities being deepfaked by their IRL acquaintances. In a scenario where the victim is aware of the deepfake such that they’re able to bring the matter of possession to court, don’t you agree it’s tantamount to sexual harassment? All I’m seeing here is the law catching up to cover disruptive tech with established legal principle.

  • Copernican@lemmy.world

    So what happens if a person allows their likeness to be 3d modeled and textured for something like a video game, and that 3d model is used to create explicit images. Is that not a problem (or maybe a different kind of problem) because it’s not a deepfake and instead a use of a digital asset?

    • doctorcrimson@lemmy.world

      Technically the terms of use of a person’s likeness would be defined in a contract in the case of a product, but since unauthorized use is already not a legal or protected activity in any way, I believe the bill’s intention is to add fines or prison time for offenders on top of opening them up to civil liability.

      If the studio had an actor’s written consent then it would be left up to the courts as a civil matter, only.

    • General_Effort@lemmy.world

      Copying the asset out of the game file might be a copyright violation, but you’re usually allowed to make private copies. IDK to what degree copyright extends to images made with such an asset. (Funny story: the WWE released a video game with likenesses of their wrestlers (performers? actors? artists? IDK). A tattooist sued because it showed a design of theirs on the skin of a wrestler, and won. So much for “my body belongs to me”.)

      As far as this bill is concerned: it defines as a “digital forgery” anything made with “technological means […] to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual”. IDK how good the reasonable person is at spotting CGI. Quick google says that the average juror is about 50 years old. Make of that what you will.

  • alienanimals@lemmy.world

    It’s already impossible to stop.

    Also, doing something ONLY when a billionaire complains, is a very bad look.