Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • vane@lemmy.world · 1 day ago

    Maybe let’s assume all digital images are fake and go back to painting. Wait… what if children start painting deepfakes?

    • argl@feddit.org · 16 hours ago

      Can’t afford this much cheese today to find just the right slice for every bikini photo…

  • SabinStargem@lemmy.today · 1 day ago

    Deepfakes might end up being the modern version of a bikini. In the olden days, people wore these to the beach, and showing any less was scandalous, a sign of moral decay. Yet now we wear much less.

    Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

    • atomicorange@lemmy.world · 19 hours ago

      Those are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards, it’s about sexual abuse.

      • SabinStargem@lemmy.today · 12 hours ago

        It can be both. The cornerstone of why nudity can be abused is that society makes it shameful to be bare. If, some generations from now, people can just shrug and not care, that is one less tool an abuser can use against them.

        In any case, I am of the mind that people of my generation might be doing their own version of the Satanic Panic, or the reaction against rap music. For better or worse, older people cannot relate to the younger.

      • Gsus4@mander.xyz · 9 hours ago

        Unless it is passed off as a real video and circulated for denigration or blackmail, it is not at all like assault. Besides, deepfakes do not have the distinguishing features hidden under your clothes, so it is possible to debunk them if you really have to.

    • youmaynotknow@lemmy.ml · 1 day ago

      In my case, other kids would not have survived trying to pull off shit like this. So yeah, I’m also glad I’m not a kid anymore.

  • some_guy@lemmy.sdf.org · 2 days ago

    For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That’s just stupid on its face. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That’s not pedophilia. It’s wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

    It shouldn’t be treated the same as when an adult man generates it; there should be nuance. I’m not saying it’s ok for a thirteen year old to generate said content: I’m saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

    I’m so glad I went through puberty at a time when this kind of shit wasn’t available. The thirteen year old version of me would absolutely have gotten himself in a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, because I’d have produced child pornography. God, some states have stupid laws.

    • AA5B@lemmy.world · 1 day ago

      In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

      • BlackPenguins@lemmy.world · 10 hours ago

        I can already picture that as an Onion headline:

        New York Renames State to ‘WokeVille’. NYC to follow.

      • jwmgregory@lemmy.dbzer0.com · 1 day ago

        it existed if society liked you enough.

        fascists just have a habit of tightening that belt smaller and smaller, is what’s going on.

    • Lka1988@lemmy.dbzer0.com · 2 days ago

      As a father of teenage girls, I don’t necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

      • seralth@lemmy.world · 2 days ago

        There is a difference between ruining the life of a 13 year old boy for the rest of his life with no recourse and no expectations, versus scaring the shit out of him and making him work his ass off doing an ass load of community service for a summer.

        • Lka1988@lemmy.dbzer0.com · 2 days ago

          ruining the life of a 13 year old boy for the rest of his life with no recourse

          And what about the life of the girl this boy would have ruined?

          This is not “boys will be boys” shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

          I don’t think it’s unreasonable to expect an equivalent punishment that has the potential to ruin his life.

          • Vinstaal0@feddit.nl · 1 day ago

            It is not abnormal to see different punishments for people under the age of 18. What helps is good education about sex and about what sexual assault does to its victims (the same goes for guns, drugs including alcohol, etc.).

            You can still course-correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it around.

            The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.

          • youmaynotknow@lemmy.ml · 1 day ago

            Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.

            Yeah, I agree, we shouldn’t ruin the boy’s life; we should ruin his whole family to many times the extent that something like this ruins a teen girl’s life.

            • some_guy@lemmy.sdf.org · 1 day ago

              Yeah, I agree, we shouldn’t ruin the boy’s life; we should ruin his whole family to many times the extent that something like this ruins a teen girl’s life.

              You’re a fucking asshole. This isn’t like prosecuting parents who let a school shooter have access to guns. The internet is everywhere. Parents are responsible for bringing up their children to be socially responsible. A thirteen year old kid is anything but responsible (I mean their mentality / maturity, I’m not giving them a pass).

              Go hang out with conservatives who want more policing. Over here, we’ll talk about social programs you fucking prick.

              • youmaynotknow@lemmy.ml · 1 day ago

                I am an asshole, that’s never been in question, and I fully own it. Having said that, no amount of “social programs” is going to have any effect if fucking parents don’t raise their kids right.

                I’m entirely against surveillance, except when it comes to parents keeping a close eye on everything their kids watch, browse or otherwise access (obviously making it known to the kids that “I can see EVERYTHING you see and do”).

                So, yeah, hang the imbecile parents that should not have had kids in the first place because a fucking social program or school would raise them instead. Fuck off.

                • Lka1988@lemmy.dbzer0.com · 1 day ago

                  social program

                  And thanks to the assholes in Congress who just passed the Big Betrayal Bill, those are all going away.

            • Lka1988@lemmy.dbzer0.com · 1 day ago

              Teenagers are old enough to understand consequences.

              In fact, my neighborhood nearly burned down last week because a teenager, despite being told “no” and “stop” multiple times - including by neighbors - decided to light off fireworks on the mountainside right behind the neighborhood.

              Red arrow is my house. We were damn lucky the wind was blowing the right direction. If this had happened the day before, the neighborhood would be gone.

              • jsomae@lemmy.ml · 16 hours ago

                some day I hope to be brave enough to post pictures of my house on the internet

      • some_guy@lemmy.sdf.org · 2 days ago

        Yes, absolutely. But with recognition that a thirteen year old kid isn’t a predator but a horny little kid. I’ll let others determine what that punishment is, but I don’t believe it’s prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we’re ratcheting up the punishment, but still not adult prison.

        • Lka1988@lemmy.dbzer0.com · 2 days ago

          I did say equitable punishment. Equivalent. Whatever.

          A written apology is a cop-out for the damage this behaviour leaves behind.

          Something tells me you don’t have teenage daughters.

          • some_guy@lemmy.sdf.org · 1 day ago

            No kids. That’s why I say others should write the punishments. A written apology wasn’t meant as the only punishment. It was in addition to community service and other stipulations.

    • Agent641@lemmy.world · 2 days ago

      Punishment for an adult man doing this: Prison

      Punishment for a 13 year old doing this: publish his browsing and search history in the school newsletter.

  • dinckel@lemmy.world · 2 days ago

    Lawmakers are grappling with how to address …

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

  • Daftydux@lemmy.dbzer0.com · 2 days ago

    Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

    It would of course be their choice to wear them, but I’d definitely look for ways to limit their time in areas with cameras present.

  • wewbull@feddit.uk · 2 days ago

    Honestly I think we need to understand that this is no different to sticking a photo of someone’s head on a porn magazine photo. It’s not real. It’s just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level

    • lath@lemmy.world · 2 days ago

      “Schools” generally means it involves underage individuals, which makes any such content CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.

      • LostXOR@fedia.io · 2 days ago

        Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

        • Lka1988@lemmy.dbzer0.com · 2 days ago

          I would consider that as qualifying, because it’s targeted harassment in a sexually explicit manner. All the girl would have to do is claim it’s her.

          Source: I’m a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.

        • lath@lemmy.world · 2 days ago

          I don’t know personally. The admins of the fediverse likely do, considering it’s something they’ve had to deal with from the start. So, they can likely answer much better than I might be able to.

        • surewhynotlem@lemmy.world · 2 days ago

          Drawing a sexy cartoon that looks like an adult, with a caption that says “I’m 12”, counts. So yeah, probably.

          • cole@lemdro.id · 15 hours ago

            This actually is quite fuzzy and depends on your country, and even on the jurisdiction within your country.

      • wewbull@feddit.uk · 2 days ago

        Disagree. Not CSAM when no abuse has taken place.

        That’s my point.

        • Zak@lemmy.world · 2 days ago

          I think generating and sharing sexually explicit images of a person without their consent is abuse.

          That’s distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I’m morally uncomfortable criminalizing an act that has no victim.

        • Lka1988@lemmy.dbzer0.com · 2 days ago

          Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child’s identity.

          Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.

        • atomicorange@lemmy.world · 2 days ago

          If someone put a camera in the girls’ locker room and distributed photos from it, would you consider that CSAM? No contact would have taken place, and the kids would be unaware they were being photographed. Is it still abuse?

          If so, how is the psychological effect of a convincing deepfake any different?

          • General_Effort@lemmy.world · 2 days ago

            If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.

            It also means that someone observed you when you were doing “secret” things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.

            I would think it is very different. Unless you’re only thinking about the psychological effect on the viewer.

          • BombOmOm@lemmy.world · 2 days ago

            Taking secret nude pictures of someone is quite a bit different than…not taking nude pictures of them.

            It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.

            • atomicorange@lemmy.world · 2 days ago

              How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

              • BombOmOm@lemmy.world · 2 days ago

                It’s absolutely sexual harassment.

                But, to your question: you can’t just say something has underage nudity when the nudity is of an adult model. It’s not CSAM.

                • atomicorange@lemmy.world · 2 days ago

                  Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs a fake. The impact to the victim is the same. The impact to the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn”, people wouldn’t be so hesitant to call this what it is.

        • lath@lemmy.world · 2 days ago

          There’s a thing that was happening in the past. Not sure it’s still happening; there’s been a lack of news about it. It was something called “glamour modeling”, I think, or an extension of it.

          Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.

          Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still CSAM. Why? Because of the intention behind making those pictures.

          The intention to exploit.

    • lurch (he/him)@sh.itjust.works · 2 days ago

      I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since any of them could be a deepfake. There were cases where a surfacing dirty pic was used for blackmail, ruined someone’s career, or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past, soon.

      • wewbull@feddit.uk · 2 days ago

        That could be a socially healthy place to end up at. I don’t see it anytime soon though. Just look at the other response I got.

        • Hemingways_Shotgun@lemmy.ca · 2 days ago

          Sure. That might end up being a socially healthy place for adults to end up.

          But it will never work that way for young teens. Their brains aren’t done baking yet. They don’t have the emotional maturity to understand that enough to be “okay with it because it’s just a fake”.

          That’s why we protect kids rather than just telling them “hey it’s okay…it’s only a fake.”

        • BombOmOm@lemmy.world · 2 days ago

          Anyone with half a brain will certainly claim as much. Even if people don’t fully believe it, it will blunt the most serious of social consequences.

    • SharkAttak@kbin.melroy.org · 2 days ago

      Furthermore, we generally assume malicious intent, but I wouldn’t be surprised if teenagers were using the app to ‘get’ big boobs etc.; we’ve all seen those shopped pictures with the deformed backgrounds 😁

    • Hemingways_Shotgun@lemmy.ca · 2 days ago

      I’m not even going to begin describing all the ways that what you just said is fucked up.

      I’ll just point out that online deepfake technology is FAR more accessible to the average 13 year old to use on their peers than “porno mags” were in our day.

      You want to compare taking your 13 year old classmate’s photo off of Facebook, running it through an AI, and in five seconds creating photo-realistic adult content featuring them, to getting your dad’s skin-mag from under his mattress when he’s not home, cutting your classmate’s face out of a yearbook, taping it on, sneaking THAT into the computer lab at school so you can photocopy it and pass it around in home room, and then putting the skin-mag BACK under the mattress before your dad finds out.

      Is that right… is THAT what you’re trying to say? Are those the two things that you’re trying to say are equivalent?

      • SheeEttin@lemmy.zip · 2 days ago

        Yes, we all know it’s fucked up. The point is that we don’t need a new class of laws just because it’s harassment and bullying ✨with AI✨.

  • RememberTheApollo_@lemmy.world · 2 days ago

    I’m sure the laws will focus on protecting IP - specifically that of AI companies or megacorps, the famous and powerful, but not the small creators of content or the rabble negatively affected by AI abuse.

    The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court, if we can even afford a lawyer to do so, only to be offered a judgement that probably won’t be paid and won’t cover the damage done by an image that can never be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.

  • electric_nan@lemmy.ml · 2 days ago

    My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

  • mhague@lemmy.world · 2 days ago

    So is this a way to take away rights by making it about kids?

    I mean, what the fuck. We did much less and got punished, right? It didn’t matter whether we were on school property. Schools can hold students accountable for conduct with other students.

    The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn’t need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.

    The problem is surely with the interaction between parents and schools. Or maybe it’s just the old way of thinking. Maybe it’s better to have police and courts start taking over discipline in schools.

    • krashmo@lemmy.world · 2 days ago

      How is a school going to regulate what kids do outside of school property? They could ban cell phones on campus but that’s not going to change what happens after hours.

      • Lv_InSaNe_vL@lemmy.world · 2 days ago

        Schools can already do that, though. You can get in trouble for bullying outside of school, and when I was a student athlete I had pretty strict restrictions on what I was allowed to do because I was an “ambassador” for the school.

          • Lv_InSaNe_vL@lemmy.world · 6 hours ago

            Overall, I would say so yeah.

            For the bullying thing, not everyone’s parents are available or willing to discipline their kids.

            And for the athletics thing, personally I believe that athletics is more about developing young adults into good people rather than the sport itself. And my school had a bunch of other things like grade minimums, required volunteer hours, we would wear dress shirts and ties before meets, and some other things like that.

    • ColeSloth@discuss.tchncs.de · 2 days ago

      All your examples are of things that were stopped while at school, so your argument doesn’t really carry over. You still had your Pokemon cards everywhere else.

    • cley_faye@lemmy.world · 2 days ago

      I’d rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.

      Alas, whether there’s a law against that specific use case or not, it is somewhat difficult to police what people do in their homes without a third-party whistleblower. Making more impossible-to-apply laws for this specific case does not seem that useful.

      • Vinstaal0@feddit.nl · 1 day ago

        There is also a difference between somebody harassing somebody with nude pictures (real or not) and somebody jerking off to them at home. It does become a problem when an adult masturbates to pictures of children, but children to children? Let’s be honest, they will do it anyway.

  • danciestlobster@lemmy.zip · 2 days ago

    I don’t fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn’t that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

    • wewbull@feddit.uk · 8 hours ago

      You know how, when you look at a picture of someone and cover up the clothed bits, they look naked? Your brain fills in the gaps with what it knows of general human anatomy.

      It’s like that.

    • General_Effort@lemmy.world · 2 days ago

      This is mostly about swapping faces. You take a video and a photo of someone’s face. Software can replace the face of someone in the video with that face. That’s been around for a decade or so. There are other ways of doing it.

      When the face belongs to an underage individual, and the video is pornographic…

      LLMs only do text.