A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

    • Sharkwellington@lemmy.one · ↑57 ↓5 · 11 months ago

      Right, there are plenty of reactive measures available but the only proactive measures are either restricting availability of the source photos used or restricting use of the deep fake tools used. Everything beyond that is trying to put the genie back in the bottle.

      • interceder270@lemmy.world · ↑52 ↓3 · 11 months ago

        At some point, communities and social circles need to be able to moderate themselves.

        Disseminating nudes of peers should be grounds for ostracizing, but it really depends on the quality of people around you.

        • MotoAsh@lemmy.world · ↑21 ↓2 · 11 months ago

          That doesn’t work. Not being talked to by your neighbors or those around you is nothing but an inconvenience. They’d just get even worse and make even worse friends online.

          Ostracization doesn’t work. Ever. Period. If they’re bad enough, banishment works. Ostracization is just literally ignoring the problem.

          • interceder270@lemmy.world · ↑1 ↓4 · 11 months ago

            Ostracization doesn’t work. Ever. Period. If they’re bad enough, banishment works. Ostracization is just literally ignoring the problem.

            That’s just wrong. Unless you’re hanging around shitty people, ignoring the bad ones by definition works.

            • SuddenDownpour@sh.itjust.works · ↑5 · 11 months ago

              A lot of social circles are dominated by either shitty people or by people too insecure to take a confronting attitude towards those shitty people.

              • interceder270@lemmy.world · ↑1 ↓1 · 11 months ago

                Yeah, it does. It’s literally how most friend groups keep undesirables from hanging out with them.

                • MotoAsh@lemmy.world · ↑4 · 11 months ago

                  Your friend group is not a sufficient model for all friend groups; they’re fundamentally different sets. No two groups are the same, and groups taken as a whole behave differently than any individual group. I’m talking about all groups, not your group.

      • MagicShel@programming.dev · ↑29 ↓2 · 11 months ago

        It’s not possible to restrict deep fake technology at this point. It’s out there. Accessible to everyone who wants it and has a computer at home.

        • Sharkwellington@lemmy.one · ↑5 ↓2 · 11 months ago

          And that’s the point I was making: nobody can be “protected” from widely available photos being used with widely available programs. The best we can do is deter, but that isn’t a guarantee.

    • ahornsirup@sopuli.xyz · ↑15 ↓2 · 11 months ago

      Even if you don’t want to consider it CSAM, it is, at the very least, sexual harassment. The kids making and circulating these pictures and videos should be facing consequences. And the fear of consequences does offer some degree of protection at least.

      • pohart@programming.dev · ↑2 ↓1 · 11 months ago

        It looks like pretty severe sexual harassment at best. Unfortunately the people I think are most likely to do it are teenagers with poor self control who don’t realize the severity.

        I think if schools can implement appropriate restorative responses and education on the harm done, that could be much more effective than draconian punishments after the fact.

      • Thief_of_Crows@sh.itjust.works · ↑1 ↓1 · 11 months ago

        Should a teenager face consequences for drawing a picture of their classmate naked? What if they do it well? How is this at all different?

        • ahornsirup@sopuli.xyz · ↑1 ↓1 · 11 months ago

          If they distribute the drawing, yes. And the difference is that a drawing is immediately recognisable as a drawing, but an AI generated image or video isn’t necessarily easily recognisable as not being real, so the social consequences for the person depicted can be much worse.

      • Sybil@lemmy.world · ↑2 ↓1 · 11 months ago

        No. The article mentions “protecting” people several times. I don’t see how anyone is protected by the proposed laws.

  • foggy@lemmy.world · ↑86 ↓5 · 11 months ago

    Methinks this problem is gonna get out of fucking hand. Welcome to the future, it sucks.

    • Goldmage263@sh.itjust.works · ↑1 ↓1 · 11 months ago

      AI is out of the bag for all the good and bad it will do. Nothing will be safe on the internet, and hasn’t been for a long time now. Either we will get government monitored AI results or use AI to combat misuse of AI. Either way isn’t preventative. The next wild west frontier is upon us, and it’s full of bandits in hiding.

  • Marxism-Fennekinism@lemmy.ml · ↑36 · edited · 11 months ago

    Maybe I’m just naive about how many protections we’re actually granted, but shouldn’t this already fall under CP/CSAM legislation in nearly every country?

        • rchive@lemm.ee · ↑2 ↓5 · 11 months ago

          If you make a picture today of someone based on how they looked 10 years ago, we say it’s depicting that person as the age they were 10 years ago. How is what age they are today relevant?

          • GeneralVincent@lemmy.world · ↑5 · 11 months ago

            I’m unsure of the point you’re trying to make?

            It’s relevant in this case because the age they are today is underage. A picture of them 10 years ago is underage. And a picture of anyone made by AI to deepfake them nude is unethical regardless of age. But it’s especially concerning when the goal is to depict underage girls as nude. The age thing specifically could get a little complicated in certain situations, I guess, but the intent is obvious most of the time.

            • rchive@lemm.ee · ↑1 ↓1 · 11 months ago

              I’m obviously not advocating or defending any particular behavior.

              Legally speaking, why is what age they are today relevant rather than the age they are depicted as in the picture? Like, imagine we have a picture 20 years from now of someone at age 37. It’s legally fine until it’s revealed it was generated in 2023 when the person in question was 17? If the exact same picture was generated a year later it’s fine again?

              • DogMuffins@discuss.tchncs.de · ↑2 ↓1 · 11 months ago

                Basically, yes.

                Is the person under-age at the time the image was generated? and … Is the image sexual in nature?

                If yes, then generating or possessing such an image ought to be a crime.

          • DogMuffins@discuss.tchncs.de · ↑1 · 11 months ago

            IDK why this dumb thought experiment makes me so grumpy every time someone invokes it, but you’re going to have to explain how it’s relevant here.

            • Lemming6969@lemmy.world · ↑1 · edited · 11 months ago

              How many pieces do you have to change before it’s not closely enough related? If every piece is modified, is it the same base image? If it’s not the same image, when does it cease to represent the original and must be reassessed? If it’s no longer the image of a real person, given the extreme variety in both real and imagined people, how can an AI image ever be illegal? If you morph between an image of a horse and an illegal image, at what exact point does it become illegal? What about a person and an illegal image? What about an AI generated borderline image and an illegal image? At some point a legal image changes into an illegal image, and that point is nearly impossible to define. The same goes for the transition between a real and an imagined person, or for the likeness between two similar-looking people, whether real but different, or imagined.

              • DogMuffins@discuss.tchncs.de · ↑1 · 11 months ago

                that point is nearly impossible to define

                As with any law, there will undoubtedly be cases in which it is difficult to discern whether or not a law has been broken, but courts decide on innocence or guilt in such cases every day. A jury would be asked to decide whether a reasonable third party is likely to conclude, on the balance of probabilities, that the image depicts a person who is under 18.

                Whether or not the depicted person is real or imagined is not relevant in many / most jurisdictions.

  • Aceticon@lemmy.world · ↑37 ↓3 · edited · 11 months ago

    There might be an upside to all this, though maybe not for these girls: with enough of this, people will eventually just stop believing that any “leaked” nude pictures are real, because they’ll presume they’re deepfakes. That would be a great thing for people who had real nude pictures leaked, which, once on the Internet, are pretty hard to stop spreading.

    Mind you, it would be a lot better if people in general culturally evolved beyond being preachy monkeys who pass judgment on others because they’ve been photographed in their birthday-suit, but that’s clearly asking too much so I guess simply people assuming all such things are deepfakes until proven otherwise is at least better than the status quo.

  • ZombiFrancis@sh.itjust.works · ↑32 ↓2 · 11 months ago

    In previous generations the kid making fake porn of their classmates was not a well liked kid. Is that reversed now? On the basis of quality of tech?

    • Omega@lemmy.world · ↑19 ↓1 · 11 months ago

      That kid that doodles is creepy. But deep fakes probably feel a lot closer to actual nudes.

  • calypsopub@lemmy.world · ↑52 ↓23 · 11 months ago

    So as a grown woman, I’m not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That’s more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

    I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won’t dare to share it outside their sick incels club.

    • WoahWoah@lemmy.world · ↑61 ↓2 · 11 months ago

      That’s fine and well. Except they are videos, and it is very difficult to prove they aren’t you. And the internet is forever.

      This isn’t like high school when you went to high school.

      Agreed on your last paragraph.

      • Margot Robbie@lemmy.world · ↑22 ↓4 · 11 months ago

        Then nude leak scandals will quickly become a thing of the past, because now every nude video/picture can be assumed to be AI generated, always fake until proven otherwise.

        That’s the silver lining of this entire ordeal.

        Again, this is a content distribution problem more than an AI problem; the liability should be on those who willingly host this deepfake content rather than on AI image generators.

        • finestnothing@lemmy.world · ↑12 ↓2 · 11 months ago

          That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends, their jobs, and have their life ruined even if they prove that they are completely innocent

          Plus, something I’ve already seen happen is someone saying a nude is fake and then being told they have to prove that it’s fake to get people to believe them… which is very hard without sharing an actual nude that shows something unique about their body.

          • derpgon@programming.dev · ↑3 ↓4 · 11 months ago

            The rest of the human body has more unique traits than the nude parts. Freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows about them.

            Now that I think about it, we all probably need a tattoo. That should clear anyone instantly.

            • Llewellyn@lemm.ee · ↑4 · 11 months ago

              You can ask an AI to draw a blurred version of the tattoo. Or to mask the tattooed area with, I don’t know, a piece of clothing or something.

            • WoahWoah@lemmy.world · ↑3 · edited · 11 months ago

              Yes I’m sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

              HR probably wouldn’t even allow a conversation about it. That person just never gets called back.

              And then the worse part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

              The entire thing is damaging and ugly.

              • derpgon@programming.dev · ↑1 · 11 months ago

                If you are already an employee, then they will want to keep you and look into the matter.

                If you are not an employee yet - is HR really looking up porn of everyone?

        • toonicycle@lemmy.world · ↑5 ↓2 · 11 months ago

          I mean they obviously shouldn’t have to, but if nude photos of you got leaked in your community, people would start judging you negatively, especially if you’re a young woman. Also in these cases where they aren’t adults it would be considered cp.

    • ILikeBoobies@lemmy.ca · ↑30 ↓3 · 11 months ago

      So they do it and share it around to slut shame you

      You try to find a job and they find porn of you

      It’s a lot worse than you’re making it out to be when it’s not you that gets to make that decision

      • DogMuffins@discuss.tchncs.de · ↑12 ↓7 · 11 months ago

        IMO the days of searching for porn of prospective employees are over. With the advent of AI generated porn, what would be the point of that?

        • Couldbealeotard@lemmy.world · ↑8 · 11 months ago

          There are so many recent articles linked on Lemmy about people losing their jobs over porn. People are losing jobs over porn now more than ever.

          • DogMuffins@discuss.tchncs.de · ↑1 ↓1 · 11 months ago

            Seriously? Maybe we don’t read the same stuff but that’s not something I’ve noticed.

            I just can’t imagine how that’s possible. I wish someone would fire me over porn so I could sue them for unfair dismissal as well as defamation and/or libel.

      • Basil@lemmings.world · ↑6 ↓2 · edited · 11 months ago

        So as a grown woman

        Right? Literally not what’s being discussed. Obviously they’ll be more mature and reasonable about it. Teenagers won’t be

      • calypsopub@lemmy.world · ↑2 ↓3 · 11 months ago

        I wasn’t very representative even when I WAS a teenager. I was bullied quite a bit, though.

        • atzanteol@sh.itjust.works · ↑5 · edited · 11 months ago

          And can you imagine those bullies creating realistic porn of you and sharing it with everyone at school? You may have been strong enough to endure that - but it’s pretty unrealistic to expect everyone to be able to do so. And it’s not a moral failing if somebody is unable to. This is the sort of thing that leads to suicides.

    • foo@programming.dev · ↑16 ↓6 · 11 months ago

      What if the deep fake was so real it was hard to tell? What if it was highly invasive and humiliating? Can you see the problem?

      • DogMuffins@discuss.tchncs.de · ↑6 ↓4 · 11 months ago

        I think that the point this comment is trying to make is that because it has become so easy to make these images, their existence is not very meaningful. All deep fakes are very realistic. You can’t tell fakes from originals.

        Like as an adult, if I saw an “offensive” image of a co-worker, my first assumption would be that it’s probably AI generated, my first thought would be “which asshole made this image” rather than “I can’t believe my co-worker did [whatever thing]”.

      • calypsopub@lemmy.world · ↑7 ↓12 · 11 months ago

        Not really. The more extreme it is, the more easily people will believe you when you say it’s a deep fake. Everyone who matters (friends and family) will know it’s not you. The more this sort of thing becomes commonplace, the more people will simply shake their heads and move on.

        • mrsgreenpotato@discuss.tchncs.de · ↑7 · 11 months ago

          People kill themselves over much more mundane things than this. I think you overestimate teenagers unfortunately, not everyone can handle it as lightly as you would. Telling people to just “shake it off” will simply not work most of the time.

          • calypsopub@lemmy.world · ↑2 ↓1 · 11 months ago

            Sadly, you have a point. Somebody with good support at home and a circle of friends can weather this sort of thing, but others may feel helpless or hopeless. There needs to be an effective place to turn to for kids who are being bullied. Unfortunately that doesn’t seem to exist.

  • NightAuthor@lemmy.world · ↑37 ↓10 · 11 months ago

    I wonder what the prevalence of this kind of behavior is like in countries that aren’t so weird about sex.

    • atzanteol@sh.itjust.works · ↑46 ↓13 · 11 months ago

      This has nothing to do with “being weird about sex” and everything to do with men treating women poorly.

      You can expect this to be worse in nations where women don’t have as many rights and/or where misogyny is accepted as part of life.

      • NightAuthor@lemmy.world · ↑21 ↓6 · 11 months ago

        Sounds plausible. We just abolished Roe, so… it’s not looking great for the future of this issue in the US.

    • Lmaydev@programming.dev · ↑18 ↓9 · 11 months ago

      Kids don’t know or understand the damage this can cause someone.

      They see it as a joke most of the time.

      It needs to be made illegal and the kids properly educated about why.

      It’s easy as an adult to condemn these children but we have a lot more life experience.

      • NightAuthor@lemmy.world · ↑24 ↓14 · 11 months ago

        So, you’re saying both:

        1. It’s childish behavior
        2. It should be made illegal

        So… you think the solution to childish behavior is putting kids in jail?

        *deep breath* lemme try to see a more logical interpretation….

        Wait, you did mention education, ok I musta missed that on my first read.

        So educate the kids, and if they don’t learn… jail

            • Oshka@kbin.social · ↑27 ↓1 · 11 months ago

              That is correct. You punish and educate children who do things wrong. Timeout’s a new concept to you?

              • NightAuthor@lemmy.world · ↑6 ↓16 · 11 months ago

                You may be from the US, where this isn’t really a concept, but there is significant evidence that you actually can teach better with proper rewards for good behaviour than you can with punishment for bad behaviour.

                I’m actually not sure what the science says about doing both together (maybe I’d read on it more if I actually had kids), but personal experience and discussions at least indicate that parents who punish consistently, rarely couple it with equivalent rewards/praise.

                But maybe you and/or your parents are different.

                Personally, I just got punished a lot for having ADHD. Not that they knew it at the time, but it turns out that’s effectively what was happening. And for people with ADHD, small immediate rewards are WAY more effective than potential, delayed punishment, even if severe.

                • Behaviorbabe@kbin.social · ↑10 · 11 months ago

                  Science says we teach alternative behaviors and provide positive reinforcement for socially appropriate behaviors. Punishment (which isn’t just jail, it can be stuff like detention if we’re not losing our heads here) if it’s not paired with a replacement behavior is the least effective. Usually you reserve punishment for “danger to self or others” behaviors…

                  Now, as to where this behavior falls. Having AI generated porn of yourself all over the internet as a young girl in some of the puritan towns in the US? That could be an absolute nightmare for the victim, so of course something has to occur. Perhaps punishment would be best directed towards those who should know better (the parents). Here, the harm being to others… how can we replace this particular behavior? Yes, education, but there also needs to be something better for these kiddos to be doing with their time.

                  Further reading can be found in punishment, reinforcement, functional replacement behavior.

        • Lmaydev@programming.dev · ↑8 ↓6 · edited · 11 months ago

          Didn’t say jail, you did. I in fact didn’t talk about punishment at all.

          But there has to be consequences.

          If kids steal we don’t just throw them straight in jail. But it is a possible consequence.

          We’re also talking about 14-year-olds, not literal children.

            • yamanii@lemmy.world · ↑3 ↓2 · 11 months ago

              What’s with remnants of reddit and pretending teenagers are kids? They aren’t, they’re teens; they can even make babies themselves, drive and vote.

          • NightAuthor@lemmy.world · ↑5 ↓6 · 11 months ago

            Illegal necessarily implies punishment, as far as I understand.

            Also, 14 year olds are children. But the trajectory of this conversation is clear, and it’s not going anywhere.

            • Lmaydev@programming.dev · ↑7 · 11 months ago

              Well that’s the result when you put words in peoples’ mouths, instead of trying to have a discussion.

              • NightAuthor@lemmy.world · ↑1 ↓4 · 11 months ago

                If I don’t make any logical steps given the limited words provided in a conversation, then communication becomes impossibly slow. Therefore I feel that I have to make such logical steps, because text based communication, in the current times, is a bandwidth constraint on the passage of concepts between two human minds. In this case, because of said bandwidth constraint existing between your brain and mine, I made the step and assumed that when you mention making something illegal, you meant that governments should prohibit the act and do as they (in my understanding) typically do and enforce said prohibition with threat of incarceration. That may have been an oversimplified view of the judicial system; there are other means of enforcement, but I’m only really familiar with the idea of children either being incarcerated or maybe given community service, and I’m usually (I’m not sure why) given to believe that community service isn’t usually a statutory punishment, but rather a discretionary adjustment that a judge can afford someone. It’s also worth noting that I have concerns about the way in which minorities are disproportionately sentenced, prosecuted, and ultimately harmed by the judicial system. Concerns which bias my thoughts when the subject is raised. But I’d like to make clear that I’m using the term bias a bit more strictly, as in every human has a bias against/for basically everything.

                So, if I may take another leap, it seems you’re implying that you are specifically talking about me, and not using “you” in the general sense. And I’ll assume you’re actually referring to this current conversation, and claiming that I caused this outcome because I put words in your mouth. Oh, and by that you’re saying (again, these are my assumptions) that I’m claiming that you said something which you never actually said.

                So maybe, if you take some logical leaps for the sake of me being able to type this in my life time, you can see that I was not necessarily trying to maliciously misconstrue what it is that you were saying.

                And in case it’s not clear, the above is conveyed with mild contempt for you.

                • Lmaydev@programming.dev · ↑2 · 11 months ago

                  It’s actually called a straw man logical fallacy.

                  You exaggerated what I said and then attacked your exaggeration.

      • TheOneWithTheHair@lemmy.world · ↑12 ↓6 · edited · 11 months ago

        the kids properly educated about why.

        https://dare.org/

        • DARE is celebrating its 40th anniversary.
        • It has officer-led classroom lessons that reach 2,500,000 K-12 students per year.
        • “Enriching students across the US and 29+ countries around the world”

        If your argument is “The educators just need to make sure the kids learn that this is not a joke”, DARE has been educating students about the dangers of illegal drugs for 40 years.

        Overdoses claimed more than 112,000 American lives from May 2022 to May 2023, according to the Centers for Disease Control and Prevention, a 37 percent increase compared with the 12-month period ending in May 2020.

        https://www.pbs.org/newshour/health/how-dozens-of-u-s-adolescents-are-dying-of-drug-overdoses-each-month-shown-in-3-charts

        You might persuade some, but the problem will not go away.

        • Crewman@sopuli.xyz
          link
          fedilink
          arrow-up
          13
          ·
          11 months ago

          DARE is known as a bad program, because it goes for fear mongering rather than actual education. Everyone knows someone who uses marijuana, and their teeth haven’t all fallen out and they haven’t turned into psychotic murderers. [VOX - Why anti-drug campaigns like DARE fail](https://www.vox.com/2014/9/1/5998571/why-anti-drug-campaigns-like-dare-fail)

          There are good and bad ways to go about education. Like comprehensive sex education vs abstinence-only: even though they’re covering the same topic, actual education is much more effective than just saying no. [NLM Abstinence-only and comprehensive sex education study](https://pubmed.ncbi.nlm.nih.gov/18346659/)

          • TheOneWithTheHair@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            11 months ago

            That was my point. DARE didn’t stop drug use. Any education will persuade some. However, unless the students and their families buy in at 100%, this problem isn’t going away.

            About 130 million adults in the U.S. have low literacy skills according to a Gallup analysis of data from the U.S. Department of Education. This means more than half of Americans between the ages of 16 and 74 (54%) read below the equivalent of a sixth-grade level.

            https://www.apmresearchlab.org/10x-adult-literacy

            The starkest differences were seen by education group. Returning to the first question given above, in many countries adults with a “low” level of education (the equivalent of completing secondary school) had less than a 50% chance of getting the question correct. In places like Canada and United States, this fell to as low as 25%.

            https://phys.org/news/2018-03-high-adults-unable-basic-mathematical.html

            Education alone is not going to make this go away.

            • Crewman@sopuli.xyz
              link
              fedilink
              arrow-up
              3
              ·
              11 months ago

              I 100% agree that education alone will not resolve the issue, but I believe education can help the efficacy of other approaches.

            • Rai@lemmy.dbzer0.com
              link
              fedilink
              arrow-up
              1
              ·
              edit-2
              11 months ago

              I did a report on the dangers of LSD when I was young.

              I learned it’s impossible to overdose on and nobody has died directly as a result of it.

              I had never been so interested in trying something out. “Okay, so the world becomes crazy for 4-8 hours and you see crazy stuff and everything is hilarious and you can’t die at all, and all you gotta do is be in a comfy set and setting”

              God damn, Imma clean out a vial and watch Enter the Void

              Quick edit: staring at my MacBook Pro turned into fractals and it’s just fucking anodized aluminium wtf that’s cool

        • LilB0kChoy@lemm.ee
          link
          fedilink
          arrow-up
          11
          ·
          edit-2
          11 months ago

          DARE is not a good example to hold up because the program doesn’t work.

          Although some studies reveal that DARE has the positive effects of promoting positive police-juvenile relations and imparting accurate information about drugs and drug use, it does not appear to deter drug use.

          Edit: to clarify, DARE has always been flawed and ineffective. There was a study in 1994 that showed this yet it didn’t stop or change the program.

        • Riskable@programming.dev
          link
          fedilink
          English
          arrow-up
          7
          ·
          edit-2
          11 months ago

          You’re using DARE as a positive example‽ The DARE program is widely considered to be an enormous failure. Here’s a decent rundown:

          https://www.talkitoutnc.org/dare-program-effectiveness/#:~:text=program failed to live up,rate of teen drug use.

          (But if you just search it up you’ll find hundreds of similar articles)

          I was in school when the DARE program was quite strongly promoted, and I specifically remember being fed endless misinformation about drugs. It was never about educating children; it was about trying to scare them with bullshit.

          “If they were wrong about marijuana being addicting they’re probably wrong about everything else…”

          …aaaaand that’s how young people ended up trying all sorts of new things they shouldn’t have.

        • lolcatnip@reddthat.com
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          1
          ·
          11 months ago

          Other people seem to think you’re holding up DARE as a positive example. I can tell you’re not, but I don’t think it’s a great negative example either. So much of the content is fear mongering bullshit that anyone who actually encounters drugs in real life will see through it.

          Education works a lot better when you teach kids things that aren’t directly contradicted by their experiences or their peers’.

      • interceder270@lemmy.world
        link
        fedilink
        arrow-up
        4
        arrow-down
        3
        ·
        edit-2
        11 months ago

        I think at some point kids need to learn that there won’t be someone stopping them from doing bad things.

        They need to suffer the consequences of their actions through social rejection. If the microcosm is so shitty that it doesn’t ostracize people who disseminate nudes, then the people in it deserve to suffer until they improve.

        This should be one of the easiest ways to identify shitbags, but I understand a lot of social hierarchies put shitbags at or near the top.

    • OurTragicUniverse@kbin.social
      link
      fedilink
      arrow-up
      11
      arrow-down
      11
      ·
      11 months ago

      Boys and men are pretty similar the world over. Some are always going to be creeps who do shit like this, it doesn’t matter what culture they’re in.

  • TheEighthDoctor@lemmy.world
    link
    fedilink
    arrow-up
    26
    arrow-down
    3
    ·
    11 months ago

    What’s the fundamental difference between a deep fake and a good Photoshop and why do we need more laws to regulate that?

    • UlrikHD@programming.dev
      link
      fedilink
      arrow-up
      23
      arrow-down
      1
      ·
      11 months ago

      Lower skill ceiling. One option can be done by pretty much anyone at a high volume of output; the other would require a lot of training and isn’t available to your average basement dweller.

      Good luck trying to regulate it though, Pandora’s box is opened and you won’t be able to stop the FOSS community from working on the tech.

        • UlrikHD@programming.dev
          link
          fedilink
          arrow-up
          2
          ·
          11 months ago

          Photoshop (if it does?) and any other believable face swap apps use some sort of neural networks, which is exactly the problematic tech we are talking about.

  • Gork@lemm.ee
    link
    fedilink
    arrow-up
    20
    arrow-down
    2
    ·
    11 months ago

    President Joe Biden signed an executive order in October that, among other things, called for barring the use of generative AI to produce child sexual abuse material or non-consensual “intimate imagery of real individuals.” The order also directs the federal government to issue guidance to label and watermark AI-generated content to help differentiate between authentic and material made by software.

    Step in the right direction, I guess.

    How is the government going to be able to differentiate authentic images/videos from AI generated ones? Some of the AI images are getting super realistic, to the point where it’s difficult for human eyes to tell the difference.

    • apex32@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      11 months ago

      That’s a cool quiz, and it’s from 2022. I’m sure AI has improved since then. Would love to see an updated version.

    • CommanderCloon@lemmy.ml
      link
      fedilink
      arrow-up
      5
      ·
      11 months ago

      I wouldn’t call this a step in the right direction. A call for a step, yeah, but it’s not actually a step until something is actually done.

  • Treczoks@lemm.ee
    link
    fedilink
    arrow-up
    20
    arrow-down
    3
    ·
    11 months ago

    The problem is how to actually prevent this. What could one do? Make AI systems illegal? Make graphics tools illegal? Make the Internet illegal? Make computers illegal?

      • Treczoks@lemm.ee
        link
        fedilink
        arrow-up
        16
        arrow-down
        2
        ·
        11 months ago

        Isn’t it already? Has it provided any sort of protection? Many things in this world are illegal, and nobody cares.

        • Jimmyeatsausage@lemmy.world
          link
          fedilink
          arrow-up
          9
          arrow-down
          5
          ·
          11 months ago

          Yes, I would argue that if CSAM was legal, there would be more of it…meaning it being illegal provides a level of protection.

          • yamanii@lemmy.world
            link
            fedilink
            arrow-up
            6
            arrow-down
            1
            ·
            11 months ago

            I wonder why you are being downvoted; something being illegal puts fear in most people not to do it.

            • 31337@sh.itjust.works
              link
              fedilink
              arrow-up
              1
              ·
              11 months ago

              I’ve been wondering about this lately, but I’m not sure how much of an effect this has. There are millions of people in prison, and many of those will go on to offend again. Making things illegal can be seen as an agreement to a social contract (in a democracy), drive the activity underground (probably good thing in many cases), and prevent businesses (legal entities) from engaging in the activity; but I’m not sure how well it works on an individual level of deterrence. Like, if there were no laws, I can not really think of a law I would break that I wouldn’t already break regardless. I guess I’d just be more open about it.

              Though, people who cause harm to others should be removed from society, and ideally, quickly rehabilitated, and released back into society as a productive member.

      • CAVOK@lemmy.world
        link
        fedilink
        arrow-up
        10
        arrow-down
        2
        ·
        edit-2
        11 months ago

        It is where I’m at. Draw Lisa Simpson nude and you get a visit from the law. Dunno what the punishment is though. A fine? Jail? Can’t say.

        Edit: Apparently I was wrong, it has to be a realistic drawing. See here: 2010/0064/COD doc.nr 10335/1/10 REV 1

        • Rodeo@lemmy.ca
          link
          fedilink
          arrow-up
          6
          arrow-down
          4
          ·
          11 months ago

          What about making depictions of other crimes? Should depictions of theft be illegal? Depictions of murder?

          Why should depictions of one crime be made illegal, but depictions of other heinous crimes remain legal?

          • Jimmyeatsausage@lemmy.world
            link
            fedilink
            arrow-up
            7
            arrow-down
            2
            ·
            11 months ago

            Because a picture of someone robbing my house doesn’t revictimize me. Even if it’s simulated, every time they run into some rando who recognizes them or every time a potential employer runs a background/social media check, it impacts the victim again

            • Rodeo@lemmy.ca
              link
              fedilink
              arrow-up
              4
              arrow-down
              2
              ·
              edit-2
              11 months ago

              Who is being victimized with a drawing of Lisa Simpson?

    • afraid_of_zombies@lemmy.world
      link
      fedilink
      arrow-up
      10
      arrow-down
      11
      ·
      11 months ago

      Require consent to take a person’s picture and hold them liable for whatever comes from them putting it on a computer.

      • jimbo@lemmy.world
        link
        fedilink
        arrow-up
        6
        arrow-down
        2
        ·
        11 months ago

        That’s a whole fucking can of worms we don’t need to open. Just make faking porn a crime similar to publishing revenge porn.

        • afraid_of_zombies@lemmy.world
          link
          fedilink
          arrow-up
          4
          arrow-down
          1
          ·
          11 months ago

          Nah. Use my image and pay me what I want. If I can’t make a Mickey Mouse movie, they shouldn’t be able to make a porn starring me. Does a corporation have more rights to an image than I have to my own image?

          • jimbo@lemmy.world
            link
            fedilink
            arrow-up
            1
            ·
            edit-2
            11 months ago

            That really depends on what you consider “using my image”. Are you going to demand that people pay you because you were wandering around in the background of their family photo or their YouTube video? Will you ask to be compensated when people post group photos that include you on their social media? Does mom owe you money for all those pictures she took of you as a kid?

      • Treczoks@lemm.ee
        link
        fedilink
        arrow-up
        6
        arrow-down
        18
        ·
        11 months ago

        You already need consent to take a person’s picture. Did it help in this case? I don’t think so.

          • Treczoks@lemm.ee
            link
            fedilink
            arrow-up
            4
            arrow-down
            3
            ·
            11 months ago

            Sorry, I forgot that the US is decades behind the rest of the world in privacy laws.

            Well, maybe you could start with this aspect.

        • afraid_of_zombies@lemmy.world
          link
          fedilink
          arrow-up
          3
          arrow-down
          8
          ·
          11 months ago

          Really? Please show me the signed and notarized letter with the girl’s name on it that says they agree to have their image used for AI porn. Also since she is a minor her legal guardians.

          • CommanderCloon@lemmy.ml
            link
            fedilink
            arrow-up
            6
            arrow-down
            1
            ·
            11 months ago

            How would you possibly enforce that, or prevent people from just copying publicly available pictures for nefarious usage?

            • afraid_of_zombies@lemmy.world
              link
              fedilink
              arrow-up
              2
              arrow-down
              4
              ·
              11 months ago

              It would have to be enforced after getting caught, as an add-on charge. Like if an area has a rule against picking locks to commit a crime: you can never be charged with it alone, but you can add it on to existing charges.

  • leaky_shower_thought@feddit.nl
    link
    fedilink
    arrow-up
    19
    arrow-down
    2
    ·
    11 months ago

    reading this, I don’t really know what is supposed to be protected here to be deemed possible of protections in the first place.

    closest reasonable one is the girl’s “identity”, so it could be fraud. but it’s not used to fool people. more likely, those getting the pics already know this is AI generated.

    so might be defamation?

    the image generation tech is already easily accessible so the girl’s picture being easily accessible might be the weakest link?

      • DarkGamer@kbin.social
        link
        fedilink
        arrow-up
        6
        arrow-down
        2
        ·
        edit-2
        11 months ago

        Thanks for the valuable contribution to this discussion! It does appear this is a question of identity and personality rights, regarding how one wants to be portrayed.

        Reading that article though, it seems like that only applies to commercial purposes. If one is making deep fakes for their own non-commercial private use, it doesn’t appear personality rights apply.

  • virock@lemmy.world
    link
    fedilink
    arrow-up
    19
    arrow-down
    2
    ·
    11 months ago

    I studied Computer Science so I know that the only way to teach an AI agent to stop drawing naked girls is to… give it pictures of naked girls so it can learn what not to draw :(

    • rustydomino@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      11 months ago

      hmmm - I wonder if it makes sense to use generative AI to create negative training data for things like CP. That would essentially be a victimless way to train the AIs. Of course, that creates the conundrum of who actually verifies the AI-generated training data…

      • gohixo9650@discuss.tchncs.de
        link
        fedilink
        arrow-up
        2
        ·
        11 months ago

        this doesn’t work. AI still needs to know what CP is in order to create CP for negative use, so you need to first feed it CP. A recent example of how OpenAI was labelling “bad text”:

        The premise was simple: feed an AI with labeled examples of violence, hate speech, and sexual abuse, and that tool could learn to detect those forms of toxicity in the wild. That detector would be built into ChatGPT to check whether it was echoing the toxicity of its training data, and filter it out before it ever reached the user. It could also help scrub toxic text from the training datasets of future AI models.

        To get those labels, OpenAI sent tens of thousands of snippets of text to an outsourcing firm in Kenya, beginning in November 2021. Much of that text appeared to have been pulled from the darkest recesses of the internet. Some of it described situations in graphic detail like child sexual abuse, bestiality, murder, suicide, torture, self harm, and incest.

        source: https://time.com/6247678/openai-chatgpt-kenya-workers/

  • renrenPDX@lemmy.world
    link
    fedilink
    arrow-up
    15
    arrow-down
    1
    ·
    11 months ago

    This is treading on some dangerous waters. Kids need to realize this is way too close to basically creating underage pornography/trafficking.

  • interceder270@lemmy.world
    link
    fedilink
    arrow-up
    13
    arrow-down
    7
    ·
    11 months ago

    I think the best way to combat this is to ostracize anyone who participates in it.

    Let it be a litmus test to see who is and is not worth hanging out with.

    • MotoAsh@lemmy.world
      link
      fedilink
      arrow-up
      15
      ·
      11 months ago

      The problem with that plan is there are too many horrible people in the world. They’ll just group up and keep going. Horrible people don’t stop over mere inconvenience.

      • interceder270@lemmy.world
        link
        fedilink
        arrow-up
        1
        arrow-down
        4
        ·
        edit-2
        11 months ago

        Yeah. Those horrible people can have a shitty life surrounded by other horrible people.

        Let them be horrible together and we can focus on the people who matter.

        • yamanii@lemmy.world
          link
          fedilink
          arrow-up
          4
          arrow-down
          1
          ·
          11 months ago

          Just like Nazis won’t go away just because you ignore them, it’s the same thing here.

    • Flying Squid@lemmy.world
      link
      fedilink
      arrow-up
      10
      arrow-down
      1
      ·
      11 months ago

      These deepfakes don’t disappear. You can ostracize all you like, but that won’t stop these from potentially haunting girls for the rest of their lives.

      I don’t know what the solution is, honestly.

      • calypsopub@lemmy.world
        link
        fedilink
        arrow-up
        4
        arrow-down
        4
        ·
        11 months ago

        Why should it haunt them? Even if the images were REAL, why should it haunt them? I’m so tired of the puritanical shame women are supposed to feel about their bodies. We all have the same basic equipment. If a guy makes a deep fake, it is HE who should feel shame and humiliation for being a sick pervert. Girls need to be taught this. Band together and laugh these idiots off campus. Name and shame online. Make sure HE will be the one haunted forever.

        • Flying Squid@lemmy.world
          link
          fedilink
          arrow-up
          7
          arrow-down
          1
          ·
          11 months ago

          I don’t mean psychologically haunt them, I mean follow them for the rest of their lives affecting things like jobs and relationships. It doesn’t matter whether or not they’re fake if people don’t think they’re fake.

          Naming and shaming who did this to them will not stop them from being fired from their schoolteaching job in 15 years when the school discovers those images. Do you think “those were fake” is going to be enough for the school corporation if it’s in, for example, Arkansas?

      • crashoverride@lemmy.world
        link
        fedilink
        arrow-up
        5
        arrow-down
        5
        ·
        11 months ago

        The solution is for no one to care or make a big deal out of it; they’re not real, so you shouldn’t care.