A 13-year-old girl at a Louisiana middle school got into a fight with classmates who were sharing AI-generated nude images of her

The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.

Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.

“That’s when I got angry,” the eighth grader recalled at her discipline hearing.

Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

    • Fiery@lemmy.dbzer0.com · 3 hours ago

      The problem is that it’s impossible to carve out just this one capability. There don’t need to be any actual nude pictures of children in the training set for the model to figure out that a naked child is basically a naked adult, but smaller. (Of course I’m simplifying a bit.)

      Going even further and removing all nudity from the dataset has been tried, and what was found is that stripping out such a significant source of detailed pictures with a lot of visible skin degraded the quality of any generated image involving human anatomy.

      So the solution is not a simple “remove this from the training data”. (Not to mention that existing models which can already generate these kinds of pictures are impossible to globally disable, even if you could affect future ones.)

      As to what could actually be done: apply and keep improving scanning for such pictures (not on people’s phones, though [looking at you here, EU]). That’s the big factor here: it got shared on a very big social app, not on some fringe privacy-protecting app (on the distribution end there is little more you can do, short of eliminating all privacy). A rough sketch of what platform-side hash scanning looks like is at the end of this comment.

      Regulating this at the image-generation level could also be rather effective. There aren’t that many 13-year-olds savvy enough to set up a local model to generate these, so further checks at the services where the images are generated would also help to some degree (a sketch of such a gate is also at the end of this comment). Local generation is getting easier to set up by the day, though, so while this should be implemented, it won’t do everything.

      In conclusion: it’s very hard to eliminate this, but ways exist to make it harder.
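
      To make the scanning idea concrete, here is a minimal, purely illustrative Python sketch of platform-side matching against perceptual hashes of known abuse imagery. The blocklist values, threshold, and the `handle_upload` flow are made up for the example; real deployments use PhotoDNA/PDQ-style hashes plus human review, and hash matching only catches already-known images, not freshly generated ones.

      ```python
      # Illustrative only: scan an upload against a blocklist of perceptual hashes.
      # BLOCKLIST_HASHES and MAX_DISTANCE are placeholders, not any real platform's data.
      from PIL import Image
      import imagehash

      # Hypothetical hashes supplied by a clearinghouse (placeholder hex values).
      BLOCKLIST_HASHES = [imagehash.hex_to_hash(h) for h in (
          "8f373714acfcf4d0",
          "c3a5a5a5c3c3e7e7",
      )]
      MAX_DISTANCE = 8  # Hamming-distance threshold; tuning it is a policy decision.

      def matches_known_image(path: str) -> bool:
          """True if the upload is within MAX_DISTANCE bits of any blocklisted hash."""
          candidate = imagehash.phash(Image.open(path))
          return any(candidate - known <= MAX_DISTANCE for known in BLOCKLIST_HASHES)

      def handle_upload(path: str) -> str:
          # A real platform would quarantine and escalate to trained human reviewers,
          # not auto-delete or auto-report on a raw hash match.
          return "quarantine-for-review" if matches_known_image(path) else "allow"
      ```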
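
      And a similarly hand-wavy sketch of the generation-level gate: a hosted image service can refuse obviously disallowed prompts before spending any compute, then run a safety classifier on the output before returning it. `generate_image` and `minor_nsfw_score` are stand-ins for a real model backend and a real ML classifier, not actual APIs.

      ```python
      # Illustrative generation-time gate for a *hosted* image service.
      # generate_image() and minor_nsfw_score() are placeholders; this sketch
      # does not implement a real model backend or a real safety classifier.
      import re
      from typing import Optional

      BLOCKED_PROMPT_PATTERNS = [
          re.compile(r"\b(child|kid|minor|\d{1,2}\s*(yo|year[- ]?old)s?)\b", re.I),
      ]
      SAFETY_THRESHOLD = 0.2  # made-up number; real services tune this on labeled data

      def prompt_is_disallowed(prompt: str) -> bool:
          """Cheap keyword pre-check; a real service would pair it with an ML text classifier."""
          return any(p.search(prompt) for p in BLOCKED_PROMPT_PATTERNS)

      def generate_image(prompt: str) -> bytes:
          return b"...image bytes..."  # placeholder for the diffusion backend

      def minor_nsfw_score(image: bytes) -> float:
          return 0.0  # placeholder for a post-generation image safety classifier (0..1)

      def handle_request(prompt: str) -> Optional[bytes]:
          if prompt_is_disallowed(prompt):
              return None  # refuse before generating anything
          image = generate_image(prompt)
          if minor_nsfw_score(image) > SAFETY_THRESHOLD:
              return None  # refuse after the fact; optionally flag for human review
          return image
      ```

      Neither check helps against fully local generation, which is the limitation already noted above.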

    • ObjectivityIncarnate@lemmy.world · 4 hours ago

      You say this as if the US is the only place generative AI models exist.

      That said, the US (and basically every other) government is helpless against the tsunami of technology in general, much less global tech from companies in other countries.

      • Fedizen@lemmy.world · 4 hours ago

        I’m saying: why is it so easy for, like, 12-year-olds to find these sites? It’s not exactly a Pirate Bay situation - you can’t generate these kinds of AI videos with just a website copied off a USB stick and an IP address.

        These kinds of resources should be far easier to shut down access to than The Pirate Bay.

    • BarneyPiccolo@lemmy.today · 11 hours ago

      Because our country is literally being run by an actual pedophile ring.

      They’d be more likely to want to know how to do it themselves, than to stop it.

    • Taldan@lemmy.world · 11 hours ago

      Because money is the only thing we, as a country, truly care about. We’re only against things like CP and pedos as long as opposing them doesn’t get in the way of making money. It’s the same reason Trump sharing Larry Nassar and Jeffrey Epstein’s love of “young and nubile” women, as Epstein put it, didn’t kill his political career – he’s the pro-business candidate who makes the wealthy even wealthier.

      • Thebeardedsinglemalt@lemmy.world · 8 hours ago

        The orange Nazi could be raping a 12-year-old girl on national TV, but if he says it’s the libs and drag queens who are the rapists, his cult will put their domestic terrorist hats back on.

        • BarneyPiccolo@lemmy.today · 11 hours ago

          EVERYTHING is political these days; you just get tired of defending a corrupt, traitorous, racist, misogynist, ignorant, incompetent PEDOPHILE.

          And ANYONE who supports him is all of those same things themselves. Repeat: ALL MAGAs are corrupt, treasonous, racist, misogynist, ignorant, incompetent PEDOPHILES.

          That includes YOU. You are defending him, that makes YOU a PEDOPHILE.

          • jve@lemmy.world · 9 hours ago

            That includes YOU. You are defending him, that makes YOU a PEDOPHILE.

            I’m with you mostly, but words do mean things, even in this post-fact society.

            • BarneyPiccolo@lemmy.today · 9 hours ago

              I understand that, which is why I want to make it very clear that anyone who voted for Trump is a Pedophile.

              Don’t like it? Don’t vote for pedophiles.

              • jve@lemmy.world · 8 hours ago

                Sure, bud.

                I guess that makes all voters politicians, then?

                Or just voters that defend politicians?

                Not real clear how this transitive property is supposed to work.

                • BarneyPiccolo@lemmy.today · 8 hours ago

                  No, just voters who are MAGAs, a movement that supports and defends pedophiles as an official tentpole of its party philosophy.

                  It’s simple: Anyone who supports and defends pedophiles is a pedophile. If you vote MAGA, which is ANY right wing/conservative candidate, then you are a Pedophile.

                  It’s so simple, even a MAGA pedophile like you can understand it.

                  • jve@lemmy.world · 6 hours ago

                    Ah, so you’re just applying it to this one word.

                    So these same supporters are not con men, obstructionists, felons, rapists, liars, insurrectionists, fraudsters, racketeers, etc., etc.; just pedophiles?

                    It’s so simple, even a MAGA pedophile like you can understand it.

                    I’m not, but thanks for playing. Surely even a pedophile like you (I’ve changed the definition to include you) can understand that somebody can say “words mean things” without being a pedophile.

        • Echo Dot@feddit.uk · 19 hours ago

          Because the question was political. I’m sorry that you’ve got such a teeny tiny brain that you can’t work out that if somebody asks a political question then the response is necessarily going to be political. I don’t know how else to put it.

        • michaelmrose@lemmy.world · 20 hours ago

          There is no reason to believe Biden is a villain here; meanwhile, Trump was found to be a rapist in court.

      • papertowels@mander.xyz · 9 hours ago

        Snapchat allowing this on their platform is the insane part to me. How are they still operating if they’re letting CSAM on the platform??