A 13-year-old girl at a Louisiana middle school got into a fight with classmates who were sharing AI-generated nude images of her
The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.
Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.
“That’s when I got angry,” the eighth grader recalled at her discipline hearing.
Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.
Are you kidding me? You can’t ignore rape, lol.
You also can’t ignore fake porn being made of you, that’s the point they were making. Do you know how violating that is? People still tell women to ignore sexual harassment and rape all the fucking time. None of these are solutions except for those who want to sweep these issues under the rug.
Removed by mod
Holy fucking christ that is the biggest steaming dump of a take I’ve ever seen.
Not if it’s being distributed to others or you are being harassed by it.
Basically, if the subject even knows it has been done, then it's a bigger problem than the material itself.
If, hypothetically, a boy ran a local model to generate such material for himself without ever sharing it, then, well, it's obviously going to be ignored, because no one else in the world even knows it exists. The moment another person becomes a party to the material, it is injurious to the subject.