I feel like our relationship to it is also quite messed up.
AI doesn’t actually undress people, it just draws a naked body. It’s an artistic representation, not an X-ray. You’re not getting actual nudes in this process, and AI has no clue what the person actually looks like naked.
Now, such images can be used to blackmail people, because again, our culture hasn’t quite caught up with the fact that any nude image can absolutely be an AI-generated fake. When it does, however, I fully expect creators of such things to be seen as odd creeps spreading their fantasies around, and any nude imagery to be seen as fake by default.
It’s not an artistic representation, it’s worse. It’s algorithmic, and to some extent it actually has a pretty good idea of what a person looks like naked based on their picture. That’s why it’s so disturbing.
Yeah, they probably fed it a bunch of legitimate on/off content, as well as stuff from people who used to make “nudes” from celebrity photos with sheer / skimpy outfits as a creepy hobby.
Also, CSAM in training data definitely is a thing.
Honestly, I’d love to see more research on how AI CSAM consumption affects consumption of real CSAM and rates of sexual abuse.
Because if it does reduce them, it might make sense to intentionally use datasets already involved in previous police investigations as training data. But only if there’s a clear reduction effect with AI materials.
(Police have already used some materials, with victims’ consent, to crack down on CSAM-sharing platforms in the past.)
Or we could like…not
Idk, calling it ‘art’ feels like a reach. At the end of the day, it’s using someone’s real face for stuff they never agreed to. Fake or not, that’s still a massive violation of privacy.