• aesthelete@lemmy.world · 2 days ago

    Yeah, they probably fed it a bunch of legitimate on/off content, as well as stuff from people whose creepy hobby was making “nudes” from celebrity photos in sheer/skimpy outfits.

      • Allero@lemmy.today · 2 days ago

        Honestly, I’d love to see more research on how AI CSAM consumption affects consumption of real CSAM and rates of sexual abuse.

        Because if it does reduce them, it might make sense to intentionally use datasets already involved in previous police investigations as training data. But only if there’s a clear reduction effect with AI materials.

        (Police have already used some such materials, with victims’ consent, to crack down on CSAM-sharing platforms in the past.)