• Clbull@lemmy.world · 1 year ago

    Isn’t CSAM classed as images and videos which depict child sexual abuse? Last time I checked, written descriptions alone did not count, unless they were being forced to look at AI-generated image prompts of such acts?