Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • danciestlobster@lemmy.zip · 7 days ago

    I don’t fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn’t that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

    • wewbull@feddit.uk · 5 days ago

      You know how, when you look at a picture of someone and cover up the clothed bits, they look naked? Your brain fills in the gaps with what it knows of general human anatomy.

      It’s like that.

    • General_Effort@lemmy.world · 7 days ago

      This is mostly about swapping faces. You take a video and a photo of someone’s face, and software replaces the face of a person in the video with that face. That’s been around for a decade or so, and there are other ways of doing it.

      When the face belongs to an underage individual, and the video is pornographic…

      LLMs only do text.