• smooth_tea@lemmy.world

    I really find this a bit alarmist and exaggerated. Consider the motive and the alternative. Do you really think companies like that have any option other than to deal with those things?

    • barsoap@lemm.ee

      Very much yes, police authorities have CSAM databases. If what you want to do with it really is above board and sensible, they'll let you access that stuff.

      I don’t doubt that anything OpenAI could do with that stuff could be above board, but sensible is another question: any model that can detect something can be used to train a model that can generate it. As such, those models are kept under lock and key just like their training sets, at the (social) media platforms that have a use for these things and the resources to run them, under the watchful eye of the authorities. Think faceboogle. OpenAI could, in principle, try to get into the business of selling companies at that scale models they can train, and have trained, themselves, but I don’t really see that making sense from a business POV, either.