• sachabe@lemmy.world
    6 months ago

    So the only thing the article says is:

    The Model Spec document says NSFW content “may include erotica, extreme gore, slurs, and unsolicited profanity.” It is unclear if OpenAI’s explorations of how to responsibly make NSFW content envisage loosening its usage policy only slightly, for example to permit generation of erotic text, or more broadly to allow descriptions or depictions of violence.

    … and somehow Wired turned it into “OpenAI wants to generate porn”.

    This is just pure clickbait.

    • DarkThoughts@fedia.io
      6 months ago

      Erotic text messages could be considered pornographic work, I guess, like erotic literature. But I think they're just starting to realize how many of their customers jailbreak GPT for that specific purpose, and how good the alternatives that allow this type of chat, such as NovelAI, have gotten. Given how many other AI services started to censor things, how much that affected their models (like your chat bot partner getting stuck in consent messages as soon as you went anywhere slightly outside vanilla territory), and how much drama that has caused throughout those communities, I highly doubt that "loosening" their policy is going to be enough to sway people toward them instead of the competition.

      • yamanii@lemmy.world
        6 months ago

        After experiencing janitor AI and local models, I'm certainly not coming back to character AI. Why waste so much time trying to jailbreak a censored model when we have ones that just do as they're told?

        • DarkThoughts@fedia.io
          6 months ago

          Janitor, like most "free" models, degrades too quickly for my liking. And if I pay, I might as well use NovelAI + Sillytavern, since they don't have any restrictions on their text gen models that could interfere with their generation. As for local models, I didn't have much luck getting them to run, and I suspect they'd be pretty slow too.