misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
Waluigis_Talking_Buttplug@lemmy.world · 2 years ago
That’s not how it works. It isn’t a single banned word, and you can’t work around it by tricking the AI. Once it starts repeating a response, it stops and gives a warning.
firecat@kbin.social · 2 years ago
Then don’t make it repeat; command it to make new words instead.
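The behavior described above (generation halting with a warning once the output degenerates into repetition) could plausibly be implemented with a simple tail-repetition check on the token stream. This is purely a hypothetical sketch for illustration; OpenAI has not published how its guard actually works, and the function and threshold here are invented:

```python
# Hypothetical sketch of a repetition guard like the one described in
# the comment above: stop emitting tokens once the tail of the output
# is the same token repeated beyond a threshold. Not OpenAI's actual
# implementation; names and threshold are assumptions.

def should_stop(tokens, max_repeats=10):
    """Return True if the last `max_repeats` tokens are all identical."""
    if len(tokens) < max_repeats:
        return False
    return len(set(tokens[-max_repeats:])) == 1

# Simulate a model asked to repeat one word "forever".
stream = ["poem"] * 50
emitted = []
for tok in stream:
    emitted.append(tok)
    if should_stop(emitted):
        # A real service would surface a warning to the user here.
        warning = "This content may violate our terms of use."
        break
```

With this sketch, generation is cut off after the tenth repeated token rather than continuing indefinitely, which matches the observed "stop and give a warning" behavior.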
[deleted by creator]