• Cornpop@lemmy.world
    · 30 days ago
    This is so stupid. You shouldn’t have to “jailbreak” these systems. The information is already out there with a Google search.

  • A_A@lemmy.world
    · edited · 30 days ago

    One of the 6 described methods:
    The model is prompted to explain refusals and rewrite the prompt iteratively until it complies.