- cross-posted to:
- technology@beehaw.org
This is so stupid. You shouldn’t have to “jailbreak” these systems. The information is already out there with a Google search.
One of the 6 described methods:
The model is prompted to explain refusals and rewrite the prompt iteratively until it complies.
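That iterative refusal-and-rewrite loop can be sketched roughly as below. This is a minimal illustration only: `query_model` is a hypothetical stand-in for a real LLM API call (stubbed here so the control flow runs end to end), and the refusal markers are assumed, not taken from the article.

```python
# Sketch of the described method: on refusal, ask the model to explain,
# then rewrite the prompt and retry until it complies or a budget runs out.

REFUSAL_MARKERS = ("i can't", "i cannot", "i won't", "as an ai")

def query_model(prompt: str) -> str:
    # Hypothetical stub standing in for a real LLM API:
    # it "refuses" until the prompt has been rewritten twice.
    if prompt.count("[rewritten]") < 2:
        return "I can't help with that."
    return "Sure, here is the information."

def is_refusal(reply: str) -> bool:
    return any(marker in reply.lower() for marker in REFUSAL_MARKERS)

def iterative_rewrite(prompt: str, max_rounds: int = 5) -> str:
    """Loop: query, and on refusal ask for an explanation and rewrite
    the prompt, until the model complies or the round budget is spent."""
    reply = query_model(prompt)
    for _ in range(max_rounds):
        if not is_refusal(reply):
            return reply
        # In the described attack the model itself explains the refusal
        # and suggests a rephrasing; both steps are stubbed here.
        _explanation = query_model(f"Why did you refuse: {prompt!r}?")
        prompt = f"[rewritten] {prompt}"
        reply = query_model(prompt)
    return reply

print(iterative_rewrite("do the thing"))
```

With the stub above, the loop converges after two rewrites; against a real model the refusal check and rewrite step would both be model-driven rather than string matching.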