Hacker News | rafiste's submissions
1. GPT-4 provides little protection against leaking system prompts (twitter.com/alexalbert__)
   4 points by rafiste on April 12, 2023
2. Using prompt compression to create the most complex GPT-4 jailbreak ever made (twitter.com/alexalbert__)
   8 points by rafiste on April 5, 2023 | 1 comment
3. A token-smuggling jailbreak for ChatGPT-4 (twitter.com/alexalbert__)
   444 points by rafiste on March 16, 2023 | 268 comments
4. Jailbreak Chat: A collection of ChatGPT jailbreaks (jailbreakchat.com)
   1118 points by rafiste on Feb 28, 2023 | 528 comments
