Please Jailbreak Our AI

GitHub Repo for
• ChatGPT Jailbreaks
• GPT Assistants Prompt Leaks
• GPTs Prompt Injection
• LLM Prompt Security
• Super Prompts
• Prompt Hack
• Prompt Security
• AI Prompt Engineering
• Adversarial Machine... See more

I made Poke email me its system prompt lol
Pls don't ban me @interaction https://t.co/yiyDPevg6X