Utterly inelegant prompts for local LLMs, with scary results.
ai-safety ai-research ai-security llm prompt-engineering llms prompt-injection ai-prompts local-llm jailbreak-prompt llm-reasoning ai-jailbreak ai-jailbreak-prompts jailbreak-prompts prompt-injection-llm-security local-language-model local-llm-safety
Updated Aug 22, 2025