diff --git a/_posts/2023-10-31-mitigating-prompt-injections.md.md b/_posts/2023-10-31-mitigating-prompt-injections.md.md
index f71e38fab8..c5956e3c9c 100644
--- a/_posts/2023-10-31-mitigating-prompt-injections.md.md
+++ b/_posts/2023-10-31-mitigating-prompt-injections.md.md
@@ -1,8 +1,9 @@
 ---
-title: Mitigating prompt injections on Generate AI systems
+title: Mitigating prompt injections on Generative AI systems
 date: 2023-10-31 00:00:00 Z
 categories:
 - Tech
+- Artificial Intelligence
 summary: We demonstrate our web app used for experimenting with different types of prompt injection attacks and mitigations on LLMs and how easy it can be to hack GPT through malicious prompts.