feat(plugins): ai-prompt-decorator-plugin #12336
Conversation
Couple of notes, but this looks very close to merge-ready.
There are concerns about this plugin design.
Need to fix the plugin name in kong/plugins/ai-prompt-decorator/schema.lua, and then this is merge-ready 👍
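(For reference: a Kong plugin's schema `name` must exactly match its directory name under kong/plugins/. Below is a minimal sketch of what the corrected schema.lua might look like; the `prompts.prepend` / `prompts.append` config shape is an assumption for illustration, not necessarily the PR's actual schema.)

```lua
-- kong/plugins/ai-prompt-decorator/schema.lua (illustrative sketch)
local typedefs = require "kong.db.schema.typedefs"

-- One llm/v1/chat message: a role plus its content.
-- (Field shape assumed for illustration.)
local chat_message = {
  type = "record",
  fields = {
    { role = { type = "string", required = true } },
    { content = { type = "string", required = true } },
  },
}

return {
  -- Must match the plugin directory, kong/plugins/ai-prompt-decorator/.
  name = "ai-prompt-decorator",
  fields = {
    { protocols = typedefs.protocols_http },
    { config = {
        type = "record",
        fields = {
          { prompts = {
              type = "record",
              fields = {
                -- Messages injected before the caller's chat history.
                { prepend = { type = "array", elements = chat_message } },
                -- Messages injected after the caller's chat history.
                { append = { type = "array", elements = chat_message } },
              },
          } },
        },
    } },
  },
}
```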
Successfully created cherry-pick PR for
Summary
This commit adds another plugin that extends the functionality of the AI Proxy plugin from #12207. It inserts an array of "llm/v1/chat" messages at either the front or the end of a caller's chat history. This allows complex prompt engineering on behalf of the caller.
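As a rough illustration of that behavior, here is a minimal sketch of the decoration step in Lua (the `decorate` function name, the `conf.prompts.prepend` / `conf.prompts.append` fields, and the request body shape are assumptions for illustration, not the plugin's actual code):

```lua
-- Splice configured llm/v1/chat messages around the caller's history.
local function decorate(request_body, conf)
  local messages = request_body.messages or {}

  -- Insert configured messages at the front, preserving their order
  -- (iterating in reverse so prepend[1] ends up first).
  local prepend = conf.prompts and conf.prompts.prepend
  if prepend then
    for i = #prepend, 1, -1 do
      table.insert(messages, 1, prepend[i])
    end
  end

  -- Add configured messages after the caller's existing history.
  local append = conf.prompts and conf.prompts.append
  if append then
    for _, msg in ipairs(append) do
      messages[#messages + 1] = msg
    end
  end

  request_body.messages = messages
  return request_body
end
```

For example, configuring a single prepended { role = "system", content = "..." } message would let an operator pin a system prompt onto every request routed through Kong, regardless of what the caller sends.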
The reason for its development is that many of our users would like to set up specific prompt history, inject set words or phrases, or otherwise more tightly control how an AI/LLM model is used when it is called via Kong. This applies especially to the AI Proxy plugin, and this new plugin simplifies that process.
Checklist
- Changelog file created under changelog/unreleased/kong, or skip-changelog label added on PR if a changelog is unnecessary.
- README.md