diff --git a/app/_hub/kong-inc/ai-request-transformer/overview/_index.md b/app/_hub/kong-inc/ai-request-transformer/overview/_index.md
index 92b4481d63ba..201efe8e0369 100644
--- a/app/_hub/kong-inc/ai-request-transformer/overview/_index.md
+++ b/app/_hub/kong-inc/ai-request-transformer/overview/_index.md
@@ -9,7 +9,7 @@
 This plugin supports `llm/v1/chat` style requests for all of the same providers
 
 It also uses all of the same configuration and tuning parameters as the AI Proxy plugin, under the [`config.llm`](/hub/kong-inc/ai-request-transformer/configuration/#config-llm) block.
 
-The AI Request Transformer plugin runs **after** all of the [AI Prompt](/hub/?search=ai%2520prompt) plugins, but **before** the
+The AI Request Transformer plugin runs **before** all of the [AI Prompt](/hub/?search=ai%2520prompt) plugins and the AI Proxy plugin, allowing it to also introspect LLM requests against the same, or a different, LLM.
 
 ## How it works