custom model docs
Guaris authored and tysoekong committed Sep 27, 2024
1 parent 7601490 commit 54dbf46
Showing 14 changed files with 81 additions and 0 deletions.
@@ -66,3 +66,4 @@ curl -X POST http://localhost:8000/anthropic-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model-advanced.md %}
@@ -77,3 +77,5 @@ curl -X POST http://localhost:8000/azure-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```

{% include_cached /md/plugins-hub/ai-custom-model-advanced.md %}
@@ -64,3 +64,5 @@ curl -X POST http://localhost:8000/cohere-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```

{% include_cached /md/plugins-hub/ai-custom-model-advanced.md %}
@@ -112,3 +112,4 @@ curl -X POST http://localhost:8000/llama2-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model-advanced.md %}
@@ -103,3 +103,4 @@ curl -X POST http://localhost:8000/mistral-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model-advanced.md %}
@@ -55,6 +55,7 @@ formats:
{% endplugin_example %}
<!--vale on-->


### Test the configuration

Make an `llm/v1/chat` type request to test your new endpoint:
@@ -64,3 +65,4 @@ curl -X POST http://localhost:8000/openai-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model-advanced.md %}
@@ -82,3 +82,4 @@ curl -X POST http://localhost:8000/anthropic-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model.md %}
@@ -89,3 +89,4 @@ curl -X POST http://localhost:8000/azure-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model.md %}
@@ -76,3 +76,5 @@ curl -X POST http://localhost:8000/cohere-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```

{% include_cached /md/plugins-hub/ai-custom-model.md %}
@@ -123,3 +123,5 @@ curl -X POST http://localhost:8000/llama2-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```

{% include_cached /md/plugins-hub/ai-custom-model.md %}
@@ -115,3 +115,4 @@ curl -X POST http://localhost:8000/mistral-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model.md %}
@@ -76,3 +76,4 @@ curl -X POST http://localhost:8000/openai-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```
{% include_cached /md/plugins-hub/ai-custom-model.md %}
32 changes: 32 additions & 0 deletions app/_includes/md/plugins-hub/ai-custom-model-advanced.md
@@ -0,0 +1,32 @@

For all providers, the Kong AI Proxy Advanced plugin attaches to **route** entities.

### Custom model

You can configure the AI Proxy Advanced plugin to use a custom model of your choice by setting the model's `name` and its `upstream_url` option.

<!--vale off-->
{% plugin_example %}
plugin: kong-inc/ai-proxy-advanced
name: ai-proxy-advanced
config:
  targets:
  - route_type: "llm/v1/chat"
    auth:
      header_name: "Authorization"
      header_value: "Bearer <openai_key>"
    model:
      name: custom_model_name
      provider: openai|azure|anthropic|cohere|mistral|llama2|gemini|bedrock
      options:
        upstream_url: http://localhost:8000/v1/chat/completions
targets:
  - route
formats:
  - curl
  - konnect
  - yaml
  - kubernetes
  - terraform
{% endplugin_example %}
<!--vale on-->
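
You can then test the custom model the same way as the provider examples above. This is a minimal sketch that assumes the plugin is attached to a route exposed at `/custom-chat` on the Kong proxy; adjust the path, host, and port to match your setup:

```
# /custom-chat is a placeholder route path; replace it with your own route
curl -X POST http://localhost:8000/custom-chat \
  -H 'Content-Type: application/json' \
  --data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```

Kong proxies the request to the `upstream_url` you configured for the custom model and returns that endpoint's chat completion.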
32 changes: 32 additions & 0 deletions app/_includes/md/plugins-hub/ai-custom-model.md
@@ -0,0 +1,32 @@

For all providers, the Kong AI Proxy plugin attaches to **route** entities.

### Custom model

You can configure the AI Proxy plugin to use a custom model of your choice by setting the model's `name` and its `upstream_url` option.

<!--vale off-->
{% plugin_example %}
plugin: kong-inc/ai-proxy
name: ai-proxy
config:
  route_type: "llm/v1/chat"
  auth:
    header_name: "Authorization"
    header_value: "Bearer <openai_key>"
  model:
    name: custom_model_name
    provider: openai|azure|anthropic|cohere|mistral|llama2|gemini|bedrock
    options:
      upstream_url: http://localhost:8000/v1/chat/completions
targets:
  - route
formats:
  - curl
  - konnect
  - yaml
  - kubernetes
  - terraform
{% endplugin_example %}
<!--vale on-->
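
As with the provider-specific examples above, you can verify the custom model by making an `llm/v1/chat` request through the proxy. A minimal sketch, assuming the plugin is attached to a route exposed at `/custom-chat` (substitute your own route path, host, and port):

```
# /custom-chat is a placeholder route path; replace it with your own route
curl -X POST http://localhost:8000/custom-chat \
  -H 'Content-Type: application/json' \
  --data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
```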
