[DOCS][Playground] Mention local model compatibility (#192911)
Note about openai sdk compatible local models + links to examples
leemthompo authored Sep 16, 2024
1 parent a0973d6 commit 0a385f3
Showing 1 changed file with 15 additions and 2 deletions.
17 changes: 15 additions & 2 deletions docs/playground/index.asciidoc
@@ -89,6 +89,17 @@ a|

|===

[[playground-local-llms]]
[TIP]
====
You can also use locally hosted LLMs that are compatible with the OpenAI SDK.
Once you've set up your LLM, you can connect to it using the OpenAI connector.
Refer to the following for examples:

* {security-guide}/connect-to-byo-llm.html[Using LM Studio]
* https://www.elastic.co/search-labs/blog/localai-for-text-embeddings[LocalAI with `docker-compose`]
====
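"Compatible with the OpenAI SDK" means the local server accepts the same request shape as OpenAI's chat completions endpoint. A minimal sketch of that payload, assuming a hypothetical local server at `localhost:1234` (LM Studio's default port — adjust for your setup):

```python
import json

# Hypothetical local server URL; LM Studio defaults to port 1234.
base_url = "http://localhost:1234/v1"

# An OpenAI-compatible server accepts the same chat-completions
# request body that the hosted OpenAI API does.
payload = {
    "model": "local-model",  # placeholder; many local servers ignore it
    "messages": [
        {"role": "user", "content": "What is Elasticsearch?"},
    ],
}

print(f"{base_url}/chat/completions")
print(json.dumps(payload, indent=2))
```

Any client (or connector) that can talk to OpenAI's API can then be pointed at this URL instead.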

[float]
[[playground-getting-started]]
== Getting started
@@ -101,13 +112,15 @@ image::get-started.png[width=600]
=== Connect to LLM provider

To get started with {x}, you need to create a <<action-types,connector>> for your LLM provider.
You can also connect to <<playground-local-llms,locally hosted LLMs>> that are compatible with the OpenAI API by using the OpenAI connector.

To connect to an LLM provider, follow these steps on the {x} landing page:

. Under *Connect to an LLM*, click *Create connector*.
. Select your *LLM provider*.
. *Name* your connector.
. Select a *URL endpoint* (or use the default).
. Enter *access credentials* for your LLM provider. (If you're running a locally hosted LLM using the OpenAI connector, you must enter a value in the API key field, but the specific value doesn't matter.)

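The placeholder-key behavior in the last step can be sketched with Python's standard library — a locally hosted model typically ignores the key, so any value satisfies the form. This is illustrative only, not Kibana's own connector code, and the endpoint URL is an assumption for a typical local setup:

```python
import json
import urllib.request

# Hypothetical local OpenAI-compatible endpoint; adjust for your setup.
endpoint = "http://localhost:1234/v1/chat/completions"

# The connector form requires an API key, but a locally hosted model
# typically ignores it -- any placeholder value will do.
request = urllib.request.Request(
    endpoint,
    data=json.dumps({
        "model": "local-model",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer not-a-real-key",  # value is ignored locally
    },
    method="POST",
)

# The request is only constructed here, not sent.
print(request.get_header("Authorization"))
```
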
[TIP]
====
