From 0a385f30fd9b9ab509b3be091d5e1add789cadeb Mon Sep 17 00:00:00 2001
From: Liam Thompson <32779855+leemthompo@users.noreply.github.com>
Date: Mon, 16 Sep 2024 14:18:22 +0100
Subject: [PATCH] [DOCS][Playground] Mention local model compatibility (#192911)

Note about openai sdk compatible local models + links to examples
---
 docs/playground/index.asciidoc | 17 +++++++++++++++--
 1 file changed, 15 insertions(+), 2 deletions(-)

diff --git a/docs/playground/index.asciidoc b/docs/playground/index.asciidoc
index f475c3e2747a2..efb9b6261d8dd 100644
--- a/docs/playground/index.asciidoc
+++ b/docs/playground/index.asciidoc
@@ -89,6 +89,17 @@ a|
 
 |===
 
+[[playground-local-llms]]
+[TIP]
+====
+You can also use locally hosted LLMs that are compatible with the OpenAI SDK.
+Once you've set up your LLM, you can connect to it using the OpenAI connector.
+Refer to the following for examples:
+
+* {security-guide}/connect-to-byo-llm.html[Using LM Studio]
+* https://www.elastic.co/search-labs/blog/localai-for-text-embeddings[LocalAI with `docker-compose`]
+====
+
 [float]
 [[playground-getting-started]]
 == Getting started
@@ -101,13 +112,15 @@ image::get-started.png[width=600]
 === Connect to LLM provider
 
 To get started with {x}, you need to create a <> for your LLM provider.
-Follow these steps on the {x} landing page:
+You can also connect to <> which are compatible with the OpenAI API by using the OpenAI connector.
+
+To connect to an LLM provider, follow these steps on the {x} landing page:
 
 . Under *Connect to an LLM*, click *Create connector*.
 . Select your *LLM provider*.
 . *Name* your connector.
 . Select a *URL endpoint* (or use the default).
-. Enter *access credentials* for your LLM provider.
+. Enter *access credentials* for your LLM provider. (If you're running a locally hosted LLM using the OpenAI connector, you must input a value in the API key form, but the specific value doesn't matter.)
 
 [TIP]
 ====
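As background for the patch above: an "OpenAI SDK compatible" local server (such as LM Studio) exposes the same `POST /v1/chat/completions` endpoint the OpenAI connector talks to, and a placeholder API key is accepted because the local server typically ignores its value. The sketch below illustrates that request shape using only the Python standard library; the `localhost:1234` base URL, the `local-model` model name, and the placeholder key are assumptions for illustration, not values from the patch.

```python
import json
import urllib.request

# Assumed local endpoint for illustration (LM Studio's default port is 1234).
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(prompt: str, api_key: str = "placeholder-key") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for a local server.

    The connector requires *some* API key, but an OpenAI-compatible local
    server usually ignores its value, so any placeholder works.
    """
    body = json.dumps({
        "model": "local-model",  # local servers often ignore or alias this name
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Build (but do not send) a request; urllib.request.urlopen(req) would
# dispatch it once a local OpenAI-compatible server is actually running.
req = build_chat_request("Hello")
```

This is only a sketch of the wire format the OpenAI connector uses; in Playground itself you supply the URL endpoint and key through the connector form rather than writing any code.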