Update OpenAI and Azure OpenAI connector setup guides (#5358)
* Update OpenAI connector setup guide

* Updates Azure OpenAI guide

* adds serverless changes

* updates verbiage

* minor update

* Update docs/serverless/assistant/connect-to-openai.mdx

Co-authored-by: Nastasha Solomon <[email protected]>

* Update docs/assistant/connect-to-openai.asciidoc

Co-authored-by: Nastasha Solomon <[email protected]>

* Update docs/assistant/azure-openai-setup.asciidoc

Co-authored-by: Nastasha Solomon <[email protected]>

* Update docs/serverless/assistant/connect-to-azure-openai.mdx

Co-authored-by: Nastasha Solomon <[email protected]>

---------

Co-authored-by: Nastasha Solomon <[email protected]>
benironside and nastasha-solomon authored Jun 10, 2024
1 parent ea8158c commit ee612dc
Showing 4 changed files with 9 additions and 7 deletions.
2 changes: 1 addition & 1 deletion docs/assistant/azure-openai-setup.asciidoc
@@ -72,7 +72,7 @@ Now, set up the Azure OpenAI model:
** If you select `gpt-4`, set the **Model version** to `0125-Preview`.
** If you select `gpt-4-32k`, set the **Model version** to `default`.
+
- IMPORTANT: The models available to you will depend on https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#model-summary-table-and-region-availability[region availability]. For best results, use `GPT 4 Turbo version 0125-preview` or `GPT 4-32k` with the maximum Tokens-Per-Minute (TPM) capacity. In most regions, the GPT 4 Turbo model offers the largest supported context window.
+ IMPORTANT: The models available to you depend on https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#model-summary-table-and-region-availability[region availability]. For best results, use `GPT-4o 2024-05-13` with the maximum Tokens-Per-Minute (TPM) capacity. For more information on how different models perform for different tasks, refer to the <<llm-performance-matrix>>.
+
. Under **Deployment type**, select **Standard**.
. Name your deployment.
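For anyone following the Azure setup steps above, one way to confirm the new deployment responds before creating the Kibana connector is to call its chat completions endpoint directly. The following is a minimal sketch, assuming placeholder resource and deployment names and a generally available `api-version`; substitute the values from your own Azure OpenAI resource.

```python
# Sketch: send a test chat completion to an Azure OpenAI deployment.
# The resource name, deployment name, and api-version are placeholders,
# not values taken from this commit.
import os
import requests

resource = "my-azure-openai-resource"   # hypothetical resource name
deployment = "my-gpt-4o-deployment"     # hypothetical deployment name
api_version = "2024-02-01"              # assumed GA api-version; check your region

url = (
    f"https://{resource}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version={api_version}"
)

response = requests.post(
    url,
    headers={"api-key": os.environ["AZURE_OPENAI_API_KEY"]},
    json={"messages": [{"role": "user", "content": "Reply with OK."}]},
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```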
3 changes: 2 additions & 1 deletion docs/assistant/connect-to-openai.asciidoc
@@ -12,7 +12,7 @@ This page provides step-by-step instructions for setting up an OpenAI connector

Before creating an API key, you must choose a model. Refer to the https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4[OpenAI docs] to select a model. Take note of the specific model name (for example `gpt-4-turbo`); you'll need it when configuring {kib}.

- NOTE: `GPT-4 Turbo` offers increased performance. `GPT-4` and `GPT-3.5` are also supported.
+ NOTE: `GPT-4o` offers increased performance over previous versions. For more information on how different models perform for different tasks, refer to the <<llm-performance-matrix>>.

[discrete]
=== Create an API key
@@ -51,6 +51,7 @@ To integrate with {kib}:
. Provide a name for your connector, such as `OpenAI (GPT-4 Turbo Preview)`, to help keep track of the model and version you are using.
. Under **Select an OpenAI provider**, choose **OpenAI**.
. The **URL** field can be left as default.
+ . Under **Default model**, specify which https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4[model] you want to use.
. Paste the API key that you created into the corresponding field.
. Click **Save**.

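The connector that the numbered steps above create through the UI can also be created with Kibana's Connectors API. The sketch below is a rough illustration rather than a recipe from these docs: the `.gen-ai` connector type id, the config and secrets field names, and the Kibana URL and credentials are assumptions to verify against your own Kibana version.

```python
# Sketch: create an OpenAI connector via Kibana's Connectors API.
# Connector type id and field names are assumptions; verify before use.
import os
import requests

KIBANA_URL = "https://my-kibana.example.com"   # hypothetical Kibana endpoint

response = requests.post(
    f"{KIBANA_URL}/api/actions/connector",
    headers={
        "kbn-xsrf": "true",
        "Authorization": f"ApiKey {os.environ['KIBANA_API_KEY']}",
    },
    json={
        "name": "OpenAI (GPT-4 Turbo Preview)",
        "connector_type_id": ".gen-ai",
        "config": {
            "apiProvider": "OpenAI",
            "apiUrl": "https://api.openai.com/v1/chat/completions",
            "defaultModel": "gpt-4-turbo",
        },
        "secrets": {"apiKey": os.environ["OPENAI_API_KEY"]},
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["id"])   # id of the newly created connector
```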
2 changes: 1 addition & 1 deletion docs/serverless/assistant/connect-to-azure-openai.mdx
@@ -56,7 +56,7 @@ Now, set up the Azure OpenAI model:
8. Click **Create**.

<DocCallOut title="Important" color="warning">
- The models available to you will depend on [region availability](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#model-summary-table-and-region-availability). For best results, use `GPT 4 Turbo version 0125-preview` or `GPT 4-32k` with the maximum Tokens-Per-Minute (TPM) capacity. In most regions, the GPT 4 Turbo model offers the largest supported context window.
+ The models available to you will depend on [region availability](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#model-summary-table-and-region-availability). For best results, use `GPT-4o 2024-05-13` with the maximum Tokens-Per-Minute (TPM) capacity. For more information on how different models perform for different tasks, refer to the <DocLink id="llm-performance-matrix" text="LLM performance matrix"/>.
</DocCallOut>

The following video demonstrates these steps.
9 changes: 5 additions & 4 deletions docs/serverless/assistant/connect-to-openai.mdx
@@ -18,7 +18,7 @@ This page provides step-by-step instructions for setting up an OpenAI connector
Before creating an API key, you must choose a model. Refer to the [OpenAI docs](https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4) to select a model. Take note of the specific model name (for example `gpt-4-turbo`); you'll need it when configuring ((kib)).

<DocCallOut title="Note">
- `GPT-4 Turbo` offers increased performance. `GPT-4` and `GPT-3.5` are also supported.
+ `GPT-4o` offers increased performance over previous versions. For more information on how different models perform for different tasks, refer to the <DocLink id="llm-performance-matrix" text="LLM performance matrix"/>.
</DocCallOut>

### Create an API key
@@ -43,9 +43,10 @@ To integrate with ((kib)):
2. Navigate to **Stack Management → Connectors → Create Connector → OpenAI**.
3. Provide a name for your connector, such as `OpenAI (GPT-4 Turbo Preview)`, to help keep track of the model and version you are using.
4. Under **Select an OpenAI provider**, choose **OpenAI**.
- 5. The **URL** field can generally be left unchanged.
- 6. Enter the API key that you previously created in the corresponding field.
- 7. Click **Save**.
+ 5. The **URL** field can be left as default.
+ 6. Under **Default model**, specify which [model](https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4) you want to use.
+ 7. Paste the API key that you created into the corresponding field.
+ 8. Click **Save**.

The following video demonstrates these steps.

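Both connector guides ask you to pick a model before creating the API key. One quick way to see which models an OpenAI API key can actually access is OpenAI's public models endpoint; a small sketch follows, with the environment variable name as an assumption.

```python
# Sketch: list the models available to an OpenAI account, to help choose
# the "Default model" value when configuring the connector.
import os
import requests

response = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    timeout=30,
)
response.raise_for_status()
for model in response.json()["data"]:
    print(model["id"])   # e.g. gpt-4o, gpt-4-turbo, ...
```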
