[OPIK-271] Add integration with aws bedrock #529

Merged

merged 27 commits into main from OPIK-271-add-integration-with-aws-bedrock on Nov 5, 2024

Commits
0021cdb
Init bedrock namespace
alexkuzmik Oct 30, 2024
66a89ef
Merge branch 'main' into OPIK-271-add-integration-with-aws-bedrock
alexkuzmik Oct 30, 2024
282ff46
Initial implementation for bedrock converse integration (non-streamin…
alexkuzmik Oct 30, 2024
76fd4eb
Draft implementation for converse_stream support
alexkuzmik Oct 30, 2024
7b921f2
Update implementation for bedrock integration. Stream inside the resp…
alexkuzmik Oct 31, 2024
d8eeb40
Fix lint errors
alexkuzmik Oct 31, 2024
9fb61e7
Update _generators_handler
alexkuzmik Oct 31, 2024
62a9696
Fix lint errors
alexkuzmik Oct 31, 2024
8368f40
Add comments to SpanData and TraceData classes
alexkuzmik Oct 31, 2024
913f72a
Add comment to SpanData and TraceData class
alexkuzmik Oct 31, 2024
c7dca49
Add comment
alexkuzmik Oct 31, 2024
87a334e
created_by -> created_from for unification with other integrations
alexkuzmik Oct 31, 2024
c712c83
add docstrings
alexkuzmik Nov 4, 2024
50a5690
Merge branch 'main' into OPIK-271-add-integration-with-aws-bedrock
alexkuzmik Nov 4, 2024
4ddc82c
Add first draft of Bedrock cookbook
Lothiraldan Nov 4, 2024
d793b4c
Slight update to OpenAI cookbook to avoid recreating the client every…
Lothiraldan Nov 4, 2024
08622a7
Add streaming example to BedRock example
Lothiraldan Nov 4, 2024
341c978
Add streaming example to BedRock example
Lothiraldan Nov 4, 2024
cb92ac7
Apply suggestions from code review
Lothiraldan Nov 5, 2024
e12f3c6
Update bedrock cookbook
Lothiraldan Nov 5, 2024
47c6c81
Introduce Opik.search_spans (#545)
jverre Nov 4, 2024
ac3479b
[OPIK-309] Create prompt endpoint (#531)
thiagohora Nov 4, 2024
95b7540
Move piece of docstring to comment
alexkuzmik Nov 5, 2024
30c9c6a
Add protection from None in _generators_handler
alexkuzmik Nov 5, 2024
b879241
Fix track-stacking when using track_bedrock or track_openai multiple …
alexkuzmik Nov 5, 2024
7da9526
Fix output structure in bedrock chunks aggregator
alexkuzmik Nov 5, 2024
7628d32
Merge branch 'main' into OPIK-271-add-integration-with-aws-bedrock
alexkuzmik Nov 5, 2024
309 changes: 309 additions & 0 deletions apps/opik-documentation/documentation/docs/cookbook/bedrock.ipynb
@@ -0,0 +1,309 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Using Opik with AWS Bedrock\n",
"\n",
"Opik integrates with AWS Bedrock to provide a simple way to log traces for all Bedrock LLM calls. This works for all supported models, including if you are using the streaming API.\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Creating an account on Comet.com\n",
"\n",
"[Comet](https://www.comet.com/site?from=llm&utm_source=opik&utm_medium=colab&utm_content=bedrock&utm_campaign=opik) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm&utm_source=opik&utm_medium=colab&utm_content=bedrock&utm_campaign=opik) and grab you API Key.\n",
"\n",
"> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/?from=llm&utm_source=opik&utm_medium=colab&utm_content=bedrock&utm_campaign=opik) for more information."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install --upgrade opik boto3"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import opik\n",
"\n",
"opik.configure(use_local=False)"
]
},
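{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you prefer not to use the interactive prompt, you can also configure the SDK through environment variables. A minimal sketch, assuming `OPIK_API_KEY` and `OPIK_WORKSPACE` are the variables your SDK version reads; the placeholder values are yours to fill in:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"# Alternative to opik.configure(): supply credentials via environment variables.\n",
"# Uncomment and replace the placeholders with your own values.\n",
"# os.environ[\"OPIK_API_KEY\"] = \"<your-api-key>\"\n",
"# os.environ[\"OPIK_WORKSPACE\"] = \"<your-workspace>\""
]
},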
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Preparing our environment\n",
"\n",
"First, we will set up our bedrock client. Uncomment the following lines to pass AWS Credentials manually or [checkout other ways of passing credentials to Boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html). You will also need to request access to the model in the UI before being able to generate text, here we are gonna use the Llama 3.2 model, you can request access to it in [this page for the us-east1](https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/providers?model=meta.llama3-2-3b-instruct-v1:0) region."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import boto3\n",
"\n",
"REGION = \"us-east-1\"\n",
"\n",
"MODEL_ID = \"us.meta.llama3-2-3b-instruct-v1:0\"\n",
"\n",
"bedrock = boto3.client(\n",
" service_name=\"bedrock-runtime\",\n",
" region_name=REGION,\n",
" # aws_access_key_id=ACCESS_KEY,\n",
" # aws_secret_access_key=SECRET_KEY,\n",
" # aws_session_token=SESSION_TOKEN,\n",
")"
]
},
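{
"cell_type": "markdown",
"metadata": {},
"source": [
"To sanity-check your credentials and region before making a model call, you can list the foundation models visible to your account. A minimal sketch; note that this uses the separate `bedrock` control-plane client rather than the `bedrock-runtime` client created above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sanity check: list a few foundation models available in this account/region.\n",
"# This uses the \"bedrock\" control-plane client, not \"bedrock-runtime\".\n",
"bedrock_control = boto3.client(service_name=\"bedrock\", region_name=REGION)\n",
"for summary in bedrock_control.list_foundation_models()[\"modelSummaries\"][:5]:\n",
"    print(summary[\"modelId\"])"
]
},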
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Logging traces\n",
"\n",
"In order to log traces to Opik, we need to wrap our Bedrock calls with the `track_bedrock` function:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"from opik.integrations.bedrock import track_bedrock\n",
"\n",
"bedrock_client = track_bedrock(bedrock, project_name=\"bedrock-integration-demo\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"PROMPT = \"Why is it important to use a LLM Monitoring like CometML Opik tool that allows you to log traces and spans when working with LLM Models hosted on AWS Bedrock?\"\n",
"\n",
"response = bedrock_client.converse(\n",
" modelId=MODEL_ID,\n",
" messages=[{\"role\": \"user\", \"content\": [{\"text\": PROMPT}]}],\n",
" inferenceConfig={\"temperature\": 0.5, \"maxTokens\": 512, \"topP\": 0.9},\n",
")\n",
"print(\"Response\", response[\"output\"][\"message\"][\"content\"][0][\"text\"])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The prompt and response messages are automatically logged to Opik and can be viewed in the UI.\n",
"\n",
"![Bedrock Integration](https://raw.githubusercontent.com/comet-ml/opik/main/apps/opik-documentation/documentation/static/img/cookbook/bedrock_trace_cookbook.png)"
]
},
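{
"cell_type": "markdown",
"metadata": {},
"source": [
"Opik sends traces to the backend in the background. In a notebook this usually just works, but if you run this code as a short-lived script you may want to flush pending data before the process exits. A minimal sketch, assuming `opik.flush_tracker` is available in your SDK version:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import opik\n",
"\n",
"# Ensure all pending traces have been sent before the process exits\n",
"# (mostly useful in short-lived scripts rather than notebooks).\n",
"opik.flush_tracker()"
]
},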
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Logging traces with streaming"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def stream_conversation(\n",
" bedrock_client,\n",
" model_id,\n",
" messages,\n",
" system_prompts,\n",
" inference_config,\n",
"):\n",
" \"\"\"\n",
" Sends messages to a model and streams the response.\n",
" Args:\n",
" bedrock_client: The Boto3 Bedrock runtime client.\n",
" model_id (str): The model ID to use.\n",
" messages (JSON) : The messages to send.\n",
" system_prompts (JSON) : The system prompts to send.\n",
" inference_config (JSON) : The inference configuration to use.\n",
" additional_model_fields (JSON) : Additional model fields to use.\n",
"\n",
" Returns:\n",
" Nothing.\n",
"\n",
" \"\"\"\n",
"\n",
" response = bedrock_client.converse_stream(\n",
" modelId=model_id,\n",
" messages=messages,\n",
" system=system_prompts,\n",
" inferenceConfig=inference_config,\n",
" )\n",
"\n",
" stream = response.get(\"stream\")\n",
" if stream:\n",
" for event in stream:\n",
" if \"messageStart\" in event:\n",
" print(f\"\\nRole: {event['messageStart']['role']}\")\n",
"\n",
" if \"contentBlockDelta\" in event:\n",
" print(event[\"contentBlockDelta\"][\"delta\"][\"text\"], end=\"\")\n",
"\n",
" if \"messageStop\" in event:\n",
" print(f\"\\nStop reason: {event['messageStop']['stopReason']}\")\n",
"\n",
" if \"metadata\" in event:\n",
" metadata = event[\"metadata\"]\n",
" if \"usage\" in metadata:\n",
" print(\"\\nToken usage\")\n",
" print(f\"Input tokens: {metadata['usage']['inputTokens']}\")\n",
" print(f\":Output tokens: {metadata['usage']['outputTokens']}\")\n",
" print(f\":Total tokens: {metadata['usage']['totalTokens']}\")\n",
" if \"metrics\" in event[\"metadata\"]:\n",
" print(f\"Latency: {metadata['metrics']['latencyMs']} milliseconds\")\n",
"\n",
"\n",
"system_prompt = \"\"\"You are an app that creates playlists for a radio station\n",
" that plays rock and pop music. Only return song names and the artist.\"\"\"\n",
"\n",
"# Message to send to the model.\n",
"input_text = \"Create a list of 3 pop songs.\"\n",
"\n",
"\n",
"message = {\"role\": \"user\", \"content\": [{\"text\": input_text}]}\n",
"messages = [message]\n",
"\n",
"# System prompts.\n",
"system_prompts = [{\"text\": system_prompt}]\n",
"\n",
"# inference parameters to use.\n",
"temperature = 0.5\n",
"top_p = 0.9\n",
"# Base inference parameters.\n",
"inference_config = {\"temperature\": temperature, \"topP\": 0.9}\n",
"\n",
"\n",
"stream_conversation(\n",
" bedrock_client,\n",
" MODEL_ID,\n",
" messages,\n",
" system_prompts,\n",
" inference_config,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Bedrock Integration](https://raw.githubusercontent.com/comet-ml/opik/main/apps/opik-documentation/documentation/static/img/cookbook/bedrock_trace_streaming_cookbook.png)"
]
},
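{
"cell_type": "markdown",
"metadata": {},
"source": [
"The helper above prints each delta as it arrives. If you would rather collect the streamed text into a single string for use in a later step, a minimal variant of the same `converse_stream` call looks like this (a sketch reusing the client, model, and inputs defined above):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def collect_streamed_text(\n",
"    bedrock_client,\n",
"    model_id,\n",
"    messages,\n",
"    system_prompts,\n",
"    inference_config,\n",
"):\n",
"    \"\"\"Accumulate contentBlockDelta chunks from converse_stream into one string.\"\"\"\n",
"    response = bedrock_client.converse_stream(\n",
"        modelId=model_id,\n",
"        messages=messages,\n",
"        system=system_prompts,\n",
"        inferenceConfig=inference_config,\n",
"    )\n",
"    chunks = []\n",
"    for event in response[\"stream\"]:\n",
"        if \"contentBlockDelta\" in event:\n",
"            chunks.append(event[\"contentBlockDelta\"][\"delta\"][\"text\"])\n",
"    return \"\".join(chunks)\n",
"\n",
"\n",
"full_text = collect_streamed_text(\n",
"    bedrock_client, MODEL_ID, messages, system_prompts, inference_config\n",
")\n",
"print(full_text)"
]
},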
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Using it with the `track` decorator\n",
"\n",
"If you have multiple steps in your LLM pipeline, you can use the `track` decorator to log the traces for each step. If Bedrock is called within one of these steps, the LLM call with be associated with that corresponding step:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from opik import track\n",
"from opik.integrations.bedrock import track_bedrock\n",
"\n",
"bedrock = boto3.client(\n",
" service_name=\"bedrock-runtime\",\n",
" region_name=REGION,\n",
" # aws_access_key_id=ACCESS_KEY,\n",
" # aws_secret_access_key=SECRET_KEY,\n",
" # aws_session_token=SESSION_TOKEN,\n",
")\n",
"\n",
"os.environ[\"OPIK_PROJECT_NAME\"] = \"bedrock-integration-demo\"\n",
"bedrock_client = track_bedrock(bedrock)\n",
"\n",
"\n",
"@track\n",
"def generate_story(prompt):\n",
" res = bedrock_client.converse(\n",
" modelId=MODEL_ID, messages=[{\"role\": \"user\", \"content\": [{\"text\": prompt}]}]\n",
" )\n",
" return res[\"output\"][\"message\"][\"content\"][0][\"text\"]\n",
"\n",
"\n",
"@track\n",
"def generate_topic():\n",
" prompt = \"Generate a topic for a story about Opik.\"\n",
" res = bedrock_client.converse(\n",
" modelId=MODEL_ID, messages=[{\"role\": \"user\", \"content\": [{\"text\": prompt}]}]\n",
" )\n",
" return res[\"output\"][\"message\"][\"content\"][0][\"text\"]\n",
"\n",
"\n",
"@track\n",
"def generate_opik_story():\n",
" topic = generate_topic()\n",
" story = generate_story(topic)\n",
" return story\n",
"\n",
"\n",
"generate_opik_story()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The trace can now be viewed in the UI:\n",
"\n",
"![Bedrock Integration](https://raw.githubusercontent.com/comet-ml/opik/main/apps/opik-documentation/documentation/static/img/cookbook/bedrock_trace_decorator_cookbook.png)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
apps/opik-documentation/documentation/docs/cookbook/openai.ipynb
@@ -26,7 +26,7 @@
"metadata": {},
"outputs": [],
"source": [
"%pip install --upgrade --quiet opik openai"
"%pip install --upgrade opik openai"
]
},
{
@@ -81,10 +81,17 @@
"from openai import OpenAI\n",
"\n",
"os.environ[\"OPIK_PROJECT_NAME\"] = \"openai-integration-demo\"\n",
"client = OpenAI()\n",
"\n",
"openai_client = track_openai(client)\n",
"\n",
"client = OpenAI()\n",
"openai_client = track_openai(client)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"prompt = \"\"\"\n",
"Write a short two sentence story about Opik.\n",
"\"\"\"\n",