Commit

Merge branch 'master' into nerf_integration
Lothiraldan committed Jan 31, 2024
2 parents 235ef19 + 2180107 commit 018f77e
Showing 3 changed files with 45 additions and 51 deletions.
88 changes: 42 additions & 46 deletions integrations/llm/langchain/notebooks/Comet_with_Langchain.ipynb
@@ -18,13 +18,10 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"In this guide we will demonstrate how to track your Langchain Experiments, Evaluation Metrics, and LLM Sessions with [Comet](https://www.comet.com/site/?utm_source=langchain&utm_medium=referral&utm_campaign=comet_notebook). \n",
"In this guide we will demonstrate how to track your Langchain prompts, Chains, and Agents with [Comet](https://www.comet.com/site/?utm_source=langchain&utm_medium=referral&utm_campaign=comet_notebook).\n",
"\n",
"<a target=\"_blank\" href=\"https://colab.research.google.com/github/comet-ml/comet-examples/blob/master/integrations/llm/langchain/notebooks/Comet_with_Langchain.ipynb\">\n",
" <img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
"</a>\n",
"\n",
"**Example Project:** [Comet with LangChain](https://www.comet.com/examples/comet-example-langchain-notebook/prompts?utm_source=langchain&utm_medium=referral&utm_campaign=comet_notebook)"
"**Example Project:** [Comet with LangChain](https://www.comet.com/examples/comet-example-langchain-llm-notebook/prompts?utm_source=langchain&utm_medium=referral&utm_campaign=comet_notebook)"
]
},
{
@@ -47,7 +44,7 @@
"metadata": {},
"outputs": [],
"source": [
"%pip install -U comet_llm \"langchain>=0.0.346\" openai numexpr"
"%pip install -U comet_llm \"langchain>=0.1.3\" \"langchain-openai\" openai numexpr"
]
},
{
@@ -57,13 +54,6 @@
"### Initialize Comet and Set your Credentials"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can grab your [Comet API Key here](https://www.comet.com/signup?utm_source=langchain&utm_medium=referral&utm_campaign=comet_notebook) or click the link after intializing Comet"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -72,7 +62,7 @@
"source": [
"import comet_llm\n",
"\n",
"comet_llm.init(project=\"comet-example-langchain-notebook\")"
"comet_llm.init(project=\"comet-example-langchain-llm-notebook\")"
]
},
{
@@ -97,10 +87,32 @@
"source": [
"import os\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = \"...\"\n",
"# os.environ[\"OPENAI_API_KEY\"] = \"...\"\n",
"# os.environ[\"OPENAI_ORGANIZATION\"] = \"...\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Tracing with Comet\n",
"\n",
"There are two ways to trace your LangChains executions with Comet:\n",
"\n",
"1. Setting the `LANGCHAIN_COMET_TRACING` environment variable to \"true\". This is the recommended way.\n",
"2. Import the `CometTracer` manually and pass it explicitely."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"os.environ[\"LANGCHAIN_COMET_TRACING\"] = \"true\"\n",
"from langchain.callbacks.tracers.comet import CometTracer"
]
},
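For the second approach, here is a minimal sketch of passing the tracer explicitly. It is based on the callback usage shown in the lines removed further down in this diff (a `CometTracer()` passed via `callbacks=` and flushed afterwards); exact behaviour may vary across langchain versions:

    from langchain.callbacks.tracers.comet import CometTracer
    from langchain_openai import OpenAI

    tracer = CometTracer()
    llm = OpenAI(temperature=0.9)

    # Pass the tracer per call instead of relying on LANGCHAIN_COMET_TRACING.
    llm_result = llm.generate(["Tell me a joke"], callbacks=[tracer])
    print("LLM result", llm_result)

    tracer.flush()  # make sure any buffered traces reach Comet

Passing callbacks per call limits tracing to that invocation, whereas the environment variable enables it for every run.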
{
"cell_type": "markdown",
"metadata": {},
@@ -116,17 +128,13 @@
"source": [
"from datetime import datetime\n",
"\n",
"from langchain.callbacks.tracers.comet import CometTracer\n",
"from langchain.llms import OpenAI\n",
"from langchain_openai import OpenAI\n",
"\n",
"comet_callback = CometTracer()\n",
"llm = OpenAI(temperature=0.9, verbose=True)\n",
"llm_result = llm.generate(\n",
" [\"Tell me a joke\", \"Tell me a poem\", \"Tell me a fact\"] * 3,\n",
" callbacks=[comet_callback],\n",
")\n",
"print(\"LLM result\", llm_result)\n",
"comet_callback.flush()"
"\n",
"for prompt in [\"Tell me a joke\", \"Tell me a poem\", \"Tell me a fact\"]:\n",
" llm_result = llm.generate([prompt])\n",
" print(\"LLM result\", llm_result)"
]
},
{
@@ -142,25 +150,20 @@
"metadata": {},
"outputs": [],
"source": [
"from langchain.callbacks.tracers.comet import CometTracer\n",
"from langchain.chains import LLMChain\n",
"from langchain.llms import OpenAI\n",
"from langchain_openai import OpenAI\n",
"from langchain.prompts import PromptTemplate\n",
"\n",
"comet_callback = CometTracer()\n",
"callbacks = [comet_callback]\n",
"\n",
"llm = OpenAI(temperature=0.9, verbose=True)\n",
"\n",
"template = \"\"\"You are a playwright. Given the title of play, it is your job to write a synopsis for that title.\n",
"Title: {title}\n",
"Playwright: This is a synopsis for the above play:\"\"\"\n",
"prompt_template = PromptTemplate(input_variables=[\"title\"], template=template)\n",
"synopsis_chain = LLMChain(llm=llm, prompt=prompt_template, verbose=True)\n",
"synopsis_chain = LLMChain(llm=llm, prompt=prompt_template)\n",
"\n",
"test_prompts = [{\"title\": \"Documentary about Bigfoot in Paris\"}]\n",
"print(synopsis_chain.apply(test_prompts, callbacks=callbacks))\n",
"comet_callback.flush()"
"print(synopsis_chain.apply(test_prompts))"
]
},
{
@@ -176,27 +179,20 @@
"metadata": {},
"outputs": [],
"source": [
"from langchain.agents import AgentType, initialize_agent, load_tools\n",
"from langchain.callbacks.tracers.comet import CometTracer\n",
"from langchain.llms import OpenAI\n",
"from langchain.agents import initialize_agent, load_tools\n",
"from langchain_openai import OpenAI\n",
"\n",
"comet_callback = CometTracer()\n",
"callbacks = [comet_callback]\n",
"llm = OpenAI(temperature=0.9)\n",
"\n",
"llm = OpenAI(temperature=0.9, verbose=True)\n",
"\n",
"tools = load_tools([\"llm-math\"], llm=llm, verbose=True)\n",
"tools = load_tools([\"llm-math\"], llm=llm)\n",
"agent = initialize_agent(\n",
" tools,\n",
" llm,\n",
" AgentType.ZERO_SHOT_REACT_DESCRIPTION,\n",
" verbose=True,\n",
" agent=\"zero-shot-react-description\",\n",
")\n",
"agent.run(\n",
" \"What is 2 raised to .123243 power?\",\n",
" callbacks=callbacks,\n",
")\n",
"comet_callback.flush()"
")"
]
}
],
@@ -216,7 +212,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
"version": "3.10.12"
}
},
"nbformat": 4,
@@ -1,2 +1,2 @@
comet_ml
comet_ml>=3.37.0
fastai
@@ -43,7 +43,7 @@
},
"outputs": [],
"source": [
"%pip install -U fastai comet_ml"
"%pip install -U fastai comet_ml>=3.37.0"
]
},
{
@@ -204,9 +204,7 @@
"id": "q-LxaKnwmKd2"
},
"source": [
"## 5. Training\n",
"\n",
"In fastai, we can train differently depending on if we are running CPU or a GPU. To test, we can use the `data.device.type` property. This will create a fastai `Learner`:"
"## 5. Training"
]
},
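For reference, a minimal, hypothetical sketch of the CPU/GPU check described in the removed text above. It assumes a fastai `DataLoaders` object named `data` built earlier in the notebook and the `vision_learner` API; the notebook's actual training cell is not shown in this diff:

    from fastai.vision.all import vision_learner, resnet18, accuracy

    # Hypothetical: `data` is assumed to be a fastai DataLoaders created earlier.
    learn = vision_learner(data, resnet18, metrics=accuracy)

    if data.device.type == "cpu":
        learn.fit_one_cycle(1)  # keep the run short on CPU
    else:
        learn.fit_one_cycle(3)  # a few epochs are affordable on GPU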
{
