Merge pull request #63 from CambioML/install
Add examples to huggingface models
goldmermaid authored Sep 20, 2023
2 parents abc7db8 + ca7d9ee commit 471f0f8
Showing 9 changed files with 122 additions and 49 deletions.
9 changes: 8 additions & 1 deletion example/chatbot/demo_launch_app_cpu_openai.ipynb
Original file line number Diff line number Diff line change
@@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Launch a LLM UI and a Database via OpenAI\n",
"# Launch a Chatbot UI (with Database) from an OpenAI model\n",
"\n",
"`pykoi` provides a simple UI to launch a chatbot based on your LLMs, including your own finetuned LLM, a pretrained LLM from Huggingface, or the OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM chatbot UI and database for the OpenAI/Anthropic/Bedrock APIs. Let's get started!\n",
"\n",
@@ -35,6 +35,13 @@
"# sys.path.append(\"../..\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Import Libraries"
]
},
{
"cell_type": "code",
"execution_count": null,
2 changes: 1 addition & 1 deletion example/chatbot/demo_launch_app_cpu_openai.py
@@ -1,5 +1,5 @@
"""
Demo for the chatbot application using OpenAI endpoint.
Demo for launching a chatbot UI (with database) from an OpenAI model.
- Prerequisites:
To run this demo, you need a `pykoi` environment with the `rag` option.
40 changes: 29 additions & 11 deletions example/chatbot/demo_launch_app_gpu_huggingface.ipynb
@@ -1,5 +1,24 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Launch a Chatbot UI (with Database) from Open-source LLMs\n",
"\n",
"`pykoi` provides a simple UI to launch a chatbot based on your LLMs, including your own finetuned LLM, a pretrained LLM from Huggingface, or the OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM chatbot UI and database from open-source LLMs on Huggingface. Let's get started!\n",
"\n",
"\n",
"### Prerequisites\n",
"To run this jupyter notebook, you need a `pykoi` environment with the `huggingface` option. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu) to set up the environment. \n",
"\n",
"You may also need to `pip install ipykernel` to run the notebook kernel.\n",
"\n",
"\n",
"### (Optional) Developer setup\n",
"If you are a normal user of `pykoi`, you can skip this step. However, if you modify the pykoi code and want to test your changes, you can uncomment the code below."
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -9,19 +28,18 @@
"# %reload_ext autoreload\n",
"# %autoreload 2\n",
"\n",
"# import os\n",
"# import sys\n",
"\n",
"# # Add the root folder to the module search path\n",
"# # Get the current directory\n",
"# current_directory = os.getcwd()\n",
"\n",
"# # Move two levels up (go to the parent directory of the parent directory)\n",
"# two_levels_up_directory = os.path.dirname(os.path.dirname(current_directory))\n",
"\n",
"# print(two_levels_up_directory)\n",
"\n",
"# sys.path.append(two_levels_up_directory)"
"# sys.path.append(\".\")\n",
"# sys.path.append(\"..\")\n",
"# sys.path.append(\"../..\")"
]
},
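The developer-setup cell above replaces the explicit "two levels up" computation with relative `sys.path.append` calls. A minimal stdlib sketch (names here are illustrative, not part of `pykoi`) showing that the two spellings are equivalent only relative to the kernel's current working directory:

```python
import os
import sys

# Relative entries like ".", "..", "../.." are resolved against the
# kernel's current working directory, so they only reach the repository
# root when the notebook is started from example/chatbot.
cwd = os.getcwd()

# The replaced cell computed the same target explicitly:
two_levels_up = os.path.dirname(os.path.dirname(cwd))

# Both spellings point at the same directory:
print(os.path.abspath("../..") == two_levels_up)  # → True

# An absolute path is unambiguous regardless of where the kernel starts:
sys.path.append(two_levels_up)
```

This is why the relative form is simpler but slightly less robust: it silently appends the wrong directories if the notebook is launched from elsewhere.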
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Import Libraries"
]
},
{
@@ -1,4 +1,16 @@
"""Demo for the chatbot application."""
"""
Demo for the chatbot application using open source LLMs from Huggingface.
- Prerequisites:
To run this demo, you need a `pykoi` environment with the `huggingface` option.
You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu)
to set up the environment.
- Run the demo:
1. In a terminal, from the `~/pykoi` directory, run
```
python -m example.chatbot.demo_launch_app_gpu_huggingface
```
"""
from pykoi import Application
from pykoi.chat import ModelFactory
from pykoi.chat import QuestionAnswerDatabase
@@ -1,4 +1,17 @@
"""Demo for the chatbot application."""
"""
Demo for the chatbot application using open source LLMs from Huggingface Peft.
- Prerequisites:
To run this demo, you need a `pykoi` environment with the `huggingface` option.
You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu)
to set up the environment.
- Run the demo:
1. In a terminal, from the `~/pykoi` directory, run
```
python -m example.chatbot.demo_launch_app_gpu_huggingface_peft
```
"""

from pykoi import Application
from pykoi.chat import ModelFactory
from pykoi.chat import QuestionAnswerDatabase
@@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Demo: LLMs Comparison via OpenAI\n",
"# Demo: LLMs Comparison between OpenAI models\n",
"\n",
"`pykoi` provides a simple API to compare LLMs, including your own finetuned LLM, a pretrained LLM from Huggingface, or the OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM comparison app for the OpenAI/Anthropic/Bedrock APIs. Let's get started!\n",
"\n",
@@ -34,6 +34,13 @@
"# sys.path.append(\"../..\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Import Libraries"
]
},
{
"cell_type": "code",
"execution_count": null,
Expand Up @@ -9,7 +9,7 @@
1. Enter your OpenAI API key in the `api_key` below.
2. In a terminal, from the `~/pykoi` directory, run
```
python -m example.chatbot.demo_model_comparator_cpu_openai
python -m example.comparator.demo_model_comparator_cpu_openai
```
"""

@@ -1,5 +1,23 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Demo: LLMs Comparison between Open-source LLMs\n",
"\n",
"`pykoi` provides a simple API to compare LLMs, including your own finetuned LLM, a pretrained LLM from Huggingface, or the OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM comparison app between open-source LLMs from Huggingface. Let's get started!\n",
"\n",
"### Prerequisites\n",
"To run this jupyter notebook, you need a `pykoi` environment with the `huggingface` option. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu) to set up the environment. \n",
"\n",
"You may also need to `pip install ipykernel` to run the notebook kernel.\n",
"\n",
"\n",
"### (Optional) Developer setup\n",
"If you are a normal user of `pykoi`, you can skip this step. However, if you modify the pykoi code and want to test your changes, you can uncomment the code below."
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -9,19 +27,18 @@
"# %reload_ext autoreload\n",
"# %autoreload 2\n",
"\n",
"# import os\n",
"# import sys\n",
"\n",
"# # Add the root folder to the module search path\n",
"# # Get the current directory\n",
"# current_directory = os.getcwd()\n",
"\n",
"# # Move two levels up (go to the parent directory of the parent directory)\n",
"# two_levels_up_directory = os.path.dirname(os.path.dirname(current_directory))\n",
"\n",
"# print(two_levels_up_directory)\n",
"\n",
"# sys.path.append(two_levels_up_directory)"
"# sys.path.append(\".\")\n",
"# sys.path.append(\"..\")\n",
"# sys.path.append(\"../..\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Import Libraries"
]
},
{
@@ -42,21 +59,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Demo: LLMs Comparison\n",
"\n",
"`pykoi` provides simple API to compare between LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs.\n",
"\n",
"This demo shows how to create and launch an LLM comparison app. Let's get started!\n",
"\n",
"## Load LLMs\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Creating a Huggingface model (requires at least EC2 `g4.2xlarge` or GPU with 16G memory)"
"### Load LLMs\n",
"#### Creating a Huggingface model (requires at least EC2 `g4dn.xlarge` or GPU with at least 16G memory)"
]
},
{
@@ -65,7 +69,7 @@
"metadata": {},
"outputs": [],
"source": [
"## requires a GPU with at least 16GB memory (e.g. g4.2xlarge)\n",
"## requires a GPU with at least 16GB memory (e.g. g4dn.xlarge)\n",
"huggingface_model_1 = ModelFactory.create_model(\n",
" model_source=\"huggingface\",\n",
" pretrained_model_name_or_path=\"tiiuae/falcon-rw-1b\",\n",
@@ -78,7 +82,7 @@
"metadata": {},
"outputs": [],
"source": [
"## requires a GPU with at least 16GB memory (e.g. g4.2xlarge)\n",
"## requires a GPU with at least 16GB memory (e.g. g4dn.2xlarge)\n",
"huggingface_model_2 = ModelFactory.create_model(\n",
" model_source=\"huggingface\",\n",
" pretrained_model_name_or_path=\"databricks/dolly-v2-3b\",\n",
@@ -102,9 +106,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create a chatbot comparator\n",
"### Create a chatbot comparator\n",
"\n",
"### Add `nest_asyncio` \n",
"#### Add `nest_asyncio` \n",
"Add `nest_asyncio` to avoid an error: we are launching another server inside a Jupyter notebook, where an asyncio event loop is already running, and `uvicorn.run()` calls `asyncio.run()`, which isn't compatible with an already-running event loop."
]
},
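The failure `nest_asyncio` works around can be reproduced with the standard library alone: calling `asyncio.run()` while an event loop is already running raises a `RuntimeError`. A minimal sketch (the coroutine names are illustrative):

```python
import asyncio

async def serve():
    return "served"

async def notebook_cell():
    # Jupyter already runs an event loop; calling asyncio.run() here,
    # as uvicorn.run() does internally, fails immediately.
    coro = serve()
    try:
        asyncio.run(coro)
    except RuntimeError as err:
        coro.close()  # avoid an "un-awaited coroutine" warning
        return str(err)

msg = asyncio.run(notebook_cell())
print(msg)  # asyncio.run() cannot be called from a running event loop
```

`nest_asyncio.apply()` patches the loop so that such nested `asyncio.run()` calls succeed instead of raising.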
@@ -134,7 +138,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Add ngrok auth (TODO: change to bash file)"
"#### Add ngrok auth (TODO: change to bash file)"
]
},
{
@@ -1,4 +1,16 @@
"""Demo for the chatbot application using multiple model endpoint."""
"""
Demo for the chatbot application using multiple open source LLMs from Huggingface.
- Prerequisites:
To run this demo, you need a `pykoi` environment with the `huggingface` option.
You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu)
to set up the environment.
- Run the demo:
1. In a terminal, from the `~/pykoi` directory, run
```
python -m example.comparator.demo_model_comparator_gpu_huggingface
```
"""
from pykoi import Application
from pykoi.chat import ModelFactory
from pykoi.component import Compare
