From 98fbb5d5142e228aeaf84de5422d6945b4fce4ce Mon Sep 17 00:00:00 2001
From: Rachel Hu
Date: Tue, 19 Sep 2023 15:54:45 -0700
Subject: [PATCH 1/5] readme multiple installation options
---
README.md | 75 ++++++++++++++++++++++++++++++++++++++++++-------------
1 file changed, 58 insertions(+), 17 deletions(-)
diff --git a/README.md b/README.md
index f3b7a7d..f81e069 100644
--- a/README.md
+++ b/README.md
@@ -42,40 +42,50 @@ Reinforcement Learning with Human Feedback (RLHF) is a unique training paradigm
## Installation
-To get started with `pykoi`, you can choose to one of following compute options: CPU (e.g. your laptop) or GPU (e.g. EC2).
+To get started with pykoi, choose from the installation options below based on the features you need (e.g., RAG, RLHF, or all of them) and the compute resources you have, such as a CPU (e.g., your laptop) or a GPU (e.g., AWS EC2 or SageMaker).
-### Option 1: CPU (e.g. your laptop)
-Installation on a CPU is simple if you have conda. If not, install [conda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html) for your operating system.
+### Option 1: RAG (CPU)
+This option allows you to run RAG on a CPU using either the OpenAI API or the Anthropic Claude 2 API. Installation is simple if you have conda. If not, install [conda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html) for your operating system.
First, create a conda environment on your terminal using:
```
conda create -n pykoi python=3.10 -y
-conda activate pykoi
+conda activate pykoi  # some OSes require `source activate pykoi`
```
Then install `pykoi` and the compatible [pytorch based on your OS](https://pytorch.org/get-started):
```
-pip3 install pykoi
+pip3 install "pykoi[rag]"
pip3 install torch
```
-### Option 2: GPU (e.g. EC2 or SageMaker)
+### Option 2: RAG (GPU)
+This option allows you to run RAG on a GPU using an open-source LLM from HuggingFace. Here's a quick [tutorial](#ec2-dev-setup) on setting up an EC2 GPU instance for the installation below.
-If you are on EC2, you can launch a GPU instance with the following config:
-- EC2 `g4dn.xlarge` (if you want to run a pretrained LLM with 7B parameters)
-- Deep Learning AMI PyTorch GPU 2.0.1 (Ubuntu 20.04)
-
-- EBS: at least 100G
-
+On your GPU instance terminal, create a conda environment using:
+```
+conda create -n pykoi python=3.10 -y && source activate pykoi
+```
+
+Then install `pykoi` and [pytorch based on your cuda version](https://pytorch.org/get-started). You can find your CUDA version via `nvcc -V`.
+```
+pip3 install "pykoi[huggingface]"
+
+# install torch based on cuda (e.g. cu118 means cuda 11.8)
+pip3 install torch --index-url https://download.pytorch.org/whl/cu118
+```
-Next, on your GPU instance terminal, create a conda environment using:
+### Option 3: RLHF (GPU)
+This option allows you to train LLM via RLHF on a GPU. Here's a quick [tutorial](#ec2-dev-setup) on setting up an EC2 GPU instance for the installation below.
+
+On your GPU instance terminal, create a conda environment using:
```
conda create -n pykoi python=3.10 -y && source activate pykoi
```
-Then install `pykoi` and [pytorch based on your cuda version](https://pytorch.org/get-started).
+Then install `pykoi` and [pytorch based on your cuda version](https://pytorch.org/get-started). You can find your CUDA version via `nvcc -V`.
```
-pip3 install pykoi
+pip3 install "pykoi[rlhf]"
# install torch based on cuda (e.g. cu118 means cuda 11.8)
pip3 install torch --index-url https://download.pytorch.org/whl/cu118
@@ -92,13 +102,44 @@ conda create -n pykoi python=3.10
conda activate pykoi
cd pykoi
pip3 install poetry
-poetry install --no-root
```
+Then, based on the features you need to develop, run one or more of the installation options below. We recommend installing all of them, although it may take ~3 minutes longer.
+
+- Option 1: RAG (CPU)
+ ```
+ poetry install --no-root --extras rag
+ ```
+- Option 2: RAG (GPU)
+ ```
+ poetry install --no-root --extras huggingface
+ ```
+- Option 3: RLHF (GPU)
+ ```
+ poetry install --no-root --extras rlhf
+ ```
+
+Finally, if you are on a GPU, install `pykoi` and [pytorch based on your cuda version](https://pytorch.org/get-started). You can find your CUDA version via `nvcc -V`.
+```
+pip3 install "pykoi[huggingface]"
+
+# install torch based on cuda (e.g. cu118 means cuda 11.8)
+pip3 install torch --index-url https://download.pytorch.org/whl/cu118
+```
+
+
### Frontend Dev Setup
-Frontend:
```
cd pykoi/pykoi/frontend
npm install
npm run build
```
+
+### EC2 Dev Setup
+If you are on EC2, you can launch a GPU instance with the following config:
+- EC2 `g4dn.xlarge` (if you want to run a pretrained LLM with 7B parameters)
+- Deep Learning AMI PyTorch GPU 2.0.1 (Ubuntu 20.04)
+- EBS: at least 100 GB
+
+
From b60d2ff1f80f2a26e3067ff8b79c58bf3f65f634 Mon Sep 17 00:00:00 2001
From: Rachel Hu
Date: Tue, 19 Sep 2023 16:19:19 -0700
Subject: [PATCH 2/5] add running cmd
---
.../chatbot/demo_launch_app_cpu_openai.ipynb | 34 +++++++-----
..._demo.py => demo_launch_app_cpu_openai.py} | 13 ++++-
.../demo_model_comparator_cpu_openai.ipynb | 52 +++++++++++--------
.../demo_model_comparator_cpu_openai.py | 13 ++++-
4 files changed, 75 insertions(+), 37 deletions(-)
rename example/chatbot/{openai_model_demo.py => demo_launch_app_cpu_openai.py} (71%)
diff --git a/example/chatbot/demo_launch_app_cpu_openai.ipynb b/example/chatbot/demo_launch_app_cpu_openai.ipynb
index 509f776..cdae33b 100644
--- a/example/chatbot/demo_launch_app_cpu_openai.ipynb
+++ b/example/chatbot/demo_launch_app_cpu_openai.ipynb
@@ -1,5 +1,24 @@
{
"cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Launch an LLM UI and a Database via OpenAI\n",
+ "\n",
+ "`pykoi` provides a simple way to launch a chatbot UI backed by your LLMs, including your own finetuned LLM, a pretrained LLM from HuggingFace, or the OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM chatbot UI and database for the OpenAI/Anthropic/Bedrock APIs. Let's get started!\n",
+ "\n",
+ "\n",
+ "### Prerequisites\n",
+ "To run this jupyter notebook, you need an `rag` environment. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu) to set up the environment. \n",
+ "\n",
+ "You may also need to run `pip install ipykernel` to use this notebook's kernel.\n",
+ "\n",
+ "\n",
+ "### (Optional) Developer setup\n",
+ "If you are simply using `pykoi`, you can skip this step. However, if you have modified the pykoi code and want to test your changes, you can uncomment the code below."
+ ]
+ },
{
"cell_type": "code",
"execution_count": null,
@@ -9,19 +28,11 @@
"# %reload_ext autoreload\n",
"# %autoreload 2\n",
"\n",
- "# import os\n",
"# import sys\n",
"\n",
- "# # Add the root folder to the module search path\n",
- "# # Get the current directory\n",
- "# current_directory = os.getcwd()\n",
- "\n",
- "# # Move two levels up (go to the parent directory of the parent directory)\n",
- "# two_levels_up_directory = os.path.dirname(os.path.dirname(current_directory))\n",
- "\n",
- "# print(two_levels_up_directory)\n",
- "\n",
- "# sys.path.append(two_levels_up_directory)"
+ "# sys.path.append(\".\")\n",
+ "# sys.path.append(\"..\")\n",
+ "# sys.path.append(\"../..\")"
]
},
{
@@ -30,7 +41,6 @@
"metadata": {},
"outputs": [],
"source": [
- "## pip install ipykernel\n",
"from pykoi import Application\n",
"from pykoi.chat import ModelFactory\n",
"from pykoi.chat import QuestionAnswerDatabase\n",
diff --git a/example/chatbot/openai_model_demo.py b/example/chatbot/demo_launch_app_cpu_openai.py
similarity index 71%
rename from example/chatbot/openai_model_demo.py
rename to example/chatbot/demo_launch_app_cpu_openai.py
index a9f6956..cc516a3 100644
--- a/example/chatbot/openai_model_demo.py
+++ b/example/chatbot/demo_launch_app_cpu_openai.py
@@ -1,4 +1,15 @@
-"""Demo for the chatbot application using OpenAI endpoint."""
+"""
+Demo for the chatbot application using the OpenAI endpoint.
+
+- Prerequisites:
+ To run this jupyter notebook, you need an `rag` environment.
+ You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu)
+ to set up the environment.
+- Run the demo:
+ 1. Enter your OpenAI API key in the `api_key` below.
+ 2. On terminal and `~/pykoi` directory, run `python -m example.chatbot.demo_launch_app_cpu_openai`
+"""
+
from pykoi import Application
from pykoi.chat import ModelFactory
from pykoi.chat import QuestionAnswerDatabase
diff --git a/example/chatbot/demo_model_comparator_cpu_openai.ipynb b/example/chatbot/demo_model_comparator_cpu_openai.ipynb
index 19a84d1..ed1031c 100644
--- a/example/chatbot/demo_model_comparator_cpu_openai.ipynb
+++ b/example/chatbot/demo_model_comparator_cpu_openai.ipynb
@@ -1,5 +1,23 @@
{
"cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Demo: LLM Comparison via OpenAI\n",
+ "\n",
+ "`pykoi` provides a simple API to compare LLMs, including your own finetuned LLM, a pretrained LLM from HuggingFace, or the OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM comparison app for the OpenAI/Anthropic/Bedrock APIs. Let's get started!\n",
+ "\n",
+ "### Prerequisites\n",
+ "To run this jupyter notebook, you need an `rag` environment. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu) to set up the environment. \n",
+ "\n",
+ "You may also need to run `pip install ipykernel` to use this notebook's kernel.\n",
+ "\n",
+ "\n",
+ "### (Optional) Developer setup\n",
+ "If you are simply using `pykoi`, you can skip this step. However, if you have modified the pykoi code and want to test your changes, you can uncomment the code below."
+ ]
+ },
{
"cell_type": "code",
"execution_count": null,
@@ -9,19 +27,11 @@
"# %reload_ext autoreload\n",
"# %autoreload 2\n",
"\n",
- "# import os\n",
"# import sys\n",
"\n",
- "# # Add the root folder to the module search path\n",
- "# # Get the current directory\n",
- "# current_directory = os.getcwd()\n",
- "\n",
- "# # Move two levels up (go to the parent directory of the parent directory)\n",
- "# two_levels_up_directory = os.path.dirname(os.path.dirname(current_directory))\n",
- "\n",
- "# print(two_levels_up_directory)\n",
- "\n",
- "# sys.path.append(two_levels_up_directory)"
+ "# sys.path.append(\".\")\n",
+ "# sys.path.append(\"..\")\n",
+ "# sys.path.append(\"../..\")"
]
},
{
@@ -39,13 +49,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "# Demo: LLMs Comparison\n",
"\n",
- "`pykoi` provides simple API to compare between LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs.\n",
"\n",
- "This demo shows how to create and launch an LLM comparison app. Let's get started!\n",
- "\n",
- "## Load LLMs\n",
+ "### Load LLMs\n",
"\n",
"#### 1. Creating an OpenAI model (requires an OpenAI API key)"
]
@@ -108,6 +114,13 @@
"nest_asyncio.apply()"
]
},
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Launch the App"
+ ]
+ },
{
"cell_type": "code",
"execution_count": null,
@@ -137,13 +150,6 @@
" \n",
"
"
]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
}
],
"metadata": {
diff --git a/example/chatbot/demo_model_comparator_cpu_openai.py b/example/chatbot/demo_model_comparator_cpu_openai.py
index b5c45f9..3a2f96c 100644
--- a/example/chatbot/demo_model_comparator_cpu_openai.py
+++ b/example/chatbot/demo_model_comparator_cpu_openai.py
@@ -1,4 +1,15 @@
-"""Demo for the chatbot application using multiple model endpoint."""
+"""
+Demo for the chatbot application using multiple OpenAI models.
+
+- Prerequisites:
+ To run this jupyter notebook, you need an `rag` environment.
+ You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu)
+ to set up the environment.
+- Run the demo:
+ 1. Enter your OpenAI API key in the `api_key` below.
+ 2. On terminal and `~/pykoi` directory, run `python -m example.chatbot.demo_model_comparator_cpu_openai`
+"""
+
from pykoi import Application
from pykoi.chat import ModelFactory
from pykoi.component import Compare
From f83491d462d76738e8a197b0ddccc9aeb8b28178 Mon Sep 17 00:00:00 2001
From: Rachel Hu
Date: Tue, 19 Sep 2023 16:24:47 -0700
Subject: [PATCH 3/5] add running cmd
---
example/chatbot/demo_launch_app_cpu_openai.py | 5 ++++-
example/chatbot/demo_model_comparator_cpu_openai.py | 5 ++++-
2 files changed, 8 insertions(+), 2 deletions(-)
diff --git a/example/chatbot/demo_launch_app_cpu_openai.py b/example/chatbot/demo_launch_app_cpu_openai.py
index cc516a3..fe788eb 100644
--- a/example/chatbot/demo_launch_app_cpu_openai.py
+++ b/example/chatbot/demo_launch_app_cpu_openai.py
@@ -7,7 +7,10 @@
to set up the environment.
- Run the demo:
1. Enter your OpenAI API key in the `api_key` below.
- 2. On terminal and `~/pykoi` directory, run `python -m example.chatbot.demo_launch_app_cpu_openai`
+ 2. In a terminal, from the `~/pykoi` directory, run
+ ```
+ python -m example.chatbot.demo_launch_app_cpu_openai
+ ```
"""
from pykoi import Application
diff --git a/example/chatbot/demo_model_comparator_cpu_openai.py b/example/chatbot/demo_model_comparator_cpu_openai.py
index 3a2f96c..fef07e0 100644
--- a/example/chatbot/demo_model_comparator_cpu_openai.py
+++ b/example/chatbot/demo_model_comparator_cpu_openai.py
@@ -7,7 +7,10 @@
to set up the environment.
- Run the demo:
1. Enter your OpenAI API key in the `api_key` below.
- 2. On terminal and `~/pykoi` directory, run `python -m example.chatbot.demo_model_comparator_cpu_openai`
+ 2. In a terminal, from the `~/pykoi` directory, run
+ ```
+ python -m example.chatbot.demo_model_comparator_cpu_openai
+ ```
"""
from pykoi import Application
From b681d213f72b140e38bf12ae7eb1bc2e1b7ce69c Mon Sep 17 00:00:00 2001
From: Rachel Hu
Date: Tue, 19 Sep 2023 16:25:57 -0700
Subject: [PATCH 4/5] add running cmd
---
example/chatbot/demo_launch_app_cpu_openai.py | 2 +-
example/chatbot/demo_model_comparator_cpu_openai.py | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/example/chatbot/demo_launch_app_cpu_openai.py b/example/chatbot/demo_launch_app_cpu_openai.py
index fe788eb..8f4e297 100644
--- a/example/chatbot/demo_launch_app_cpu_openai.py
+++ b/example/chatbot/demo_launch_app_cpu_openai.py
@@ -2,7 +2,7 @@
Demo for the chatbot application using the OpenAI endpoint.
- Prerequisites:
- To run this jupyter notebook, you need an `rag` environment.
+ To run this demo, you need a `pykoi` environment with the `rag` option.
You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu)
to set up the environment.
- Run the demo:
diff --git a/example/chatbot/demo_model_comparator_cpu_openai.py b/example/chatbot/demo_model_comparator_cpu_openai.py
index fef07e0..c11c080 100644
--- a/example/chatbot/demo_model_comparator_cpu_openai.py
+++ b/example/chatbot/demo_model_comparator_cpu_openai.py
@@ -2,7 +2,7 @@
Demo for the chatbot application using multiple OpenAI models.
- Prerequisites:
- To run this jupyter notebook, you need an `rag` environment.
+ To run this demo, you need a `pykoi` environment with the `rag` option.
You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu)
to set up the environment.
- Run the demo:
From dbb352544992e231249e2dddd5a97c1ee31132c9 Mon Sep 17 00:00:00 2001
From: Rachel Hu
Date: Tue, 19 Sep 2023 16:29:26 -0700
Subject: [PATCH 5/5] add running cmd
---
example/chatbot/demo_launch_app_cpu_openai.ipynb | 2 +-
example/chatbot/demo_model_comparator_cpu_openai.ipynb | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/example/chatbot/demo_launch_app_cpu_openai.ipynb b/example/chatbot/demo_launch_app_cpu_openai.ipynb
index cdae33b..2e3d245 100644
--- a/example/chatbot/demo_launch_app_cpu_openai.ipynb
+++ b/example/chatbot/demo_launch_app_cpu_openai.ipynb
@@ -10,7 +10,7 @@
"\n",
"\n",
"### Prerequisites\n",
- "To run this jupyter notebook, you need an `rag` environment. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu) to set up the environment. \n",
+ "To run this jupyter notebook, you need a `pykoi` environment with the `rag` option. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu) to set up the environment. \n",
"\n",
"You may also need to run `pip install ipykernel` to use this notebook's kernel.\n",
"\n",
diff --git a/example/chatbot/demo_model_comparator_cpu_openai.ipynb b/example/chatbot/demo_model_comparator_cpu_openai.ipynb
index ed1031c..80a4ff9 100644
--- a/example/chatbot/demo_model_comparator_cpu_openai.ipynb
+++ b/example/chatbot/demo_model_comparator_cpu_openai.ipynb
@@ -9,7 +9,7 @@
"`pykoi` provides simple API to compare between LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM comparison app for OpenAI/Anthropic/Bedrock APIs. Let's get started!\n",
"\n",
"### Prerequisites\n",
- "To run this jupyter notebook, you need an `rag` environment. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu) to set up the environment. \n",
+ "To run this jupyter notebook, you need a `pykoi` environment with the `rag` option. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-1-rag-cpu) to set up the environment. \n",
"\n",
"You may also need to run `pip install ipykernel` to use this notebook's kernel.\n",
"\n",