diff --git a/example/chatbot/demo_launch_app_cpu_openai.ipynb b/example/chatbot/demo_launch_app_cpu_openai.ipynb
index 2e3d245..65c1060 100644
--- a/example/chatbot/demo_launch_app_cpu_openai.ipynb
+++ b/example/chatbot/demo_launch_app_cpu_openai.ipynb
@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Launch a LLM UI and a Database via OpenAI\n",
+    "# Launch a Chatbot UI (with Database) from an OpenAI model\n",
     "\n",
     "`pykoi` provides simple UI to launch a chatbot UI based on your LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM chatbot UI and database for OpenAI/Anthropic/Bedrock APIs. Let's get started!\n",
     "\n",
@@ -35,6 +35,13 @@
     "# sys.path.append(\"../..\")"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Import Libraries"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
diff --git a/example/chatbot/demo_launch_app_cpu_openai.py b/example/chatbot/demo_launch_app_cpu_openai.py
index 8f4e297..4670ce2 100644
--- a/example/chatbot/demo_launch_app_cpu_openai.py
+++ b/example/chatbot/demo_launch_app_cpu_openai.py
@@ -1,5 +1,5 @@
 """
-Demo for the chatbot application using OpenAI endpoint.
+Demo for launching a chatbot UI (with database) from an OpenAI model.
 
 - Prerequisites:
     To run this jupyter notebook, you need a `pykoi` environment with the `rag` option.  
diff --git a/example/chatbot/demo_launch_app_gpu_huggingface.ipynb b/example/chatbot/demo_launch_app_gpu_huggingface.ipynb
index 35768ff..4b3246e 100644
--- a/example/chatbot/demo_launch_app_gpu_huggingface.ipynb
+++ b/example/chatbot/demo_launch_app_gpu_huggingface.ipynb
@@ -1,5 +1,24 @@
 {
  "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Launch a Chatbot UI (with Database) from Open-source LLMs\n",
+    "\n",
+    "`pykoi` provides simple UI to launch a chatbot UI based on your LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch a chatbot UI (with database) from an open-source LLM on Huggingface. Let's get started!\n",
+    "\n",
+    "\n",
+    "### Prerequisites\n",
+    "To run this jupyter notebook, you need a `pykoi` environment with the `huggingface` option. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu) to set up the environment. \n",
+    "\n",
+    "You may also need to `pip install ipykernel` to run the notebook kernel.\n",
+    "\n",
+    "\n",
+    "### (Optional) Developer setup\n",
+    "If you are a regular user of `pykoi`, you can skip this step. However, if you have modified the `pykoi` source code and want to test your changes, uncomment the code below."
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
@@ -9,19 +28,18 @@
     "# %reload_ext autoreload\n",
     "# %autoreload 2\n",
     "\n",
-    "# import os\n",
     "# import sys\n",
     "\n",
-    "# # Add the root folder to the module search path\n",
-    "# # Get the current directory\n",
-    "# current_directory = os.getcwd()\n",
-    "\n",
-    "# # Move two levels up (go to the parent directory of the parent directory)\n",
-    "# two_levels_up_directory = os.path.dirname(os.path.dirname(current_directory))\n",
-    "\n",
-    "# print(two_levels_up_directory)\n",
-    "\n",
-    "# sys.path.append(two_levels_up_directory)"
+    "# sys.path.append(\".\")\n",
+    "# sys.path.append(\"..\")\n",
+    "# sys.path.append(\"../..\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Import Libraries"
    ]
   },
   {
diff --git a/example/chatbot/huggingface_model_demo.py b/example/chatbot/demo_launch_app_gpu_huggingface.py
similarity index 71%
rename from example/chatbot/huggingface_model_demo.py
rename to example/chatbot/demo_launch_app_gpu_huggingface.py
index e2cbc0e..fe72694 100644
--- a/example/chatbot/huggingface_model_demo.py
+++ b/example/chatbot/demo_launch_app_gpu_huggingface.py
@@ -1,4 +1,16 @@
-"""Demo for the chatbot application."""
+"""
+Demo for the chatbot application using open-source LLMs from Huggingface.
+
+- Prerequisites:
+    To run this demo, you need a `pykoi` environment with the `huggingface` option.
+    You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu) 
+    to set up the environment. 
+- Run the demo:
+    1. In a terminal, from the `~/pykoi` directory, run
+        ```
+        python -m example.chatbot.demo_launch_app_gpu_huggingface
+        ```
+"""
 from pykoi import Application
 from pykoi.chat import ModelFactory
 from pykoi.chat import QuestionAnswerDatabase
diff --git a/example/chatbot/peft_huggingface_model_demo.py b/example/chatbot/demo_launch_app_gpu_huggingface_peft.py
similarity index 72%
rename from example/chatbot/peft_huggingface_model_demo.py
rename to example/chatbot/demo_launch_app_gpu_huggingface_peft.py
index be4b8b5..fe82e47 100644
--- a/example/chatbot/peft_huggingface_model_demo.py
+++ b/example/chatbot/demo_launch_app_gpu_huggingface_peft.py
@@ -1,4 +1,17 @@
-"""Demo for the chatbot application."""
+"""
+Demo for the chatbot application using open-source LLMs from Huggingface with PEFT.
+
+- Prerequisites:
+    To run this demo, you need a `pykoi` environment with the `huggingface` option.
+    You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu) 
+    to set up the environment. 
+- Run the demo:
+    1. In a terminal, from the `~/pykoi` directory, run
+        ```
+        python -m example.chatbot.demo_launch_app_gpu_huggingface_peft
+        ```
+"""
+
 from pykoi import Application
 from pykoi.chat import ModelFactory
 from pykoi.chat import QuestionAnswerDatabase
diff --git a/example/chatbot/demo_model_comparator_cpu_openai.ipynb b/example/comparator/demo_model_comparator_cpu_openai.ipynb
similarity index 96%
rename from example/chatbot/demo_model_comparator_cpu_openai.ipynb
rename to example/comparator/demo_model_comparator_cpu_openai.ipynb
index 80a4ff9..e75aed6 100644
--- a/example/chatbot/demo_model_comparator_cpu_openai.ipynb
+++ b/example/comparator/demo_model_comparator_cpu_openai.ipynb
@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Demo: LLMs Comparison via OpenAI\n",
+    "# Demo: LLM Comparison between OpenAI Models\n",
     "\n",
     "`pykoi` provides simple API to compare between LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM comparison app for OpenAI/Anthropic/Bedrock APIs. Let's get started!\n",
     "\n",
@@ -34,6 +34,13 @@
     "# sys.path.append(\"../..\")"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Import Libraries"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
diff --git a/example/chatbot/demo_model_comparator_cpu_openai.py b/example/comparator/demo_model_comparator_cpu_openai.py
similarity index 97%
rename from example/chatbot/demo_model_comparator_cpu_openai.py
rename to example/comparator/demo_model_comparator_cpu_openai.py
index c11c080..483c3f3 100644
--- a/example/chatbot/demo_model_comparator_cpu_openai.py
+++ b/example/comparator/demo_model_comparator_cpu_openai.py
@@ -9,7 +9,7 @@
     1. Enter your OpenAI API key in the `api_key` below.
     2. On terminal and `~/pykoi` directory, run 
         ```
-        python -m example.chatbot.demo_model_comparator_cpu_openai
+        python -m example.comparator.demo_model_comparator_cpu_openai
         ```
 """
 
diff --git a/example/chatbot/demo_model_comparator_gpu_huggingface.ipynb b/example/comparator/demo_model_comparator_gpu_huggingface.ipynb
similarity index 76%
rename from example/chatbot/demo_model_comparator_gpu_huggingface.ipynb
rename to example/comparator/demo_model_comparator_gpu_huggingface.ipynb
index 92614e9..1799762 100644
--- a/example/chatbot/demo_model_comparator_gpu_huggingface.ipynb
+++ b/example/comparator/demo_model_comparator_gpu_huggingface.ipynb
@@ -1,5 +1,23 @@
 {
  "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Demo: Comparison between Open-source LLMs\n",
+    "\n",
+    "`pykoi` provides simple API to compare between LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs. This demo shows how to create and launch an LLM comparison app for open-source LLMs from Huggingface. Let's get started!\n",
+    "\n",
+    "### Prerequisites\n",
+    "To run this jupyter notebook, you need a `pykoi` environment with the `huggingface` option. You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu) to set up the environment. \n",
+    "\n",
+    "You may also need to `pip install ipykernel` to run the notebook kernel.\n",
+    "\n",
+    "\n",
+    "### (Optional) Developer setup\n",
+    "If you are a regular user of `pykoi`, you can skip this step. However, if you have modified the `pykoi` source code and want to test your changes, uncomment the code below."
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
@@ -9,19 +27,18 @@
     "# %reload_ext autoreload\n",
     "# %autoreload 2\n",
     "\n",
-    "# import os\n",
     "# import sys\n",
     "\n",
-    "# # Add the root folder to the module search path\n",
-    "# # Get the current directory\n",
-    "# current_directory = os.getcwd()\n",
-    "\n",
-    "# # Move two levels up (go to the parent directory of the parent directory)\n",
-    "# two_levels_up_directory = os.path.dirname(os.path.dirname(current_directory))\n",
-    "\n",
-    "# print(two_levels_up_directory)\n",
-    "\n",
-    "# sys.path.append(two_levels_up_directory)"
+    "# sys.path.append(\".\")\n",
+    "# sys.path.append(\"..\")\n",
+    "# sys.path.append(\"../..\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Import Libraries"
    ]
   },
   {
@@ -42,21 +59,8 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Demo: LLMs Comparison\n",
-    "\n",
-    "`pykoi` provides simple API to compare between LLMs, including your own finetuned LLM, a pretrained LLM from huggingface, or OpenAI/Anthropic/Bedrock APIs.\n",
-    "\n",
-    "This demo shows how to create and launch an LLM comparison app. Let's get started!\n",
-    "\n",
-    "## Load LLMs\n",
-    "\n"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "#### Creating a Huggingface model (requires at least EC2 `g4.2xlarge` or GPU with 16G memory)"
+    "### Load LLMs\n",
+    "#### Creating a Huggingface model (requires at least an EC2 `g4dn.xlarge` instance or a GPU with at least 16GB memory)"
    ]
   },
   {
@@ -65,7 +69,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "## requires a GPU with at least 16GB memory (e.g. g4.2xlarge)\n",
+    "## requires a GPU with at least 16GB memory (e.g. g4dn.xlarge)\n",
     "huggingface_model_1 = ModelFactory.create_model(\n",
     "    model_source=\"huggingface\",\n",
     "    pretrained_model_name_or_path=\"tiiuae/falcon-rw-1b\",\n",
@@ -78,7 +82,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "## requires a GPU with at least 16GB memory (e.g. g4.2xlarge)\n",
+    "## requires a GPU with at least 16GB memory (e.g. g4dn.2xlarge)\n",
     "huggingface_model_2 = ModelFactory.create_model(\n",
     "    model_source=\"huggingface\",\n",
     "    pretrained_model_name_or_path=\"databricks/dolly-v2-3b\",\n",
@@ -102,9 +106,9 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Create a chatbot comparator\n",
+    "### Create a chatbot comparator\n",
     "\n",
-    "### Add `nest_asyncio` \n",
+    "#### Add `nest_asyncio` \n",
     "Add `nest_asyncio` to avoid error. Since we're running another interface inside a Jupyter notebook where an asyncio event loop is already running, we'll encounter the error. (since The uvicorn.run() function uses asyncio.run(), which isn't compatible with a running event loop.)"
    ]
   },
@@ -134,7 +138,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "### Add ngrok auth (TODO: change to bash file)"
+    "#### Add ngrok auth (TODO: change to bash file)"
    ]
   },
   {
diff --git a/example/chatbot/demo_model_comparator_gpu_huggingface.py b/example/comparator/demo_model_comparator_gpu_huggingface.py
similarity index 81%
rename from example/chatbot/demo_model_comparator_gpu_huggingface.py
rename to example/comparator/demo_model_comparator_gpu_huggingface.py
index d033143..7ddec16 100644
--- a/example/chatbot/demo_model_comparator_gpu_huggingface.py
+++ b/example/comparator/demo_model_comparator_gpu_huggingface.py
@@ -1,4 +1,16 @@
-"""Demo for the chatbot application using multiple model endpoint."""
+"""
+Demo for the chatbot application comparing multiple open-source LLMs from Huggingface.
+
+- Prerequisites:
+    To run this demo, you need a `pykoi` environment with the `huggingface` option.
+    You can follow [the installation guide](https://github.com/CambioML/pykoi/tree/install#option-2-rag-gpu) 
+    to set up the environment. 
+- Run the demo:
+    1. In a terminal, from the `~/pykoi` directory, run
+        ```
+        python -m example.comparator.demo_model_comparator_gpu_huggingface
+        ```
+"""
 from pykoi import Application
 from pykoi.chat import ModelFactory
 from pykoi.component import Compare