diff --git a/docker/README.md b/docker/README.md
index da70e3e..56181b2 100644
--- a/docker/README.md
+++ b/docker/README.md
@@ -10,7 +10,7 @@ To run a `pykoi` Docker container, launch a GPU instance with the following conf
- Deep Learning AMI PyTorch GPU 2.0.1 (Ubuntu 20.04)
- EBS: at least 100G
-
+
### Installing Docker on your EC2
@@ -146,10 +146,28 @@ docker rm [CONTAINER_NAME]
```
# Building Custom Docker Images
-In this folder, we create the different dockerfiles for using pykoi.
+This folder contains the different Dockerfiles for using `pykoi`.
+
+## Building from the Repo
+Some of these examples build from the `pykoi` repository. For example, see the Dockerfile in the `pykoi-retrieval-huggingface` directory.
+
+To build the Docker image, first ensure you are in the base directory of the `pykoi` repo. __You will encounter errors if you are not in the base directory.__
+
+Then, you can run the following command:
+```bash
+docker build -t [name]:[tag] . -f [file_path_to_dockerfile]
+```
+
+Here's an example of building the `pykoi-retrieval-huggingface` Dockerfile.
+```bash
+docker build -t pykoi:0.1 . -f docker/pykoi-retrieval-huggingface/Dockerfile
+```
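+
+Once the image is built, you can start a container from it with GPU access. This is a sketch, not part of the original instructions: the `--gpus all` flag assumes the NVIDIA Container Toolkit is installed on the host, and the published port `5000` is an assumption — adjust it to whichever port the `pykoi` app actually listens on.
+```bash
+# Run the image built above; --gpus all exposes the host GPUs to the container.
+# The host:container port mapping (5000:5000) is an assumed example value.
+docker run --rm --gpus all -p 5000:5000 pykoi:0.1
+```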
+
+## Building from the `pykoi` Library
+There are also examples that build on the `pykoi` package. These examples install `pykoi` via `pip` and run different applications from there. They rely on the latest version released on PyPI.
1. `pykoi-cpu`: The base image for the cpu-based usage.
-2. `pykoi-cpu-custom`: When you run this docker image, try to modify the `app.py` and mount it when running the docker container.
+1. `pykoi-cpu-custom`: When you run this Docker image, try modifying `app.py` and mounting it into the container at run time.
To run a docker container, we can use the following command:
```bash
diff --git a/Dockerfile b/docker/pykoi-retrieval-huggingface/Dockerfile
similarity index 80%
rename from Dockerfile
rename to docker/pykoi-retrieval-huggingface/Dockerfile
index 919a07d..4e2946a 100644
--- a/Dockerfile
+++ b/docker/pykoi-retrieval-huggingface/Dockerfile
@@ -1,5 +1,5 @@
# Use a Python base image
-FROM python:3.10
+FROM pytorch/pytorch:2.1.1-cuda12.1-cudnn8-runtime
# Set working directory
WORKDIR /app/
@@ -17,19 +17,20 @@ RUN pip install poetry
# Copy only necessary project files into the container
COPY pyproject.toml poetry.lock /app/
-
# Install project dependencies using Poetry
RUN poetry config virtualenvs.create false \
&& poetry install --no-root --extras "rag huggingface" \
- && pip uninstall -y torch \
&& pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu121 \
+ && pip uninstall -y torchvision \
&& rm -rf /root/.cache/
-
# Copy the project files into the container
COPY example /app/example
COPY pykoi /app/pykoi
+# Set the necessary environment variables to enable GPU
+ENV NVIDIA_VISIBLE_DEVICES=all
+ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
ENV RETRIEVAL_MODEL=databricks/dolly-v2-3b
# Set entrypoint to run your command