Build a private Generative AI on a local laptop with Rancher Desktop and K3s


Introduction - Develop & Deploy a GenAI App with Ollama and Open WebUI on K3s in Rancher Desktop

In this session, we're diving into an exciting lineup of tools and technologies to empower your development and deployment processes.

Rancher Desktop with K3s: Rancher Desktop is an open-source application developed by SUSE that brings Kubernetes and container management to your desktop. We'll set up a local Kubernetes environment using Rancher Desktop and K3s, simulating a production-like environment so you can test and refine your deployments effectively.

OpenWebUI with Ollama: Open WebUI is an application with an intuitive interface that provides GenAI capabilities. We'll explore how to use the Mistral LLM through its API to enhance your application development, integrating advanced GenAI functionality to make the application smarter and more intuitive.

OpenDGR API Gateway: OpenDGR, an open-source project developed by TPI Software, is an API gateway and management solution designed to act as an intermediary between client applications and backend services. It provides a range of features to manage and secure API traffic, ensuring efficient communication and data flow.

Rancher Fleet: Rancher Fleet is a Continuous Delivery (GitOps) tool that manages deployments from a single Kubernetes cluster up to large-scale fleets of multiple Kubernetes clusters. We'll deploy our GenAI application to a multi-cluster environment using Rancher Fleet to automate and manage large-scale deployments, ensuring consistency and reliability across your infrastructure.

By the end of this workshop, you'll have a comprehensive understanding of these tools and how they can streamline your development workflow from local testing to production deployment. Let's get started and make the most of this collaborative learning experience!

Table of Contents:

  • Task 1 - Setup Rancher Desktop and deploy GenAI app OpenWebUI with Ollama
  • Task 2 - Deploy the OpenDGR API gateway to protect the GenAI app
  • Task 3 - Multi-Cluster Deployment with Rancher Fleet (Optional)

System Requirements

To complete this lab, you need a laptop (quad-core, 16GB RAM, and 50GB of free SSD disk space) with fast and stable internet access, running one of the following operating systems.

  • Windows 10,
  • MacBook Apple Silicon M1 or above, or
  • Linux (e.g. OpenSUSE Leap)

Task 1 - Setup Rancher Desktop and Deploy GenAI app Open WebUI with Ollama

Set up the development environment. The intent is to develop everything within containers.

Get K3S and Rancher Desktop up and running

  1. Download the latest stable version of Rancher Desktop (v1.16 at the time of this writing) from the Rancher Desktop website and install it on your laptop.

  2. Configure the virtual machine used by Rancher Desktop (under Preferences, Virtual Machine tab) with 10GB RAM and 4 vCPUs.

01-rancher-desktop-preference

  3. Configure the container engine used by Rancher Desktop (under Preferences, Container Engine tab) to dockerd (moby).

01-rancher-desktop-container-engine

  4. Configure the Kubernetes version used by Rancher Desktop (under Preferences, Kubernetes tab) to v1.28.5.

01-rancher-desktop-container-engine

  5. Enable resource monitoring by navigating to Extensions and installing Resource usage.

01-rancher-desktop-extension-install-resource-usage

  6. Check the Resource usage dashboard by navigating to Resource usage. 01-rancher-desktop-resource-usage

  7. After the Kubernetes service (k3s) is up and running, open a terminal console to access the cluster. You should now have access to your local K3s cluster.

❯ kubectl get node
NAME                   STATUS   ROLES                  AGE   VERSION
lima-rancher-desktop   Ready    control-plane,master   76d   v1.28.5+k3s1

Let's deploy the GenAI app Open WebUI with Ollama into our local k3s cluster.

  1. Prepare the open-webui-values-k3s.yaml file.
ollama:
  image:
    tag: 0.3.9
  resources:
    requests:
      cpu: "2000m"
      memory: "2Gi"
    limits:
      cpu: "4000m"
      memory: "6Gi"
      nvidia.com/gpu: "0"
  service:
    type: ClusterIP
  gpu:
    enabled: false
  models: ["mistral:7b"]
  persistentVolume:
    enabled: true
    size: 20Gi

resources:
  requests:
    cpu: "500m"
    memory: "500Mi"
  limits:
    cpu: "1000m"
    memory: "1Gi"
service:
  type: NodePort
  2. Add the Helm repo for Open WebUI.
helm repo add open-webui https://helm.openwebui.com/
helm repo update
  3. Deploy open-webui with embedded Ollama onto your local k3s cluster.
kubectl create ns myfirstgenai
helm upgrade --install open-webui-ollama open-webui/open-webui \
  --namespace myfirstgenai \
  --create-namespace \
  --values open-webui-values-k3s.yaml
  4. Check the deployment status.
❯ kubectl get all -n myfirstgenai
NAME                                      READY   STATUS    RESTARTS   AGE
pod/open-webui-pipelines-bd86b5bc-nzpvb   1/1     Running   0          7d
pod/open-webui-0                          1/1     Running   0          7d
pod/open-webui-ollama-5d6b97fc9f-kjzqw    1/1     Running   0          7d

NAME                           TYPE        CLUSTER-IP      EXTERNAL-IP   PORT(S)        AGE
service/open-webui             NodePort    10.43.110.192   <none>        80:32574/TCP   7d
service/open-webui-pipelines   ClusterIP   10.43.231.90    <none>        9099/TCP       7d
service/open-webui-ollama      ClusterIP   10.43.94.196   <none>        11434/TCP      7d

NAME                                   READY   UP-TO-DATE   AVAILABLE   AGE
deployment.apps/open-webui-pipelines   1/1     1            1           7d
deployment.apps/open-webui-ollama      1/1     1            1           7d

NAME                                            DESIRED   CURRENT   READY   AGE
replicaset.apps/open-webui-pipelines-bd86b5bc   1         1         1       7d
replicaset.apps/open-webui-ollama-5d6b97fc9f    1         1         1       7d

NAME                          READY   AGE
statefulset.apps/open-webui   1/1     7d
  5. Enable port forwarding for open-webui and open-webui-ollama by navigating to Port Forwarding.
  • forward open-webui to port 8080
  • forward open-webui-ollama to port 11434

02-rancher-desktop-port-forwarding-1

  6. Navigate to http://127.0.0.1:8080, sign up your first user account, and sign in.

02-openwebui-1

  7. Download the Mistral LLM from Open WebUI.

02-openwebui-2

  8. Ask a question to see if the local LLM works. For example: why is the sky blue? please answer in less than 10 words

02-openwebui-3

  9. Test the Ollama API with the curl command.
curl http://127.0.0.1:11434/api/chat -d '
{
  "model": "mistral",
  "stream": false,
  "messages": [
    { "role": "user", "content": "why is the sky blue?  please answer in less than 10 words." }
  ]
} '
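The same request can be issued from Python. Below is a minimal sketch using only the standard library; it assumes the port-forward to 11434 from the earlier step is active, and the helper names (build_chat_payload, ask) are illustrative, not part of the Ollama project.

```python
import json
import urllib.request

# Assumes the Rancher Desktop port-forward for open-webui-ollama is active.
OLLAMA_URL = "http://127.0.0.1:11434/api/chat"

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for the Ollama /api/chat endpoint."""
    return {
        "model": model,
        "stream": False,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(prompt: str, model: str = "mistral") -> str:
    """Send the prompt to Ollama and return the assistant's reply text."""
    body = json.dumps(build_chat_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

# Usage (with the port-forward active):
#   print(ask("why is the sky blue? please answer in less than 10 words."))
```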

Task 2 - Deploy the OpenDGR API gateway to protect the GenAI app

Let's deploy OpenDGR onto our local k3s cluster to secure access to the GenAI apps.

  1. Deploy OpenDGR with a single curl command.
curl -s https://raw.githubusercontent.com/TPIsoftwareOSPO/digiRunner_Open/refs/heads/master/manifest/open_dgr.yaml | kubectl apply -f -
  2. Enable port forwarding for open-dgr-svc by navigating to Port Forwarding.

    • forward open-dgr-svc to port 18080

    image-20241016205503

  3. Navigate to http://127.0.0.1:18080/dgrv4/ac4/login and log in as the OpenDGR manager.

  • Login account: manager
  • Password: manager123 03-opendgrui-login
  4. Check the Ollama API service's internal cluster IP.
❯ kubectl get svc -n myfirstgenai

NAME                           TYPE        CLUSTER-IP      EXTERNAL-IP   PORT(S)        AGE
service/open-webui             NodePort    10.43.110.192   <none>        80:32574/TCP   7d
service/open-webui-pipelines   ClusterIP   10.43.231.90    <none>        9099/TCP       7d
service/open-webui-ollama      ClusterIP   10.43.94.196   <none>        11434/TCP      7d

  5. Navigate to the API registry (under API Management, API Registry) and register the chat API.
  • Target URL : http://<service/open-webui-ollama-clusterIP>:11434/api/chat # replace with the ollama API service internal cluster IP above
  • API Name : chat
  • digiRunner Proxy Path : chat
  • Http Methods : POST
  • No Auth : Yes 03-opendgrui-dashboard
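As an alternative to the cluster IP (which changes if the Service is recreated), the Service's in-cluster DNS name should also work as the target URL, assuming the cluster's default DNS configuration:

```
http://open-webui-ollama.myfirstgenai.svc.cluster.local:11434/api/chat
```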
  6. Navigate to API List and enable the Chat API. 03-opendgrui-api-list

  7. Navigate to API Test and test the Ollama API.

{
  "model": "mistral",
  "stream": false,
  "messages": [
    { "role": "user", "content": "why is the sky blue?  please answer in less than 10 words." }
  ]
} 

03-opendgrui-api-test

  8. API test result with Ollama. image-20241015160909483

Task 3 - Multi-Cluster Deployment with Rancher Fleet (Optional)

The GenAI app works great, but manually deploying to and updating multiple clusters is too time-consuming. Let's adopt a GitOps approach to maintain the GenAI app.

  1. Go to the Rancher Server home page, click the three-line menu icon at the top left to expand the navigation menu, then click Continuous Delivery.

rancher-fleet-homepage

Before we proceed, let's verify that we can see all our clusters in Continuous Delivery.

rancher-fleet-cluster-list

With Rancher Fleet, you can manage individual clusters or groups of clusters. Managing clusters via groups reduces administrative effort.

  2. Now we will create a Cluster Group.

Navigate to Cluster Group and click on Create.

rancher-fleet-cluster-group-create

Give it the name edge.

Under Cluster Selector, click Add Rule and provide the following values:

Key: env

Operator: in list

Value: edge

We are going to use the same label that was used to create the edge01 and edge02 clusters.

rancher-fleet-cluster-group-edge
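The same cluster group can also be declared as a manifest. A sketch of the equivalent Fleet ClusterGroup resource is shown below, assuming your clusters are registered in the default fleet-default workspace namespace:

```yaml
# ClusterGroup matching clusters labelled env=edge (mirrors the UI rule above)
apiVersion: fleet.cattle.io/v1alpha1
kind: ClusterGroup
metadata:
  name: edge
  namespace: fleet-default
spec:
  selector:
    matchExpressions:
      - key: env
        operator: In
        values: ["edge"]
```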

  3. Once you key in the key:value pair, Rancher will use the selector labels to identify the clusters to be associated with our newly created cluster group in Rancher Continuous Delivery. You should see it match both existing clusters.

Click on Create which will create our first Cluster Group.

rancher-fleet-cluster-group-added

We can click into the edge cluster group for resource details.

rancher-fleet-cluster-group-details

  4. Configure a Git repository.

We will use the fleet-examples Git repo to deploy the Kubernetes sample guestbook application. The app will be deployed into the default namespace.

You can also fork my fleet-examples Git repository (https://github.com/akiliuhk/fleet-examples) for testing.

On the Git Repos page, click Add Repository.

rancher-fleet-git-repo-add

  • Enter fleet-examples as your git repo Name

  • Enter https://github.com/akiliuhk/fleet-examples (the fleet-examples git repo URL) in Repository URL

  • Enter Paths: simple, leave all other parameters at their default settings, then click Next

A sample of the GitRepo configuration is shown below.

rancher-fleet-git-repo-add-details1

Scroll to the bottom. rancher-fleet-git-repo-add-details2

In Step 2, from the Deploy to Target dropdown list, select the cluster group we created previously (edge) and click Create.

rancher-fleet-git-repo-add-details-step2

We have successfully completed the Rancher Continuous Delivery (GitOps) configuration.

rancher-fleet-git-repo-list

  5. Click into the fleet-examples Git repo; you can expect the example app to be deployed to the cluster group within a minute.

rancher-fleet-git-repo-status

When there are commits or updates in the Git repo, Fleet (which by default checks the repo for changes every 15 seconds) will deploy the new changes to the cluster group automatically.
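The GitRepo created through the UI corresponds to a manifest like the following sketch. The field values mirror the configuration above; pollingInterval is shown as an illustration of how the default check interval can be overridden, assuming the fleet-default workspace namespace:

```yaml
# GitRepo deploying the simple path to the edge cluster group
apiVersion: fleet.cattle.io/v1alpha1
kind: GitRepo
metadata:
  name: fleet-examples
  namespace: fleet-default
spec:
  repo: https://github.com/akiliuhk/fleet-examples
  paths:
    - simple
  pollingInterval: 15s   # how often Fleet polls the repo for changes
  targets:
    - clusterGroup: edge
```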

Conclusion

We successfully set up a Rancher Desktop with K3s to create a local Kubernetes environment. Next, we deployed the GenAI application, OpenWebUI with Ollama, into the local K3s cluster. Finally, we secured the GenAI app using the OpenDGR API gateway.

As a next step, we will demonstrate how to use Rancher Fleet to deploy the GenAI applications into a production environment, ensuring scalability and reliability.

By following these steps, you can effectively simulate a production environment, test deployments, and manage the GenAI app as you deploy it to large-scale environments.
