A collaboration between Oori and the Denver subgroup of the Rocky Mountain AI Interest Group
We help people gather for hands-on development in GenAI, offering basic guidance and expert support during the lab event
AI Labs - Hands-On Exploration of Agentic AI, 19 Nov 2024 at Venture X Denver North
Break up into groups of 2 or 3, usually with one primary host for the GenAIOps. All attendees should ideally bring a laptop, but the GenAIOps host requires one.
Notes on suggested configs below. These are intended partly to align with what the facilitators can most readily help with, and partly to reduce the need for paid or registered third-party services.
Suggested configs and procedures are just suggestions! Feel free to use different tools or approaches, and then teach your fellow lab participants about them!
We recommend prepping for an AI Lab session by doing the following:
- Review the configs below to assess which would work best on the laptop you bring
- Review, and better yet install, any of the mentioned tools
- Think of a simple, real-life problem you've encountered where GenAI could help
Primary host on an M1/M2/M3/M4 MacBook with at least 8GB of RAM, ideally 16GB
Use the MLX library to host local, high-performance GenAI models running on the GPU (see their examples)
Install Python 3.12 from python.org, via the "macOS 64-bit universal2 installer"
Alternatively: Install Python 3.12 via Homebrew, but the facilitators will not be able to help as readily with this step
Set up a virtual environment (venv). A good name for this is ailab
Make sure you can run the following command in the venv: pip install mlx mlx_lm
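The macOS steps above can be sketched as the following commands (a sketch, assuming Python 3.12 from python.org is on your PATH as python3.12; the smoke-test line just confirms MLX imports and reports its default device):

```shell
# macOS setup sketch for the MLX path
python3.12 -m venv ailab        # create the virtual environment named ailab
source ailab/bin/activate       # activate it; the prompt should show "(ailab)"
pip install mlx mlx_lm          # install MLX and its language-model utilities

# Smoke test: confirm MLX imports and see which device it will use
python -c 'import mlx.core as mx; print(mx.default_device())'
```

On Apple Silicon the smoke test should report a GPU device; if the import fails, re-check that the venv is active.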
TBD
Primary host on a Linux laptop, preferably with a modern GPU; CPU-only works but will be much slower.
Use Ollama
Make sure you have a 3.10 or more recent Python environment.
Set up a virtual environment (venv). A good name for this is ailab
DO NOT proceed outside a venv.
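The venv steps above can be sketched as (assuming your system python3 meets the version requirement):

```shell
# Linux venv setup sketch
python3 -m venv ailab          # create the virtual environment named ailab
source ailab/bin/activate      # activate it; the prompt should show "(ailab)"
```

Any pip installs done after activation stay inside the ailab directory and won't touch your system Python.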
Install Ollama:
curl -fsSL https://ollama.com/install.sh | sh
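Once Ollama is installed, you can pull and chat with a model from the command line (a sketch; llama3.2 is just an example model name, and any model from the Ollama library works):

```shell
# Fetch a small model, then run an interactive/one-shot prompt against it
ollama pull llama3.2
ollama run llama3.2 "Explain what a virtual environment is in one sentence."
```

Note that `ollama pull` downloads several gigabytes, so doing this before the lab session over a fast connection is worthwhile.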
Optionally, also install Open WebUI.
pip install open-webui
Test that you can launch it:
open-webui serve
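By default Open WebUI listens on port 8080; a sketch assuming the `--port` flag available in current releases, in case that port is already taken:

```shell
# Serve Open WebUI on an explicit port (8080 is also the default)
open-webui serve --port 8080
# then browse to http://localhost:8080 and point it at the local Ollama instance
```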