- Objective
A simple chat interface for comparing answers from several chat models available on the market.
- Installation
First, install Ollama on your PC from the official site: https://ollama.com
Then, from the directory where you want to install it, run the following commands:
ollama pull llama3:latest
git clone https://github.com/jeromedejaegher/chat_interface.git
conda create --name <<NEW_ENV_NAME>> python=3.12
conda activate <<NEW_ENV_NAME>>
cd chat_interface
mkdir logs
pip install -r requirements.txt
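Before launching the app, you can check that the local Ollama server is running and that the pulled model is available. Below is a minimal sketch, assuming Ollama's default local endpoint (`http://localhost:11434`) and its `/api/tags` endpoint; the helper function is my own illustration, not part of this repo:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def installed_model_names(tags_response: dict) -> list[str]:
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in tags_response.get("models", [])]


def check_server() -> None:
    """Query the running Ollama server and report whether llama3 is pulled."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        tags = json.load(resp)
    names = installed_model_names(tags)
    print("llama3:latest installed:", "llama3:latest" in names)


# Call check_server() once `ollama serve` is running.
```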
- Run the code
From the chat_interface directory, run:
streamlit run main.py
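Under the hood, one chat turn against a local Ollama model amounts to a POST to its `/api/chat` endpoint. The sketch below shows such a call, assuming Ollama's default port (11434); the function names are illustrative and not the repo's actual code:

```python
import json
import urllib.request

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for the Ollama /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask(model: str, prompt: str) -> str:
    """Send one chat turn and return the model's answer (needs a running server)."""
    data = json.dumps(build_chat_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_CHAT_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]


# Example (requires `ollama serve` and a pulled model):
# print(ask("llama3:latest", "Say hello in one sentence."))
```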