# CustomGPT-RAG

Chat bot using LLM models and custom fine-tuning with RAG.
## Table of Contents

- Introduction
- Installation
- Usage
- Code Overview
- Local GPT UI
- Ingesting Documents
- Running Locally
- API Integration
- Contributing
- License
## Introduction

CustomGPT-RAG is a chatbot project that leverages large language models (LLMs) and Retrieval-Augmented Generation (RAG) to provide enhanced, context-aware responses. The project supports custom fine-tuning and can run entirely without an internet connection, making it well suited for handling sensitive data.
## Installation

To get started with CustomGPT-RAG, follow these steps:

- Clone the repository:

  ```shell
  git clone https://github.com/Jakee4488/CustomGPT-RAG.git
  cd CustomGPT-RAG
  ```

- Install the necessary dependencies:

  ```shell
  pip install -r requirements.txt
  ```
## Usage

To run the UI locally, use the following command:

```shell
python localGPTUI/localGPTUI.py --host 0.0.0.0 --port 5111
```

This will start the UI on `localhost:5111`.
To ingest documents into the system, use the UI to upload the files. Supported formats include text, PDF, CSV, and Excel files. The application processes these documents to create a comprehensive database for the model.
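The repository's exact loader selection isn't reproduced here; as an illustration only, dispatching a file to a loader by its extension might look like the sketch below. The function and registry names are hypothetical, and real PDF/CSV/Excel loaders require external parsing libraries.

```python
from pathlib import Path

def load_plain_text(path: str) -> str:
    """Trivial loader for .txt files."""
    return Path(path).read_text(encoding="utf-8")

def load_pdf(path: str) -> str:
    # Placeholder: a real implementation would use a PDF parsing library.
    raise NotImplementedError("PDF parsing requires an external library")

# Hypothetical registry; ingest.py may choose loaders differently.
LOADER_BY_EXTENSION = {
    ".txt": load_plain_text,
    ".pdf": load_pdf,
    # .csv and .xlsx would map to tabular loaders in the same way
}

def load_document(path: str) -> str:
    """Dispatch a file to the loader registered for its extension."""
    suffix = Path(path).suffix.lower()
    if suffix not in LOADER_BY_EXTENSION:
        raise ValueError(f"unsupported format: {suffix}")
    return LOADER_BY_EXTENSION[suffix](path)
```

Unsupported extensions are rejected early, so a malformed upload fails before any processing begins.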
## Code Overview

- `localGPTUI/localGPTUI.py`: Defines the Flask application that renders the UI and handles user inputs.
  - Manages the web interface using Flask.
  - Handles user prompts and document uploads.
  - Interacts with backend endpoints to process user inputs.
- `ingest.py`: Handles the ingestion of documents.
  - Loads documents from a specified source directory.
  - Splits documents into manageable chunks.
  - Creates embeddings using specified models and stores them in a vectorstore.
- `run_localGPT.py`: Implements the main logic for the local GPT.
  - Sets up the local QA system with the specified device type and options.
  - Runs an interactive loop for user queries and returns answers based on ingested documents.
- `run_localGPT_API.py`: Provides the backend API.
  - Defines API endpoints to delete and save documents, run ingestion, and handle user prompts.
  - Executes ingestion scripts and manages the document database.
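As described above, ingestion splits each document into overlapping chunks before embedding. A minimal, dependency-free sketch of that chunking idea follows; the chunk size and overlap values are illustrative, not the repository's defaults.

```python
def split_into_chunks(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks with a small overlap, so content
    spanning a chunk boundary appears in both neighbouring chunks."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # advance by less than chunk_size to overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks
```

Each chunk would then be embedded and written to the vectorstore; the overlap keeps sentences that straddle a boundary retrievable from either side.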
## Local GPT UI

The UI is built using Flask and renders templates defined in `localGPTUI/templates/home.html`. The main functionalities include:

- Uploading and managing documents.
- Submitting user prompts and displaying responses.
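A stripped-down sketch of how such a Flask UI can handle a prompt round-trip is shown below. The inline template and the `user_prompt` field name are assumptions for illustration; the real app renders `home.html` and forwards prompts to the backend rather than echoing them.

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

# Hypothetical inline template; the real UI renders
# localGPTUI/templates/home.html and may use different form fields.
PAGE = (
    "<form method='post'>"
    "<input name='user_prompt'><button>Ask</button>"
    "</form><p>{{ answer }}</p>"
)

@app.route("/", methods=["GET", "POST"])
def home():
    answer = ""
    if request.method == "POST":
        prompt = request.form.get("user_prompt", "")
        # The real app would forward this to the backend's
        # /api/prompt_route endpoint; this stub just echoes it.
        answer = f"You asked: {prompt}"
    return render_template_string(PAGE, answer=answer)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5111)
```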
## Ingesting Documents

Documents can be ingested by uploading them through the UI. The backend processes these files, creating embeddings and storing them for efficient retrieval during QA sessions.
## Running Locally

To run the application locally, follow the usage instructions to start the UI and handle document ingestion. Ensure that all dependencies are installed and the necessary directories are set up.
## API Integration

The project provides several API endpoints to manage documents and handle user prompts:

- `/api/delete_source`: Deletes and recreates the source document folder.
- `/api/save_document`: Saves uploaded documents to the server.
- `/api/run_ingest`: Runs the document ingestion process.
- `/api/prompt_route`: Handles user prompts and returns responses.
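To illustrate how these four endpoints fit together, here is a minimal Flask stub. The source folder name, form field names, and HTTP methods are assumptions, and the real handlers run the ingestion script and query the model rather than echoing placeholders.

```python
import shutil
from pathlib import Path

from flask import Flask, jsonify, request

app = Flask(__name__)
SOURCE_DIR = Path("SOURCE_DOCUMENTS")  # folder name is an assumption

@app.route("/api/delete_source", methods=["GET"])
def delete_source():
    # Delete and recreate the source document folder.
    shutil.rmtree(SOURCE_DIR, ignore_errors=True)
    SOURCE_DIR.mkdir(parents=True, exist_ok=True)
    return jsonify({"message": f"Recreated {SOURCE_DIR}"})

@app.route("/api/save_document", methods=["POST"])
def save_document():
    file = request.files.get("document")  # field name is an assumption
    if file is None or not file.filename:
        return "No file received", 400
    SOURCE_DIR.mkdir(parents=True, exist_ok=True)
    file.save(SOURCE_DIR / file.filename)
    return "File saved", 200

@app.route("/api/run_ingest", methods=["GET"])
def run_ingest():
    # The real app shells out to ingest.py here; this stub just reports.
    SOURCE_DIR.mkdir(parents=True, exist_ok=True)
    count = len(list(SOURCE_DIR.glob("*")))
    return f"{count} document(s) ready for ingestion", 200

@app.route("/api/prompt_route", methods=["POST"])
def prompt_route():
    prompt = request.form.get("user_prompt", "")  # field name is an assumption
    # The real app queries the QA chain; this stub echoes the prompt.
    return jsonify({"Prompt": prompt, "Answer": "(model answer here)"})
```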
## Contributing

Contributions are welcome! Please fork the repository and submit a pull request with your changes. Ensure your code adheres to the project's coding standards and includes appropriate tests.
## License

This project is licensed under the MIT License. See the LICENSE file for more details.