AI Assistant

This project is an attempt to learn how to take advantage of LLMs and build a better product. Documents are first converted into a vector database. Whenever a user asks a question, the query is sent to the vector database, which returns relevant context via similarity search. The question and context are then passed to the LLM with a prompt that instructs it to answer only from the provided context. Chat history is also supported, so each session stays relevant to the user.
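
In code, the query path looks roughly like the sketch below: a hedged illustration using the LangChain Chroma and Google GenAI integrations, not the exact contents of app.py. The persist directory, model names, and prompt wording are assumptions.

    # A minimal sketch of the query path (persist directory, model names,
    # and prompt wording are assumptions, not taken from this repository).
    from langchain_community.vectorstores import Chroma
    from langchain_google_genai import GoogleGenerativeAIEmbeddings, ChatGoogleGenerativeAI

    embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
    vectordb = Chroma(persist_directory="metadata", embedding_function=embeddings)
    llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")

    def answer(question: str, history: list[str]) -> str:
        # Retrieve the chunks most similar to the question.
        docs = vectordb.similarity_search(question, k=4)
        context = "\n\n".join(doc.page_content for doc in docs)
        history_text = "\n".join(history)
        prompt = (
            "Answer the question using only the context below. "
            "If the answer is not in the context, say you don't know.\n\n"
            f"Chat history:\n{history_text}\n\n"
            f"Context:\n{context}\n\n"
            f"Question: {question}"
        )
        return llm.invoke(prompt).content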

Demo

  1. Select the Data Source
    Choose the data source for querying.

    (Screenshot: Datasource Selection)

  2. Ask the Assistant
    Interact with the assistant by asking questions related to your selected data source.

    (Screenshot: Ask Question)

Installation

To set up the AI Assistant, follow these steps:

  1. Install required dependencies:
    pip install -r requirements.txt
  2. Obtain a Google Gemini API key and add it to the .env file (see the example after these steps):
    cp env_sample .env
  3. Create the necessary directories:
    mkdir -p data/personal_notion metadata
  4. Copy your documents into the data/personal_notion folder.
  5. Convert documents to a vector database:
    python data.py
  6. Launch the application:
    streamlit run app.py
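
For reference, the .env usually only needs the Gemini API key. The snippet below is a hypothetical example; the exact variable name comes from env_sample, so check that file.

    # Hypothetical .env contents -- confirm the variable name against env_sample
    GOOGLE_API_KEY=your-gemini-api-key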

Let's Understand the Architecture

When you run python data.py, it converts the documents into a vector database. Whenever you ask a question, the system retrieves relevant context from the vector database and sends it to the LLM using a structured prompt.

(Diagram: Architecture)
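
As a rough illustration of what an ingestion script like data.py can do with this stack: load documents, split them into chunks, embed the chunks, and persist them to Chroma. Directory names, chunk sizes, and the embedding model below are assumptions, and DirectoryLoader may require extra packages (such as unstructured) for some file types.

    # Load documents, split them into chunks, embed the chunks with Gemini
    # embeddings, and persist the result to a local Chroma database.
    from langchain_community.document_loaders import DirectoryLoader
    from langchain_community.vectorstores import Chroma
    from langchain_google_genai import GoogleGenerativeAIEmbeddings
    from langchain_text_splitters import RecursiveCharacterTextSplitter

    docs = DirectoryLoader("data/personal_notion").load()
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
    chunks = splitter.split_documents(docs)

    embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
    Chroma.from_documents(chunks, embeddings, persist_directory="metadata")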

Tech Stack

  1. LangChain
  2. ChromaDB
  3. Streamlit
  4. Google Gemini

Potential Use Cases

Here are a few ways you can utilize LLMs for different projects:

  1. Configuration Generator
    Automatically generate configurations based on input data.

  2. NLP to SQL
    Convert natural language queries to SQL statements.

  3. Document Summarizer
    Summarize lengthy documents into key points.

  4. Teacher
    Use the assistant to answer educational questions and help with learning.

You can extend the LLM by integrating custom tools and functions, giving it access to specific data and functionalities based on your needs.
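
With LangChain this can be as simple as binding a custom tool to the chat model. The sketch below is hypothetical: the lookup_order_status tool and the model name are assumptions, and tool calling requires a Gemini model that supports it.

    # Bind a custom (hypothetical) tool to the chat model so the LLM can request it.
    from langchain_core.tools import tool
    from langchain_google_genai import ChatGoogleGenerativeAI

    @tool
    def lookup_order_status(order_id: str) -> str:
        """Return the status of an order from an internal system."""
        return f"Order {order_id}: shipped"  # placeholder logic

    llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash").bind_tools([lookup_order_status])
    reply = llm.invoke("What's the status of order 42?")
    print(reply.tool_calls)  # the model may ask to call lookup_order_status here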

Star History

(Chart: Star History)
