
feat: Add summarizing and quoting capabilities to 05-assistive-chatbot #30

Merged: 12 commits from yl/v2_household_engine into main on Jun 3, 2024

Conversation

yoomlam (Collaborator) commented May 30, 2024

Ticket

Partly resolves https://navalabs.atlassian.net/browse/DST-218

Changes

Populate prototype code in 05-assistive-chatbot with the summarizing and quoting capabilities from 02-household-queries chatbot.

Update settings handling for LLM clients and do some general refactoring.

Context for reviewers

This provides baseline summarizing and quoting capabilities for the prototype.

Testing

Testing instructions and expected behavior:

  1. pip install -r requirements.txt
  2. Update .env (a hypothetical sketch of example values follows these steps)
  3. To clear and populate the DB, run: rm -rf chroma_db && ./ingest-guru-cards.py
  4. Try it on the command line, for example:
CHATBOT_LOG_LEVEL=INFO CHAT_ENGINE=Summaries    LLM_MODEL_NAME='openai :: gpt-3.5-turbo-instruct' LLM_MODEL_NAME_2='openai :: gpt-3.5-turbo-instruct' RETRIEVE_K=2 ./cmdline.py

CHATBOT_LOG_LEVEL=INFO CHAT_ENGINE=Summaries-DSPy LLM_MODEL_NAME='dspy :: gpt-3.5-turbo-instruct' LLM_MODEL_NAME_2='openai :: gpt-3.5-turbo-instruct' RETRIEVE_K=2 ./cmdline.py
  5. Try it via the web app (to ensure summaries are generated correctly, pick an *instruct model for both LLMs):
./chatbot-chainlit.py
  • Open a browser to http://localhost:8000/
  • Open the Settings panel by clicking on the icon to the left of the bottom text input field.
  • Choose "Summaries" as the Chat Mode (synonymous with CHAT_ENGINE for the cmdline.py app) and pick openai :: gpt-3.5-turbo-instruct as the Primary LLM and LLM Model for summarizer. Adjust the two Temperature... settings to play with the responses.
  • Click Confirm.
  • Type in any question and see derived questions, Guru cards, their summaries, and quotes:
(screenshot: chatbot response showing derived questions, Guru cards, summaries, and quotes)
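For step 2, here is a hypothetical sketch of what .env might contain. Aside from the variables shown in the commands above, the names and values (notably OPENAI_API_KEY) are assumptions; the repository's actual .env example may differ.

# Assumed example; the real .env for this repo may use different variable names.
OPENAI_API_KEY=sk-...          # needed for the 'openai ::' and 'dspy ::' models above
# Optional defaults mirroring the command-line examples (can also be set per run):
CHATBOT_LOG_LEVEL=INFO
CHAT_ENGINE=Summaries
LLM_MODEL_NAME='openai :: gpt-3.5-turbo-instruct'
LLM_MODEL_NAME_2='openai :: gpt-3.5-turbo-instruct'
RETRIEVE_K=2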

from chatbot.ingest.text_splitter import TextSplitter

logger = logging.getLogger(f"chatbot.{__name__}")


yoomlam (Collaborator, Author) commented May 30, 2024:
moved the following to a reusable vector_db.py so that the same default settings can be used by the chatbot to retrieve from the DB

Comment on lines +153 to +154
if quote != card:
card_to_quotes[card].add(quote)
yoomlam (Collaborator, Author) commented:

If the quote == card title, then don't use it as a quote b/c the card title will already be shown in the UI and the title/question is not particularly helpful as a quote.
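For context, a minimal sketch of how such a mapping could be built; retrieved_pairs and the type hints are hypothetical, and only the quote != card guard comes from the diff above.

from collections import defaultdict

# Hypothetical sketch; retrieved_pairs yields (card title, quoted passage) tuples.
card_to_quotes: dict[str, set[str]] = defaultdict(set)
for card, quote in retrieved_pairs:
    if quote != card:  # skip quotes that merely repeat the card title shown in the UI
        card_to_quotes[card].add(quote)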

)


ingest_vectordb_wrapper = LocalLangchainChromaVectorDb()
yoomlam (Collaborator, Author) commented:
ingest_vectordb_wrapper provides the same default settings for use by ingest-guru-cards.py and the chatbot.
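To illustrate the shared-defaults idea, here is a hedged sketch of what a reusable vector_db.py wrapper might look like; the embedding model, collection name, and method names are assumptions, not the PR's actual implementation.

import logging

from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Chroma

logger = logging.getLogger(f"chatbot.{__name__}")


class LocalLangchainChromaVectorDb:
    """One place for the Chroma defaults shared by ingestion and retrieval."""

    def __init__(self, persist_directory="./chroma_db"):
        # Assumed embedding model; the repo may configure a different one.
        self.embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
        self.vectordb = Chroma(
            collection_name="guru_cards",
            embedding_function=self.embeddings,
            persist_directory=persist_directory,
        )

    def retriever(self, k):
        # Both ingest-guru-cards.py and the chatbot read from the same store.
        return self.vectordb.as_retriever(search_kwargs={"k": k})


# Module-level instance imported by both the ingestion script and the chatbot.
ingest_vectordb_wrapper = LocalLangchainChromaVectorDb()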

return {
"content": "\n".join(resp + dq_resp + cards_resp),
"elements": [
# cl.Text(name="Derived Questions", content="\n".join(dq_resp), display="side"),
A collaborator commented:

Is this still needed?

yoomlam (Collaborator, Author) replied Jun 3, 2024:

I'll leave it for now as an example of how to use cl.Text with different display parameters -- it's not intuitive.
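For anyone reading along, a small Chainlit sketch (not code from this PR) of how the display parameter of cl.Text changes the rendering; the element name and message text are made up.

import chainlit as cl

# Illustrative only: with display="side" the element is linked wherever its
# name appears in the message content; "inline" renders it beneath the
# message; "page" opens it on a separate page.
@cl.on_message
async def on_message(message: cl.Message):
    derived = cl.Text(
        name="Derived Questions",
        content="1. ...\n2. ...",
        display="side",
    )
    await cl.Message(
        content="See Derived Questions for the generated sub-questions.",
        elements=[derived],
    ).send()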

self.vectordb_wrapper = vector_db.ingest_vectordb_wrapper
self.retrieve_k = int(settings.pop("retrieve_k"))

# TODO: ingestigate if this should be set to true
A collaborator commented:

investigate*

ccheng26 (Collaborator) left a comment:

no blocking issues, lgtm!

yoomlam merged commit f1464cd into main on Jun 3, 2024
1 check passed
yoomlam deleted the yl/v2_household_engine branch June 3, 2024 15:21