Chat Engine #15418
Unanswered
d3buggerdan asked this question in Q&A
Replies: 1 comment 4 replies
-
To create a chat engine with the "best" chat mode, you can use the following code:

```python
# Assuming `index` is already created and configured
chat_engine = index.as_chat_engine(chat_mode="best", verbose=True)
```

To include chat history, you can seed the conversation with existing messages and build the engine through the low-level API:

```python
from llama_index.core.chat_engine import CondenseQuestionChatEngine
from llama_index.core.llms import ChatMessage, MessageRole

custom_chat_history = [
    ChatMessage(
        role=MessageRole.USER,
        content="Hello assistant, we are having an insightful discussion about Paul Graham today.",
    ),
    ChatMessage(role=MessageRole.ASSISTANT, content="Okay, sounds good."),
]

# Assuming `query_engine` is already created and configured
chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    chat_history=custom_chat_history,
    verbose=True,
)

# Now you can have a conversation with your data
response = chat_engine.chat("Tell me a joke.")
print(response)
```

This code sets up a chat engine with the "best" chat mode and seeds it with some initial chat history. For more detailed information, you can refer to the LlamaIndex Chat Engines documentation [1][2].
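The condense-question pattern that `CondenseQuestionChatEngine` implements can be sketched in plain Python. This is a toy illustration of the idea, not the LlamaIndex API: the engine folds the chat history and the latest user message into a single standalone question before sending it to the query engine; here a template string stands in for the LLM rewrite step.

```python
# Toy sketch of the condense-question pattern: prior turns plus the
# new message are combined into one standalone prompt. In the real
# engine, this prompt is sent to an LLM, and the rewritten question
# is then passed to the query engine.

def condense_question(chat_history, new_message):
    """Fold prior turns and the latest message into one prompt string."""
    transcript = "\n".join(
        f"{role.upper()}: {content}" for role, content in chat_history
    )
    return (
        "Given the conversation below, rewrite the follow-up message "
        "as a standalone question.\n"
        f"{transcript}\n"
        f"FOLLOW-UP: {new_message}"
    )

history = [
    ("user", "We are discussing Paul Graham today."),
    ("assistant", "Okay, sounds good."),
]
prompt = condense_question(history, "What did he work on?")
print(prompt)
```

Because the history is baked into the rewritten question, the downstream query engine stays stateless, which is why this mode works with any existing `query_engine`.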
-
@dosu
How can I make this with the best or context mode and have chat history?

```python
# Initialize custom chat history
custom_chat_history = [
]

# Create the chat engine using the low-level API
chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    # condense_question_prompt=custom_prompt,
    chat_history=custom_chat_history,
    verbose=True,
)
```
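For the "context" mode the question mentions, the underlying pattern can also be sketched in plain Python. The retriever and message assembly below are toy stand-ins, not the LlamaIndex API: in context mode, text retrieved for each user message is injected into the system prompt, and the full chat history is sent to the LLM alongside it.

```python
# Toy sketch of the "context" chat mode: retrieved text goes into the
# system prompt, and the whole chat history travels with each request.

def retrieve(query):
    # Stand-in for something like index.as_retriever().retrieve(query)
    return ["Paul Graham co-founded Y Combinator."]

def context_chat(history, user_message):
    """Assemble the message list a context-mode engine would send to an LLM."""
    context = "\n".join(retrieve(user_message))
    system = f"Answer using this context:\n{context}"
    return [("system", system)] + history + [("user", user_message)]

msgs = context_chat(
    [("user", "Hello"), ("assistant", "Hi!")],
    "Who founded Y Combinator?",
)
```

Unlike the condense-question pattern, this keeps the history as separate messages, so seeding it (e.g. via a `chat_history` argument) changes what the LLM sees on every turn rather than just the first rewritten question.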