ideas from meeting with David and more! #1
Comments
Alain, please create a branch from feature/fiware_integration in order to try out the local model using the CB.
Sorry @dncampo, @javicond3, I did not see your message; I forked the repo to my account: https://github.com/agaldemas/Poc_LLM_CB_FIWARE. Tell me if you want me to create the branch anyway...
I will do a new branch for Flowise integration...
Regarding your idea: I had to use openai.chat.completions, which is working, but we lose the context of the conversation :( ... => this is a feature we can introduce using Flowise (which uses LangChain under the hood).
Yes, threads are currently only supported by OpenAI. You can send the context (or a summary) in every iteration, but that is not efficient... Does Flowise support it?
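The "send the context in every iteration" workaround mentioned above can be sketched as follows. This is a minimal illustration, not the project's actual code: it only builds the `messages` list that would be resent with each `openai.chat.completions` call (the call itself is left commented out, and the system prompt text is invented for illustration).

```python
# Sketch: chat.completions calls are stateless, so the running conversation
# has to be resent on every request to preserve context.

MAX_TURNS = 6  # naive cap; beyond this, older turns should be summarized or dropped

def build_messages(history, user_input,
                   system_prompt="You are a FIWARE POI assistant."):
    """Assemble the messages list sent on every request."""
    recent = history[-MAX_TURNS:]  # simple truncation; a rolling summary would scale better
    return ([{"role": "system", "content": system_prompt}]
            + recent
            + [{"role": "user", "content": user_input}])

history = []
messages = build_messages(history, "Find museums near Malaga")
# reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
# history += [messages[-1],
#             {"role": "assistant", "content": reply.choices[0].message.content}]

print(len(messages))        # system + user on the first turn
print(messages[0]["role"])
```

Flowise/LangChain memory components do essentially this bookkeeping (plus optional summarization) automatically.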
You can do this in many different ways with Flowise: manage the prompt, store context, inject embeddings prepared in a vector DB, etc....
- introduce Flowise in the loop
- use a local LLM (Ollama + a compatible model)
- add vector storage to store retrieved POIs as embeddings, then use it as a retriever to augment the prompt instead of passing the Context Broker results directly...
- add an additional agent that can search the web and add some of its results to the response
- format the output better (transform the returned Markdown into HTML for display)
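The vector-storage idea in the list above could look roughly like this. It is a self-contained sketch with a toy in-memory store and cosine similarity; the `embed` function is a stand-in for a real embedding model (which would come from Ollama or OpenAI in practice), and the POI entries are invented for illustration.

```python
import math
from collections import Counter

def embed(text):
    """Stand-in embedding: character-bigram counts.
    A real setup would call an embedding model instead."""
    return Counter(text[i:i + 2].lower() for i in range(len(text) - 1))

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class PoiStore:
    """Tiny in-memory vector store: index POI texts, retrieve top-k for a query."""
    def __init__(self):
        self.items = []  # list of (text, vector)

    def add(self, text):
        self.items.append((text, embed(text)))

    def retrieve(self, query, k=2):
        qv = embed(query)
        ranked = sorted(self.items, key=lambda it: cosine(qv, it[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

store = PoiStore()
for poi in ["Museo Picasso, art museum in Malaga",
            "Alcazaba, Moorish fortress in Malaga",
            "Beach bar on the Costa del Sol"]:
    store.add(poi)

# The retrieved POIs would be injected into the prompt,
# instead of dumping the raw Context Broker results.
context = store.retrieve("art museums in Malaga", k=2)
print(context[0])
```

In the Flowise version, this whole class is replaced by a vector-store node plus a retriever node; the sketch just shows the data flow (embed POIs once, retrieve the most similar ones per question).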