Switch to using langchain.js #6
Conversation
src/handler.ts
Outdated
this._history.messages.map(msg => {
  if (msg.sender.username === 'User') {
    return new HumanMessage(msg.body);
  }
  return new SystemMessage(msg.body);
Do we need to stack all the previous messages?
I see that it was already the case before, but I wonder whether it is necessary; it can become huge for a long conversation.
Yes, it can also increase the payload of the requests, but it gives the full context of the conversation, as there is otherwise no way to have some kind of "memory".
But maybe we can try using some higher-level primitives offered by langgraph to hide this logic: https://js.langchain.com/docs/how_to/message_history/
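One simple mitigation, while waiting for those higher-level primitives, would be to cap how many messages get stacked. A minimal hand-rolled sketch; the ChatMessage shape and the MAX_HISTORY value below are illustrative assumptions, not code from this PR:

```typescript
// Minimal stand-in for the handler's message shape (illustrative only).
interface ChatMessage {
  sender: { username: string };
  body: string;
}

// Assumed tuning knob: keep only the most recent messages so the request
// payload stays bounded instead of growing with the whole conversation.
const MAX_HISTORY = 20;

function trimHistory(
  messages: ChatMessage[],
  max: number = MAX_HISTORY
): ChatMessage[] {
  // Short histories are returned untouched; long ones keep only the tail.
  return messages.length <= max ? messages : messages.slice(messages.length - max);
}
```

LangChain.js also ships utilities in this space (see the message-history guide linked above), which would avoid maintaining such logic by hand.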
Also it looks like this SystemMessage should be switched to an AIMessage: https://js.langchain.com/docs/concepts/#aimessage
> Yes, it can also increase the payload of the requests, but it gives the full context of the conversation, as there is otherwise no way to have some kind of "memory".
Right, I was wondering if there could be some kind of "session" when using the API, and whether this history could be stored on the server providing the API.
> But maybe we can try using some higher-level primitives offered by langgraph to hide this logic: https://js.langchain.com/docs/how_to/message_history/
We can probably keep it that way for a first PR, and think about how to improve the history in future issues / PRs.
> Also it looks like this SystemMessage should be switched to an AIMessage: https://js.langchain.com/docs/concepts/#aimessage
Good catch.
Also the bot messages were not added to the handler history, nor to the API messages.
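A sketch of what the corrected mapping could look like. The HumanMessage and AIMessage classes below are minimal stand-ins for the @langchain/core/messages ones, so the snippet stays self-contained; the sender check mirrors the diff above:

```typescript
// Minimal stand-ins for the @langchain/core/messages classes (illustration only).
class HumanMessage {
  constructor(public content: string) {}
}
class AIMessage {
  constructor(public content: string) {}
}

// Stand-in for the handler's stored message shape.
interface ChatMessage {
  sender: { username: string };
  body: string;
}

// User turns become HumanMessage; bot turns become AIMessage rather than
// SystemMessage, matching the review suggestion above.
function toLangchainMessages(
  history: ChatMessage[]
): (HumanMessage | AIMessage)[] {
  return history.map(msg =>
    msg.sender.username === 'User'
      ? new HumanMessage(msg.body)
      : new AIMessage(msg.body)
  );
}
```

This also makes it natural to append the bot's replies to the handler history, since they round-trip through the same mapping.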
@@ -29,6 +27,10 @@ const inlineProviderPlugin: JupyterFrontEndPlugin<void> = {
    manager: ICompletionProviderManager,
    settingRegistry: ISettingRegistry
  ): void => {
    const mistralClient = new MistralAI({
Also we'll probably want to revisit this later to handle multiple providers:
- do we need to instantiate both a MistralAI and a ChatMistralAI client?
- maybe there should be one plugin to provide the AI clients, that other plugins would consume
- offer a way to set the API key without going through the settings system, since the settings are stored in the browser in JupyterLite
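The "one plugin provides the AI clients, other plugins consume them" idea could be sketched roughly as below. Everything here is hypothetical: in JupyterLab this would normally be a Lumino Token exposed through the plugin's provides field, but a plain registry class is used to keep the sketch self-contained:

```typescript
// Hypothetical interface that a single "AI provider" plugin would implement
// and that completion/chat plugins would consume (sketch only, not this PR's code).
interface IAIClientRegistry {
  registerClient(name: string, client: unknown): void;
  getClient(name: string): unknown | undefined;
}

class AIClientRegistry implements IAIClientRegistry {
  private _clients = new Map<string, unknown>();

  // The provider plugin registers each configured client under a name,
  // e.g. 'mistral' for a completion client, 'chat-mistral' for a chat client.
  registerClient(name: string, client: unknown): void {
    this._clients.set(name, client);
  }

  // Consumer plugins look clients up by name instead of instantiating
  // their own MistralAI / ChatMistralAI instances.
  getClient(name: string): unknown | undefined {
    return this._clients.get(name);
  }
}
```

A registry like this would also give one central place to inject an API key outside the settings system, which matters in JupyterLite where settings live in the browser.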
Experiments towards #5