Making the LLM providers more generic #10
Conversation
src/chat-handler.ts
Outdated
    if (this._llmClient === null) {
      const botMsg: IChatMessage = {
        id: UUID.uuid4(),
        body: '**Chat client not configured**',
Maybe we can open a new issue to track translating this string (and maybe others) with the ITranslator?
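A minimal sketch of what routing this string through JupyterLab's translation machinery could look like. The interfaces below are reduced local stand-ins for the `@jupyterlab/translation` types, and the function name `chatNotConfiguredMessage` and the `'jupyterlab'` domain are illustrative assumptions, not code from this PR:

```typescript
// Stand-ins for the @jupyterlab/translation types (assumption: reduced shape).
interface TranslationBundle {
  __(msgid: string): string;
}

interface ITranslator {
  load(domain: string): TranslationBundle;
}

// Identity translator, mirroring JupyterLab's nullTranslator fallback.
const nullTranslator: ITranslator = {
  load: (_domain: string) => ({ __: (msgid: string) => msgid })
};

// Hypothetical helper: builds the bot message body from a translated string,
// keeping the markdown bold markers outside the translatable msgid.
function chatNotConfiguredMessage(translator: ITranslator): string {
  const trans = translator.load('jupyterlab');
  return `**${trans.__('Chat client not configured')}**`;
}
```

With the identity translator the helper returns the original English string; a real `ITranslator` injected by the application would substitute the localized one.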
src/chat-handler.ts
Outdated
      this._llmClient = options.llmClient;
    }

    get llmClient(): BaseChatModel | null {
Wondering if "provider" could also be a good name as an alternative to llmClient? Or something else similar to what Jupyter AI uses?
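For context, a sketch of the accessor pair the hunk above belongs to, under whichever name is settled on. `BaseChatModel` here is a reduced stand-in for the LangChain type, and the surrounding `ChatHandler` shape is an assumption based on the file name, not the PR's actual class body:

```typescript
// Stand-in for LangChain's BaseChatModel (assumption: reduced shape).
interface BaseChatModel {
  modelName: string;
}

// Sketch of the handler's accessor pair: the chat handler holds a nullable
// client so the extension can run before any provider is configured.
class ChatHandler {
  get llmClient(): BaseChatModel | null {
    return this._llmClient;
  }

  set llmClient(client: BaseChatModel | null) {
    this._llmClient = client;
  }

  private _llmClient: BaseChatModel | null = null;
}
```

Renaming to `provider` would only touch the accessor names; the nullable-field pattern stays the same either way.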
src/index.ts
Outdated
      id: 'jupyterlab-codestral:llm-provider',
      autoStart: true,
      requires: [ICompletionProviderManager, ISettingRegistry],
      provides: ILlmProvider,
ILlmProvider may read a bit awkward with the change of casing between the two "L". Maybe ILLMProvider or IAIProvider could also work? (or something else)
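Whatever the token ends up being called, the overall plugin shape from the hunk above looks roughly like this. The `Token` and `JupyterFrontEndPlugin` types are reduced local stand-ins for `@lumino/coreutils` and `@jupyterlab/application`, the `requires` entries are omitted, and the `ILlmProvider` members shown are illustrative assumptions:

```typescript
// Stand-in for @lumino/coreutils Token (assumption: reduced shape).
class Token<T> {
  constructor(readonly name: string) {}
}

// Stand-in for @jupyterlab/application JupyterFrontEndPlugin
// (assumption: reduced shape; `requires` omitted for brevity).
interface JupyterFrontEndPlugin<T> {
  id: string;
  autoStart: boolean;
  provides?: Token<T>;
  activate: (...args: unknown[]) => T;
}

// Hypothetical provider interface; the real members live in the PR diff.
interface ILlmProvider {
  name: string;
}

const ILlmProviderToken = new Token<ILlmProvider>(
  'jupyterlab-codestral:ILlmProvider'
);

// The plugin from the hunk above, with activate() returning the provider
// object that downstream plugins consume through the token.
const plugin: JupyterFrontEndPlugin<ILlmProvider> = {
  id: 'jupyterlab-codestral:llm-provider',
  autoStart: true,
  provides: ILlmProviderToken,
  activate: () => ({ name: 'MistralAI' })
};
```

Renaming the token to ILLMProvider or IAIProvider would change only the identifier and the token string, not this wiring.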
Thanks @brichet!
Looks like we can get this one in and iterate more on it.
It would be interesting to follow up by adding support for another provider, for example OpenAI. Edit: opened #13
This PR adds the possibility of changing the LLM provider, even if only MistralAI is available at the moment.
A single inline completion provider is registered with the manager.
This inline completion provider relies on a completer specific to the LLM in use, which makes the actual calls to the LLM API.
Adding a new LLM would mainly consist of implementing a completer specific to that LLM.
TODO, probably in a follow-up PR: dynamically update the available settings when the selected provider has changed.
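The delegation described above (one registered provider, one completer per LLM) can be sketched as follows. The names `IBaseCompleter`, `MistralCompleter`, and `GenericInlineCompletionProvider` are illustrative assumptions standing in for the PR's actual classes, and the Mistral completer is stubbed rather than calling the real API:

```typescript
// Assumed interface each LLM-specific completer implements.
interface IBaseCompleter {
  fetch(prompt: string): Promise<string[]>;
}

// Stubbed MistralAI completer; a real one would call the MistralAI API here.
class MistralCompleter implements IBaseCompleter {
  async fetch(prompt: string): Promise<string[]> {
    return [`<mistral completion for: ${prompt}>`];
  }
}

// The single provider registered with the completion manager. It owns no
// API logic itself and delegates every request to the current completer.
class GenericInlineCompletionProvider {
  completer: IBaseCompleter | null = null;

  async fetch(prompt: string): Promise<string[]> {
    if (this.completer === null) {
      // No LLM configured yet: return no suggestions rather than failing.
      return [];
    }
    return this.completer.fetch(prompt);
  }
}
```

Under this shape, supporting a new LLM means writing another `IBaseCompleter` implementation and swapping it in when the settings change, which is also where the dynamic-settings TODO would hook in.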