Chat: Add configuration for Ollama chat server #3358
Closed
Follow-up to #3282 (specifically #3282 (comment)): make the URL for the chat completion endpoint configurable, defaulting to the standard Ollama localhost endpoint.
- Add `experimentalOllamaChatApiEndpoint` to the `Configuration` interface
- Update `ollamaChatClient` to use the configurable API endpoint
- Add `apiEndpoint` to the `CompletionParameters` interface
- Add the `cody.experimental.ollamaChat.ApiEndpoint` setting to the VS Code extension
- Use `chatClient.chat` in `SimpleChatPanelProvider`
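
A minimal sketch of how these pieces could fit together. The field and setting names come from this PR; the surrounding type shapes, the default constant, and the resolver helper are illustrative assumptions, not the actual implementation:

```typescript
// Sketch only: Configuration gains the new experimental field.
interface Configuration {
    experimentalOllamaChatApiEndpoint?: string
    // ...other existing fields omitted
}

// Sketch only: CompletionParameters carries the endpoint through to the client.
interface CompletionParameters {
    apiEndpoint?: string
    // ...other existing fields omitted
}

// Assumption: Ollama's default local HTTP API listens on localhost:11434,
// with chat completions served from /api/chat.
const DEFAULT_OLLAMA_CHAT_ENDPOINT = 'http://localhost:11434/api/chat'

// Hypothetical helper: resolve the endpoint the chat client should call,
// falling back to the local Ollama default when no endpoint is configured.
function resolveOllamaChatEndpoint(params: CompletionParameters): string {
    return params.apiEndpoint ?? DEFAULT_OLLAMA_CHAT_ENDPOINT
}
```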
This allows a custom API endpoint for the Ollama chat models, enabling setups where the chat server is hosted remotely or separately from the autocomplete server.
Test plan
Update the new configuration setting and verify that chat works against the configured endpoint.
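
For example, assuming the setting key lands as spelled above, pointing chat at a remotely hosted Ollama server could look like this in the user's settings.json (the host is a placeholder):

```json
{
    "cody.experimental.ollamaChat.ApiEndpoint": "http://ollama.example.internal:11434/api/chat"
}
```

Removing the setting should then fall chat back to the local Ollama default.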