Implement conversation context and streaming with OllamaSharp #310
base: main
Conversation
- …ue to Ollama misconfiguration
- …nd port; Precise warning message
- …sible null exception
@kborowinski Thanks for making improvements to the ollama agent!
```xml
    <SuppressNETCoreSdkPreviewMessage>true</SuppressNETCoreSdkPreviewMessage>
    <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
```
```xml
    <PackageReference Include="OllamaSharp" Version="4.0.7" />
  </ItemGroup>

  <ItemGroup>
```
```diff
 if (Process.GetProcessesByName("ollama").Length is 0)
 {
-    host.RenderFullResponse("Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
+    host.MarkupWarningLine($"[[{Name}]]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
```
Suggested change:

```diff
-host.MarkupWarningLine($"[[{Name}]]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
+host.WriteErrorLine($"[{Name}]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
```
Also, given that the prompt will be `@ollama` when the agent is in use, maybe we don't need to include the `[{Name}]` part in the error message.
```xml
<configuration>
  <packageSources>
    <clear />
    <add key="PowerShell_PublicPackages" value="https://pkgs.dev.azure.com/powershell/PowerShell/_packaging/powershell/nuget/v3/index.json" />
    <add key="NuGet" value="https://api.nuget.org/v3/index.json" />
```
I have saved the `OllamaSharp` package to the MS feed (we're required to use only that feed), so you can revert the changes in this file.
```csharp
string SampleContent = $$"""
{
    // To use Ollama API service:
    // 1. Install Ollama:
    //      winget install Ollama.Ollama
    // 2. Start Ollama API server:
    //      ollama serve
    // 3. Install Ollama model:
    //      ollama pull phi3

    // Declare Ollama model
    "Model": "phi3",
    // Declare Ollama endpoint
    "Endpoint": "http://localhost:11434",
    // Enable Ollama streaming
    "Stream": false
}
""";
```
Suggested change:

```csharp
string SampleContent = """
{
    // To use Ollama API service:
    // 1. Install Ollama: `winget install Ollama.Ollama`
    // 2. Start Ollama API server: `ollama serve`
    // 3. Install Ollama model: `ollama pull phi3`

    // Declare Ollama model
    "Model": "phi3",
    // Declare Ollama endpoint
    "Endpoint": "http://localhost:11434",
    // Enable Ollama streaming
    "Stream": false
}
""";
```
host.WriteErrorLine($"[{Name}]: {e.Message}"); | ||
host.WriteErrorLine($"[{Name}]: Selected Model : \"{_settings.Model}\""); | ||
host.WriteErrorLine($"[{Name}]: Selected Endpoint : \"{_settings.Endpoint}\""); | ||
host.WriteErrorLine($"[{Name}]: Configuration File: \"{SettingFile}\""); |
Nitpick: when using the ollama agent, the prompt will be `@ollama`, so maybe we don't need to add the `[{Name}]:` part in the error message.
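Applied to the calls above, that would read roughly like this sketch (same variables as in the diff; not code from the PR):

```csharp
// Same diagnostics without the [{Name}]: prefix, since the @ollama prompt
// already identifies which agent is speaking.
host.WriteErrorLine(e.Message);
host.WriteErrorLine($"Selected Model : \"{_settings.Model}\"");
host.WriteErrorLine($"Selected Endpoint : \"{_settings.Endpoint}\"");
host.WriteErrorLine($"Configuration File: \"{SettingFile}\"");
```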
```csharp
_request = new GenerateRequest()
{
    Model = _settings.Model,
    Stream = _settings.Stream
};
```
Given that we will assign `Model` and `Stream` per query, maybe we just do the following:
Suggested change:

```diff
-_request = new GenerateRequest()
-{
-    Model = _settings.Model,
-    Stream = _settings.Stream
-};
+_request = new GenerateRequest();
```
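For context, a minimal sketch of the per-query flow this suggestion implies (the `PrepareRequest` helper name is an assumption, not code from the PR):

```csharp
// Created once, without eagerly copying settings.
private GenerateRequest _request = new();

// Called at the start of every query, so settings changes picked up by the
// file watcher take effect on the very next request.
private void PrepareRequest()
{
    _request.Model = _settings.Model;
    _request.Stream = _settings.Stream;
}
```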
```csharp
private string _model;
private string _endpoint;
private bool _stream;
```
You can consider removing those fields. You can make the properties read-only (e.g. `public string Model { get; }`) and initialize them in the constructor.
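A minimal sketch of that shape, assuming a `ConfigData` input type (hypothetical; the real constructor arguments may differ):

```csharp
internal sealed class Settings
{
    // Read-only properties replace the private _model/_endpoint/_stream fields.
    public string Model { get; }
    public string Endpoint { get; }
    public bool Stream { get; }

    public Settings(ConfigData data)
    {
        Model = data.Model;
        Endpoint = data.Endpoint;
        Stream = data.Stream;
    }
}
```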
```csharp
GenerateDoneResponseStream ollamaLastStream = null;

// Directly process the stream when no spinner is needed
await foreach (var ollamaStream in _client.GenerateAsync(_request, token))
```
In my experiments, there is a several-second delay before the streaming starts, so we should still use the spinner for the call `_client.GenerateAsync(_request, token)` to make it more user friendly.
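One way to do that, as a sketch: keep the spinner up only until the first chunk arrives (`RunWithSpinnerAsync` is an assumed host helper here; the actual AIShell host API may differ):

```csharp
// Start the stream, but await the first chunk under the spinner, since Ollama
// can take several seconds before it begins emitting tokens.
var enumerator = _client.GenerateAsync(_request, token).GetAsyncEnumerator(token);
bool hasChunk = await host.RunWithSpinnerAsync(
    async () => await enumerator.MoveNextAsync());

// Once the first chunk is in, stream the rest without the spinner.
while (hasChunk)
{
    var ollamaStream = enumerator.Current;
    // ... render ollamaStream.Response incrementally ...
    hasChunk = await enumerator.MoveNextAsync();
}
```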
```csharp
}

// Update request context
_request.Context = ollamaLastStream.Context;
```
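Putting the two snippets above together, a sketch of the whole loop (the `RenderChunk` call is a placeholder, not the PR's exact rendering code):

```csharp
GenerateDoneResponseStream ollamaLastStream = null;

await foreach (var ollamaStream in _client.GenerateAsync(_request, token))
{
    // Render each partial response as it arrives.
    RenderChunk(ollamaStream.Response);

    // The final chunk of a generation carries the conversation context.
    if (ollamaStream is GenerateDoneResponseStream doneStream)
    {
        ollamaLastStream = doneStream;
    }
}

// Feed the context back so the next request continues the conversation.
_request.Context = ollamaLastStream?.Context;
```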
PR Summary

This PR refactors the `AIShell.Ollama.Agent` to use the `OllamaSharp` library, enhancing functionality and maintainability. Key updates include support for conversation context, streaming, improved settings management, and better error handling.

Important: If you choose this PR, then close the "Implement settings file for Ollama agent" PR.

PR Context

This update improves the `AIShell.Ollama.Agent` by:

- Switching to OllamaSharp: the agent now talks to the Ollama API through the `OllamaApiClient`.
- Introducing Context Support: the conversation context is carried across queries.
- Enhancing Configuration Management: settings are read from `ollama.config.json`, with real-time updates using `FileSystemWatcher` (see the sketch after this list).
- Improving Chat Interaction: required settings (`Model` and `Endpoint`) are validated to ensure proper functionality.

This PR also aligns with prerequisites outlined in issue #155 and improves the user experience with clearer feedback and enhanced features.
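As a sketch of the real-time updates piece, assuming a dedicated watcher class (the `SettingsWatcher` name and reload callback are illustrative, not the PR's actual code):

```csharp
using System;
using System.IO;

// Watches ollama.config.json and triggers a reload when it changes.
internal sealed class SettingsWatcher : IDisposable
{
    private readonly FileSystemWatcher _watcher;

    public SettingsWatcher(string settingsDir, Action reloadSettings)
    {
        _watcher = new FileSystemWatcher(settingsDir, "ollama.config.json")
        {
            NotifyFilter = NotifyFilters.LastWrite,
            EnableRaisingEvents = true,
        };
        // Re-read the settings file whenever it is written to.
        _watcher.Changed += (_, _) => reloadSettings();
    }

    public void Dispose() => _watcher.Dispose();
}
```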