
Implement conversation context and streaming with OllamaSharp #310

Open

wants to merge 12 commits into main

Conversation

@kborowinski commented Nov 21, 2024

PR Summary

This PR refactors the AIShell.Ollama.Agent to use the OllamaSharp library, enhancing functionality and maintainability. Key updates include support for conversation context, streaming, improved settings management, and better error handling.

Important:
If you choose this PR, then close "Implement settings file for Ollama agent".

PR Context

This update improves the AIShell.Ollama.Agent by:

  1. Switching to OllamaSharp:
    • Simplifies API calls with OllamaApiClient.
    • Supports both streaming and non-streaming responses.
  2. Introducing Context Support:
    • Enables the agent to remember previous responses, improving conversational flow (see the streaming sketch below).
  3. Enhancing Configuration Management:
    • Manages settings via ollama.config.json with real-time updates using FileSystemWatcher (a sketch follows this list).
    • Provides a sample configuration file to simplify setup.
  4. Improving Chat Interaction:
    • Adds robust error handling, especially when prerequisites (like the Ollama server) are not met.
    • Validates settings (e.g., Model and Endpoint) to ensure proper functionality.
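
For item 3, a minimal sketch of the hot-reload idea, assuming a watcher on the agent's config directory (`configDirectory` and `ReloadSettings` are illustrative names, not the PR's exact code):

```csharp
using System.IO;

// Sketch: watch ollama.config.json and reload settings when it changes.
var watcher = new FileSystemWatcher(configDirectory, "ollama.config.json")
{
    NotifyFilter = NotifyFilters.LastWrite,
    EnableRaisingEvents = true,
};

watcher.Changed += (sender, args) =>
{
    // Re-read and re-validate the settings file. Editors often fire
    // multiple Changed events per save, so real code should debounce.
    ReloadSettings(args.FullPath);
};
```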

This PR also aligns with prerequisites outlined in issue #155 and improves the user experience with clearer feedback and enhanced features.
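
As a rough sketch of this flow, assuming OllamaSharp 4.x as pinned in this PR (OllamaApiClient, GenerateRequest, and GenerateDoneResponseStream are the types used in the PR's code; the agent's actual rendering calls will differ):

```csharp
using System;
using System.Threading;
using OllamaSharp;
using OllamaSharp.Models;

// Sketch: stream a completion and carry the context into the next turn.
var client = new OllamaApiClient(new Uri("http://localhost:11434"));
var request = new GenerateRequest
{
    Model = "phi3",
    Prompt = "How do I list files recursively in PowerShell?",
};

GenerateDoneResponseStream lastStream = null;
await foreach (var stream in client.GenerateAsync(request, CancellationToken.None))
{
    Console.Write(stream?.Response);          // render tokens as they arrive
    if (stream is GenerateDoneResponseStream done)
    {
        lastStream = done;                    // the final chunk carries the context
    }
}

// Feed the returned context into the next request so the model
// remembers the conversation so far.
request.Context = lastStream?.Context;
```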

@kborowinski changed the title from "Implement settings file for Ollama agent" to "Implement conversation context and streaming with OllamaSharp" on Nov 21, 2024
@daxian-dbw (Member) left a comment


@kborowinski Thanks for making improvements to the ollama agent!

Comment on lines +6 to +7
<SuppressNETCoreSdkPreviewMessage>true</SuppressNETCoreSdkPreviewMessage>
<CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
Suggested change
<SuppressNETCoreSdkPreviewMessage>true</SuppressNETCoreSdkPreviewMessage>
<CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>

<PackageReference Include="OllamaSharp" Version="4.0.7" />
</ItemGroup>

<ItemGroup>
Suggested change
<ItemGroup>

if (Process.GetProcessesByName("ollama").Length is 0)
{
host.RenderFullResponse("Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
host.MarkupWarningLine($"[[{Name}]]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
Suggested change
host.MarkupWarningLine($"[[{Name}]]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
host.WriteErrorLine($"[{Name}]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");

Also, given that the prompt will be @ollama when the agent is in use, maybe we don't need to include the [{Name}] part in the error message.

<configuration>
<packageSources>
<clear />
<add key="PowerShell_PublicPackages" value="https://pkgs.dev.azure.com/powershell/PowerShell/_packaging/powershell/nuget/v3/index.json" />
<add key="NuGet" value="https://api.nuget.org/v3/index.json" />
I have saved the OllamaSharp package to the MS feed (we're required to use only that feed), so you can revert the changes in this file.

Comment on lines +308 to +325
string SampleContent = $$"""
{
// To use Ollama API service:
// 1. Install Ollama:
// winget install Ollama.Ollama
// 2. Start Ollama API server:
// ollama serve
// 3. Install Ollama model:
// ollama pull phi3

// Declare Ollama model
"Model": "phi3",
// Declare Ollama endpoint
"Endpoint": "http://localhost:11434",
// Enable Ollama streaming
"Stream": false
}
""";
Suggested change
string SampleContent = $$"""
{
// To use Ollama API service:
// 1. Install Ollama:
// winget install Ollama.Ollama
// 2. Start Ollama API server:
// ollama serve
// 3. Install Ollama model:
// ollama pull phi3
// Declare Ollama model
"Model": "phi3",
// Declare Ollama endpoint
"Endpoint": "http://localhost:11434",
// Enable Ollama streaming
"Stream": false
}
""";
string SampleContent = """
{
// To use Ollama API service:
// 1. Install Ollama: `winget install Ollama.Ollama`
// 2. Start Ollama API server: `ollama serve`
// 3. Install Ollama model: `ollama pull phi3`
// Declare Ollama model
"Model": "phi3",
// Declare Ollama endpoint
"Endpoint": "http://localhost:11434",
// Enable Ollama streaming
"Stream": false
}
""";

Comment on lines +230 to +233
host.WriteErrorLine($"[{Name}]: {e.Message}");
host.WriteErrorLine($"[{Name}]: Selected Model : \"{_settings.Model}\"");
host.WriteErrorLine($"[{Name}]: Selected Endpoint : \"{_settings.Endpoint}\"");
host.WriteErrorLine($"[{Name}]: Configuration File: \"{SettingFile}\"");
Nitpick: when using the ollama agent, the prompt will be @ollama, so maybe we don't need to add the [{Name}]: part in the error message.

Comment on lines +94 to +98
_request = new GenerateRequest()
{
Model = _settings.Model,
Stream = _settings.Stream
};
Given that we will assign Model and Stream per query, maybe we can just do the following:

Suggested change
_request = new GenerateRequest()
{
Model = _settings.Model,
Stream = _settings.Stream
};
_request = new GenerateRequest();
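
For illustration, the per-query assignment could then look like this sketch (the identifiers are those used elsewhere in the PR; the exact call site may differ):

```csharp
// Sketch: reuse one bare GenerateRequest and refresh Model/Stream from
// the current settings right before each query, so changes picked up
// by the settings file watcher take effect on the next question.
_request.Model = _settings.Model;
_request.Stream = _settings.Stream;

await foreach (var ollamaStream in _client.GenerateAsync(_request, token))
{
    // render each streamed chunk here ...
}
```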

Comment on lines +8 to +10
private string _model;
private string _endpoint;
private bool _stream;
You can consider removing those fields: make the properties read-only (e.g. public string Model { get; }) and initialize them in the constructor.
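
A minimal sketch of that shape (the constructor parameters are an assumption; the PR may instead bind these values during JSON deserialization):

```csharp
internal sealed class Settings
{
    public string Model { get; }
    public string Endpoint { get; }
    public bool Stream { get; }

    public Settings(string model, string endpoint, bool stream)
    {
        Model = model;
        Endpoint = endpoint;
        Stream = stream;
    }
}
```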

GenerateDoneResponseStream ollamaLastStream = null;

// Directly process the stream when no spinner is needed
await foreach (var ollamaStream in _client.GenerateAsync(_request, token))
In my experiments, there is a several-second delay before streaming starts, so we should still use the spinner for the _client.GenerateAsync(_request, token) call to make it more user friendly.
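
One possible shape for that is sketched below; RunWithSpinnerAsync stands in for whatever spinner helper the host actually exposes and is an assumption, not the confirmed AIShell API:

```csharp
// Sketch: show the spinner only until the first chunk arrives, then
// stream the rest directly. RunWithSpinnerAsync is a hypothetical
// stand-in for the host's spinner helper.
await using var enumerator = _client.GenerateAsync(_request, token).GetAsyncEnumerator(token);
bool hasChunk = await host.RunWithSpinnerAsync(async () => await enumerator.MoveNextAsync());

while (hasChunk)
{
    var ollamaStream = enumerator.Current;
    // render ollamaStream.Response here ...
    hasChunk = await enumerator.MoveNextAsync();
}
```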

}

// Update request context
_request.Context = ollamaLastStream.Context;
When pressing Ctrl+c while streaming is on, it's possible that the await foreach stops without throwing an exception, and then execution moves to this line, which will fail because ollamaLastStream is null.
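
A simple guard for that case could be (sketch):

```csharp
// Sketch: only carry the context forward when a final response arrived;
// after Ctrl+C the stream can end without one.
if (ollamaLastStream is not null)
{
    _request.Context = ollamaLastStream.Context;
}
```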

