
Implement settings file for Ollama agent #307

Closed
wants to merge 9 commits

Conversation

@kborowinski commented Nov 19, 2024

Important:

I created another PR that utilizes the OllamaSharp library, featuring context, streaming, and settings support. I strongly recommend reviewing and merging #310. However, if you are unable to use third-party libraries for any reason, please consider this PR instead.

PR Summary

This PR implements a basic settings file for the Ollama agent:

  1. The settings file is stored at $HOME\.aish\agent-config\ollama\ollama.config.json
  2. Currently, only the model and endpoint parameters are stored in the config file (a sketch of how they could be loaded follows the default file below).
  3. A default config file is created when none exists:
{
    // To use Ollama API service:
    // 1. Install Ollama:
    //      winget install Ollama.Ollama
    // 2. Start Ollama API server:
    //      ollama serve
    // 3. Install Ollama model:
    //      ollama pull phi3

    // Declare Ollama model
    "Model": "phi3",
    // Declare Ollama endpoint
    "Endpoint": "http://localhost:11434"
}
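
For illustration, here is a minimal sketch of how these two settings could be read, assuming System.Text.Json; the Settings record, the SettingsLoader class, and the hardcoded fallback values mirror the default file above, but the names are hypothetical, not the actual code in this PR:

using System;
using System.IO;
using System.Text.Json;

internal record Settings(string Model, string Endpoint);

internal static class SettingsLoader
{
    private static readonly JsonSerializerOptions s_options = new()
    {
        // The default config file ships with // comments, so they must be skipped.
        ReadCommentHandling = JsonCommentHandling.Skip,
        AllowTrailingCommas = true,
    };

    internal static Settings Load(string path)
    {
        // Fall back to the documented defaults when no config file exists yet.
        if (!File.Exists(path))
        {
            return new Settings("phi3", "http://localhost:11434");
        }

        using FileStream stream = File.OpenRead(path);
        return JsonSerializer.Deserialize<Settings>(stream, s_options);
    }
}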

PR Context

This PR allows the user to specify a custom Ollama model and endpoint instead of the hardcoded values.
It partially implements #155 (no history or streaming support).

@StevenBucher98: This is quick and dirty, but I needed the settings file ASAP to test different models without constantly recompiling the agent. I'm open to suggestions.

@kborowinski marked this pull request as draft November 19, 2024 16:11
@kborowinski marked this pull request as ready for review November 19, 2024 18:40
@kborowinski marked this pull request as draft November 20, 2024 15:44
@kborowinski marked this pull request as ready for review November 20, 2024 16:03
@kborowinski (Author) commented Nov 20, 2024

@StevenBucher98 I have just discovered the OllamaSharp NuGet package, which would simplify the agent code and provide easy streaming support. Would you consider switching to it?

This is how easy it is to set up a session and start streaming:

using System;
using System.Linq;
using OllamaSharp;

// Connect to the local Ollama API server and select the first installed model
var _ollama = new OllamaApiClient("http://localhost:11434");
var _models = await _ollama.ListLocalModelsAsync();
_ollama.SelectedModel = _models.FirstOrDefault().Name;

// Stream the response tokens as they are generated
await foreach (var stream in _ollama.GenerateAsync("How to list files with PowerShell?"))
{
    Console.Write(stream.Response);
}

Edit: I have the PR in the works, with streaming and context support.
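
For the context part, OllamaSharp's Chat helper keeps the conversation history between turns; the following is a rough sketch based on the OllamaSharp README, not the final PR code:

using System;
using OllamaSharp;

var ollama = new OllamaApiClient("http://localhost:11434");
ollama.SelectedModel = "phi3";

// Chat accumulates the message history, so the second question
// is answered with the first exchange as context.
var chat = new Chat(ollama);

await foreach (var token in chat.SendAsync("How to list files with PowerShell?"))
{
    Console.Write(token);
}

await foreach (var token in chat.SendAsync("Now sort them by size."))
{
    Console.Write(token);
}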

@daxian-dbw (Member) commented

OllamaSharp uses the MIT license, so it's totally fine to use it for this agent.

I will close this PR in favor of the other one.

@daxian-dbw closed this Nov 22, 2024