From cd0ec920e4a02cd915f6fbba3cc79660058b83d2 Mon Sep 17 00:00:00 2001
From: Himanshu Gohel <1551217+hgohel@users.noreply.github.com>
Date: Tue, 24 Dec 2024 07:45:51 -0500
Subject: [PATCH] Expand Ollama documentation (#32)

* Expand Ollama documentation

* Separate configuration options and environment variables. Remove table.
---
 docs/install_setup/chat.md          | 4 +++-
 docs/install_setup/configuration.md | 1 +
 2 files changed, 4 insertions(+), 1 deletion(-)

diff --git a/docs/install_setup/chat.md b/docs/install_setup/chat.md
index 2141581..a1886c4 100644
--- a/docs/install_setup/chat.md
+++ b/docs/install_setup/chat.md
@@ -90,7 +90,9 @@ volumes:
   ollama_data:
 ```
 
-and then set the `LLM_BASE_URL` configuration parameter to `http://ollama:11434/v1` and the `LLM_MODEL` to a model supported by Ollama.
+and then set the `LLM_BASE_URL` configuration parameter to `http://ollama:11434/v1`. Set `LLM_MODEL` to a model supported by Ollama, and pull it down in your container with `ollama pull <model>`. Finally, set `OPENAI_API_KEY` to `ollama`.
+
+To troubleshoot problems with Ollama, you can enable debug logging by setting the environment variable `OLLAMA_DEBUG=1` in the Ollama service environment.
 
 !!! info
     If you are using Ollama for Gramps Web AI chat, please support the community by completing this documentation with any missing details.
diff --git a/docs/install_setup/configuration.md b/docs/install_setup/configuration.md
index f2f7134..1ec2215 100644
--- a/docs/install_setup/configuration.md
+++ b/docs/install_setup/configuration.md
@@ -96,6 +96,7 @@ Key | Description
 `LLM_BASE_URL` | Base URL for the OpenAI-compatible chat API. Defaults to `None`, which uses the OpenAI API.
 `LLM_MODEL` | The model to use for the OpenAI-compatible chat API. If unset (the default), chat is disabled.
 `VECTOR_EMBEDDING_MODEL` | The [Sentence Transformers](https://sbert.net/) model to use for semantic search vector embeddings. If unset (the default), semantic search and chat are disabled.
+`LLM_MAX_CONTEXT_LENGTH` | Character limit for the family tree context provided to the LLM. Defaults to 50000.
 
 ## Example configuration file
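
For reference, a minimal sketch of the Docker Compose setup this change describes. Only the variable names and values taken from the patch (`LLM_BASE_URL`, `OPENAI_API_KEY=ollama`, the `LLM_MAX_CONTEXT_LENGTH` default) come from the source; the service names, images, and the example model `tinyllama` are assumptions for illustration.

```yaml
# Hypothetical compose excerpt; service names, images, and the model are assumptions.
services:
  grampsweb:
    image: ghcr.io/gramps-project/grampsweb:latest  # assumed image name
    environment:
      LLM_BASE_URL: "http://ollama:11434/v1"  # Ollama's OpenAI-compatible endpoint (from the patch)
      LLM_MODEL: "tinyllama"                  # any model supported by Ollama; this name is an example
      OPENAI_API_KEY: "ollama"                # dummy value per the patch; Ollama does not validate the key
      # LLM_MAX_CONTEXT_LENGTH: "50000"       # optional; character limit for the tree context, 50000 is the default

  ollama:
    image: ollama/ollama
    volumes:
      - ollama_data:/root/.ollama  # persist pulled models across restarts

volumes:
  ollama_data:
```

The `ollama pull <model>` step from the patch would then run inside the container, for example `docker compose exec ollama ollama pull tinyllama`.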
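
The troubleshooting tip can be wired in the same way; a minimal sketch, assuming the same `ollama` service name as above:

```yaml
services:
  ollama:
    image: ollama/ollama
    environment:
      OLLAMA_DEBUG: "1"  # enable Ollama debug logging, as suggested in the patch
```

The extra output then appears in the service logs, e.g. via `docker compose logs ollama`.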