
docs: update docs
olimorris committed Jan 12, 2025
1 parent af69afb commit 914e203
Showing 3 changed files with 109 additions and 1 deletion.
3 changes: 3 additions & 0 deletions doc/.vitepress/config.mjs
@@ -42,6 +42,7 @@ export default defineConfig({
text: "Configuration",
collapsed: false,
items: [
{ text: "Introduction", link: "/configuration/introduction" },
{ text: "Adapters", link: "/configuration/adapters" },
{ text: "Chat Buffer", link: "/configuration/chat-buffer" },
{ text: "Inline Assistant", link: "/configuration/inline-assistant" },
@@ -95,5 +96,7 @@ export default defineConfig({
link: "https://github.com/olimorris/codecompanion.nvim",
},
],

search: { provider: "local" },
},
});
79 changes: 78 additions & 1 deletion doc/configuration/adapters.md
@@ -19,7 +19,7 @@ require("codecompanion").setup({
}),
```

## Extending an Adapter
## Setting an API Key

Extend a base adapter to set options like `api_key` or `model`:

@@ -53,6 +53,36 @@
}),
```

> [!NOTE]
> In this example, we're using the 1Password CLI to extract the OpenAI API key. You could also use gpg, as outlined [here](https://github.com/olimorris/codecompanion.nvim/discussions/601).
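
As a rough sketch, assuming an `env` value prefixed with `cmd:` is resolved by running a shell command (the adapter name and `op://` item path here are purely illustrative), that might look like:

```lua
require("codecompanion").setup({
  adapters = {
    openai = function()
      return require("codecompanion.adapters").extend("openai", {
        env = {
          -- Illustrative: resolve the key at runtime via the 1Password CLI
          api_key = "cmd:op read op://personal/OpenAI/credential --no-newline",
        },
      })
    end,
  },
})
```
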
## Configuring Adapter Settings

LLMs have many settings, such as `model`, `temperature` and `max_tokens`. In an adapter, these sit within a `schema` table and can be configured during setup:

```lua
require("codecompanion").setup({
adapters = {
llama3 = function()
return require("codecompanion.adapters").extend("ollama", {
name = "llama3", -- Give this adapter a different name to differentiate it from the default ollama adapter
schema = {
model = {
default = "llama3:latest",
},
num_ctx = {
default = 16384,
},
num_predict = {
default = -1,
},
},
})
end,
},
})
```
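
Having given the adapter its own name, you can point a strategy at it just like a built-in adapter. A minimal sketch, using the `strategies` table covered in the Introduction:

```lua
require("codecompanion").setup({
  strategies = {
    chat = {
      adapter = "llama3", -- the customised adapter defined above
    },
  },
})
```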

## Adding a Custom Adapter

> [!NOTE]
@@ -104,6 +134,53 @@ require("codecompanion").setup({
},
}),
```

## Example: Using OpenAI Compatible Models

To use any other OpenAI-compatible model, change the URL in the `env` table and set an API key:

```lua
require("codecompanion").setup({
adapters = {
ollama = function()
return require("codecompanion.adapters").extend("openai_compatible", {
env = {
url = "http[s]://open_compatible_ai_url", -- optional: default value is ollama url http://127.0.0.1:11434
api_key = "OpenAI_API_KEY", -- optional: if your endpoint is authenticated
chat_url = "/v1/chat/completions", -- optional: default value, override if different
},
})
end,
},
})
```

## Example: Using Ollama Remotely

To use Ollama remotely, change the URL in the `env` table, set an API key, and pass it via an "Authorization" header:

```lua
require("codecompanion").setup({
adapters = {
ollama = function()
return require("codecompanion.adapters").extend("ollama", {
env = {
url = "https://my_ollama_url",
api_key = "OLLAMA_API_KEY",
},
headers = {
["Content-Type"] = "application/json",
["Authorization"] = "Bearer ${api_key}",
},
parameters = {
sync = true,
},
})
end,
},
})
```

## Example: Azure OpenAI

Below is an example of how you can leverage the `azure_openai` adapter within the plugin:
…
28 changes: 28 additions & 0 deletions doc/configuration/introduction.md
@@ -0,0 +1,28 @@
# Introduction

This section sets out how various elements of CodeCompanion's config can be changed. The examples are shown wrapped in a `require("codecompanion").setup({})` block to work with all plugin managers.
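
For instance, changing the default chat adapter and the log level directly via `setup()` looks like this (the options mirror the Lazy.nvim example below):

```lua
require("codecompanion").setup({
  strategies = {
    -- Change the default chat adapter
    chat = {
      adapter = "anthropic",
    },
  },
  opts = {
    -- Set debug logging
    log_level = "DEBUG",
  },
})
```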

However, if you're using [Lazy.nvim](https://github.com/folke/lazy.nvim), you can apply config changes in the `opts` table, which is much cleaner:

```lua
{
  "olimorris/codecompanion.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
  },
  opts = {
    strategies = {
      -- Change the default chat adapter
      chat = {
        adapter = "anthropic",
      },
    },
    opts = {
      -- Set debug logging
      log_level = "DEBUG",
    },
  },
},
```
Of course, peruse the rest of this section for more configuration options.
