diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md
index 6b9d969..8c717fc 100644
--- a/.github/ISSUE_TEMPLATE/bug_report.md
+++ b/.github/ISSUE_TEMPLATE/bug_report.md
@@ -20,3 +20,8 @@ If applicable, add screenshots to help explain your problem.
  - Browser [e.g. chrome, safari]
  - CatAI version [e.g. 0.3.10] (`catai --version`)
  - Node.js version [e.g 19] (`node --version`)
+ - Which model are you trying to run? (`catai active`)
+ - How many GB of RAM do you have available?
+ - What CPU do you have?
+
+Is this model compatible? (run `catai ls` for this info)
diff --git a/docs/configuration.md b/docs/configuration.md
index b6e1eb9..3a11d2a 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -14,7 +14,11 @@ You can config the model by the following steps:
 
 2. Edit configuration (JSON file):
 
    If the model binding is `node-llama-cpp` or `node-llama-cpp-v2`, You can find the configuration
-   here: [Node-Llama-CPP Model's Options](https://withcatai.github.io/node-llama-cpp/types/LlamaModelOptions.html)
+   here:
+
+   [LlamaContextOptions](https://withcatai.github.io/node-llama-cpp/api/type-aliases/LlamaContextOptions)
+
+   [LLamaChatPromptOptions](https://withcatai.github.io/node-llama-cpp/api/type-aliases/LLamaChatPromptOptions)
 
 3. Restart the server.
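
For context on the `docs/configuration.md` change above: the two linked type aliases come from `node-llama-cpp`, where `LlamaContextOptions` covers context-level settings (e.g. `contextSize`, `batchSize`) and `LLamaChatPromptOptions` covers per-prompt sampling settings (e.g. `temperature`, `topK`, `maxTokens`). A minimal sketch of what the edited JSON configuration might look like — the wrapper keys `modelSettings` and `chatSettings` are assumptions for illustration, not CatAI's documented schema:

```json
{
  "modelSettings": {
    "contextSize": 4096,
    "batchSize": 512
  },
  "chatSettings": {
    "temperature": 0.7,
    "topK": 40,
    "maxTokens": 512
  }
}
```

Field names under each key are taken from the linked type aliases; consult those pages for the full option lists and valid value ranges before editing.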