From 57b1faf7a4cb665b69ee2907afc60a343b832e84 Mon Sep 17 00:00:00 2001
From: Fortyseven
Date: Sat, 17 Aug 2024 15:01:11 -0400
Subject: [PATCH] feat: updates README

---
 README.md | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/README.md b/README.md
index 819527f..175d8d9 100644
--- a/README.md
+++ b/README.md
@@ -16,10 +16,16 @@ Chit is a light, serverless chat front-end for Ollama that doesn't rely on a ser
 
 - Compatible with KoboldCpp presets, at least on import. Saving is supported, but not guaranteed to be backwards compatible. Try it!
 
+- Temporary system prompt clipboard (works like "M" and "MC" on a calculator)
+
+- Quick system prompt presets for common tasks (summary, etc.)
+
 - Basic variable expansion for system prompts (e.g. `{{myvariable}}` -> `My Text`), and `{{date}}` expands to the current date/time of inference.
 
 - Markdown rendering if it's detected in the response (code blocks, etc.), or just uses it all the time if configured.
 
+- Code block syntax highlighting for common languages
+
 - Reroll responses, copy responses to clipboard, etc.
 
 - Implements image pasting from the clipboard for inference against multimodal models.
@@ -30,6 +36,7 @@ Chit is a light, serverless chat front-end for Ollama that doesn't rely on a ser
 
 - Everything is persisted through your browser's localStorage, or through JSON exports.
 
+
 # Use it
 There's a [hosted copy here on GitHub](https://fortyseven.github.io/chit/) with the latest build. You can use this as much as you like (it defaults to the default localhost Ollama endpoint), but feel free to build and host it yourself. 🍻
 - Just beware that since it's the latest build, it may include new bugs. But it also might include new features. _Live life on the edge._
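
The `{{variable}}` expansion mentioned in the README can be pictured with a small TypeScript sketch. This is not chit's actual code — the `expandPrompt` name and the variable map are made up for illustration — but it shows the general idea: each `{{name}}` token is looked up in a user-defined map, and `{{date}}` is filled in with the date/time at the moment of inference.

```typescript
// Minimal sketch of {{variable}} expansion — not chit's actual implementation.
function expandPrompt(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) => {
    if (name === "date") {
      // {{date}} expands to the current date/time of inference
      return new Date().toLocaleString();
    }
    // look the name up in the user's variable map; leave unknown tokens alone
    return vars[name] ?? match;
  });
}

// e.g. expandPrompt("Today is {{date}}. {{myvariable}}", { myvariable: "My Text" })
//      -> something like "Today is 8/17/2024, 3:01 PM. My Text"
```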
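Likewise, "persisted through your browser's localStorage, or through JSON exports" can be sketched as the pattern below. The `ChatState` shape, the storage key, and the function names are assumptions for illustration, not chit's real data model.

```typescript
// Hypothetical chat state — chit's actual shape will differ.
interface ChatState {
  systemPrompt: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

const STORAGE_KEY = "chat-state"; // hypothetical key name

// Persist the current chat to the browser's localStorage.
function saveState(state: ChatState): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(state));
}

// Restore it on page load, or fall back to an empty chat.
function loadState(): ChatState {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ChatState) : { systemPrompt: "", messages: [] };
}

// Offer the same state as a downloadable JSON export.
function exportState(state: ChatState): void {
  const blob = new Blob([JSON.stringify(state, null, 2)], { type: "application/json" });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = "chat-export.json";
  a.click();
  URL.revokeObjectURL(url);
}
```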
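As for "defaults to the default localhost Ollama endpoint": Ollama normally listens on `http://localhost:11434`, so a serverless front-end like this can call it straight from the browser with `fetch`. The sketch below uses Ollama's `/api/chat` endpoint with streaming disabled; the model name is only an example, and this is not necessarily how chit structures its requests.

```typescript
// Minimal sketch of a non-streaming chat request to a local Ollama server.
async function chat(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1",                               // example model name
      messages: [{ role: "user", content: prompt }],   // single-turn chat
      stream: false,                                   // one JSON reply instead of a stream
    }),
  });
  const data = await response.json();
  return data.message.content; // the assistant's reply text
}
```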