diff --git a/README.md b/README.md
index 9ed2b7d5..189cc0ac 100644
--- a/README.md
+++ b/README.md
@@ -32,7 +32,7 @@ language models. picoLLM Inference Engine is:
 - [picoLLM](#picollm)
   - [Table of Contents](#table-of-contents)
-  - [Supported Models](#supported-models)
+  - [Models](#models)
   - [AccessKey](#accesskey)
   - [Demos](#demos)
     - [Python](#python-demos)
@@ -50,8 +50,16 @@ language models. picoLLM Inference Engine is:
   - [Releases](#releases)
   - [FAQ](#faq)
 
-## Supported Models
+## Models
 
+picoLLM Inference Engine supports the following open-weight models. You can download them from
+[Picovoice Console](https://console.picovoice.ai/).
+
+- Gemma
+  - `gemma-2b`
+  - `gemma-2b-it`
+  - `gemma-7b`
+  - `gemma-7b-it`
 - Llama-2
   - `llama-2-7b`
   - `llama-2-7b-chat`
@@ -61,9 +69,9 @@ language models. picoLLM Inference Engine is:
   - `llama-2-70b-chat`
 - Llama-3
   - `llama-3-8b`
-  - `llama-3-8b-chat`
+  - `llama-3-8b-instruct`
   - `llama-3-70b`
-  - `llama-3-70b-chat`
+  - `llama-3-70b-instruct`
 - Mistral
   - `mistral-7b-v0.1`
   - `mistral-7b-instruct-v0.1`
@@ -71,11 +79,6 @@ language models. picoLLM Inference Engine is:
 - Mixtral
   - `mixtral-8x7b-v0.1`
   - `mixtral-8x7b-instruct-v0.1`
-- Gemma
-  - `gemma-2b`
-  - `gemma-2b-it`
-  - `gemma-7b`
-  - `gemma-7b-it`
 - Phi-2
 
 ## AccessKey