From 84667cd9bb8ddab15a608acc3fd0fb9068eee765 Mon Sep 17 00:00:00 2001 From: YiYing He Date: Wed, 25 Sep 2024 16:38:08 +0800 Subject: [PATCH] Update docs for 0.14.1 and plug-in information. Signed-off-by: YiYing He --- .env | 2 +- docs/contribute/contribute.md | 23 +- docs/contribute/source/build_from_src.md | 5 +- docs/contribute/source/plugin/wasi_logging.md | 6 + docs/contribute/source/plugin/wasi_nn.md | 232 +++++-------- .../kubernetes-containerd-runwasi.md | 20 +- docs/develop/javascript/networking.md | 2 +- docs/develop/rust/database/postgres_driver.md | 3 +- docs/develop/rust/database/qdrant_driver.md | 3 +- docs/develop/rust/database/redis_driver.md | 1 - docs/develop/rust/http_service/server.md | 4 +- docs/develop/rust/setup.md | 2 +- docs/develop/rust/wasinn/openvino.md | 2 +- docs/develop/rust/wasinn/piper.md | 22 ++ docs/develop/rust/wasinn/pytorch.md | 2 +- docs/develop/rust/wasinn/tensorflow_lite.md | 2 +- docs/develop/rust/wasinn/tf_plugin.md | 11 +- docs/develop/rust/wasinn/whisper.md | 22 ++ docs/embed/c/reference/latest.md | 2 +- docs/start/faq.md | 2 - docs/start/install.md | 325 ++++++------------ docs/start/overview.md | 2 +- docs/start/wasmedge/extensions/plugins.md | 25 +- .../current/contribute/contribute.md | 161 +++++++-- .../contribute/source/build_from_src.md | 5 +- .../current/contribute/source/os/linux.md | 2 + .../contribute/source/plugin/rusttls.md | 7 +- .../contribute/source/plugin/wasi_logging.md | 6 + .../contribute/source/plugin/wasi_nn.md | 223 +++++++++--- .../current/contribute/users.md | 8 +- .../develop/deploy/cri-runtime/containerd.md | 2 + .../develop/deploy/gpu/_category_.json | 9 + .../deploy/gpu}/docker_wasm_gpu.md | 0 .../deploy/gpu}/podman_wasm_gpu.md | 0 .../kubernetes-containerd-runwasi.md | 133 ++++++- .../develop/deploy/oci-runtime/youki.md | 28 +- .../current/develop/javascript/networking.md | 2 +- .../current/develop/python/_category_.json | 2 +- .../current/develop/rust/_category_.json | 2 +- 
.../develop/rust/database/my_sql_driver.md | 23 +- .../develop/rust/database/postgres_driver.md | 5 +- .../develop/rust/database/qdrant_driver.md | 5 +- .../develop/rust/database/redis_driver.md | 3 +- .../develop/rust/http_service/client.md | 9 +- .../develop/rust/http_service/server.md | 6 +- .../current/develop/rust/setup.md | 16 +- .../develop/rust/wasinn/llm_inference.md | 103 ++---- .../current/develop/rust/wasinn/openvino.md | 2 +- .../current/develop/rust/wasinn/piper.md | 22 ++ .../current/develop/rust/wasinn/pytorch.md | 2 +- .../develop/rust/wasinn/tensorflow_lite.md | 2 +- .../current/develop/rust/wasinn/tf_plugin.md | 11 +- .../current/develop/rust/wasinn/whisper.md | 22 ++ .../current/embed/c/reference/latest.md | 2 +- .../current/start/faq.md | 4 + .../start/wasmedge/extensions/plugins.md | 67 ++-- 56 files changed, 901 insertions(+), 713 deletions(-) create mode 100644 docs/develop/rust/wasinn/piper.md create mode 100644 docs/develop/rust/wasinn/whisper.md create mode 100644 i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/_category_.json rename i18n/zh/docusaurus-plugin-content-docs/current/{start/build-and-run => develop/deploy/gpu}/docker_wasm_gpu.md (100%) rename i18n/zh/docusaurus-plugin-content-docs/current/{start/build-and-run => develop/deploy/gpu}/podman_wasm_gpu.md (100%) create mode 100644 i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/piper.md create mode 100644 i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/whisper.md diff --git a/.env b/.env index ac3c6c81e..7a5ce39d3 100644 --- a/.env +++ b/.env @@ -1,2 +1,2 @@ -WASMEDGE_VERSION='0.14.0' +WASMEDGE_VERSION='0.14.1' WASMEDGE_GO_VERSION='0.13.4' \ No newline at end of file diff --git a/docs/contribute/contribute.md b/docs/contribute/contribute.md index 6847b1020..0770f571e 100644 --- a/docs/contribute/contribute.md +++ b/docs/contribute/contribute.md @@ -4,7 +4,6 @@ sidebar_position: 8 # Contributing Guide - * [New Contributor 
Guide](#contributing-guide) * [Ways to Contribute](#ways-to-contribute) * [Find an Issue](#find-an-issue) @@ -43,11 +42,10 @@ We welcome many different types of contributions including: Not everything happens through a GitHub pull request. Please come to our [meetings](https://docs.google.com/document/d/1iFlVl7R97Lze4RDykzElJGDjjWYDlkI8Rhf8g4dQ5Rk/edit?usp=sharing) or [contact us](https://groups.google.com/g/wasmedge) and let's discuss how we can work -together. +together. ### Come to Meetings - Absolutely everyone is welcome to come to any of our meetings. You never need an invite to join us. In fact, we want you to join us, even if you don’t have anything you feel like you want to contribute. Just being there is enough! @@ -66,7 +64,7 @@ suitable for someone who isn't a core maintainer and is good to move onto after your first pull request. Sometimes there won’t be any issues with these labels. That’s ok! There is -likely still something for you to work on. If you want to contribute but +likely still something for you to work on. If you want to contribute but don’t know where to start or can't find a suitable issue, you can leave a comment under this issue like "I'd like to work on this. Can you tell XYZ (list the stuff you want to communicate)" or send your questions to our discord server or slack channel. Once you see an issue that you'd like to work on, please post a comment saying @@ -85,12 +83,11 @@ Before opening any issue, please look up the existing [issues](https://github.co When reporting issues, always include: -- Version of your system -- Configuration files of WasmEdge +* Version of your system +* Configuration files of WasmEdge Because the issues are open to the public, when submitting the log and configuration files, be sure to remove any sensitive information, e.g. user name, password, IP address, and company name. You can replace those parts with "REDACTED" or other strings like "\*\*\*\*". 
Be sure to include the steps to reproduce the problem if applicable. It can help us understand and fix your issue faster. - ## Pull Request Lifecycle Pull requests are always welcome, even if they only contain minor fixes like typos or a few lines of code. If there will be a significant effort, please document it as an issue and get a discussion going before starting to work on it. @@ -99,13 +96,13 @@ Please submit a pull request broken down into small changes bit by bit. A pull r Generally, once your pull request has been opened, it will be assigned to one or more reviewers. Those reviewers will do a thorough code review, looking for correctness, bugs, opportunities for improvement, documentation and comments, and coding style. If your PR is not ready to review, please mark your PR as a draft. -The reviewers will give you some feedback in three work days. +The reviewers will give you feedback within three working days. -After the first review is done, the PR contributor is expected to review and make some changes based on the review in 5 workdays. +After the first review is done, the PR contributor is expected to address the review comments and make changes within five working days. If you have finished the adjustments, mark the problem as solved, then the reviewers will review your PR again in 2 workdays. -If the PR contributor doesn't respond to the PR in 30 days, the maintainer will close the PR. The original PR contributor is welcome to open it again. +If the PR contributor doesn't respond to the PR in 30 days, the maintainer will close the PR. The original PR contributor is welcome to reopen it. If the PR contributor doesn't want to maintain the PR due to some reason, please enable maintainers to edit this PR if you still want this PR to be merged. @@ -124,6 +121,7 @@ To build WasmEdge from the source, please refer to: [Build WasmEdge from source] ## Sign Your Commits ### DCO + Licensing is important to open source projects. 
It provides some assurances that the software will continue to be available based under the terms that the author(s) desired. We require that contributors sign off on commits submitted to @@ -143,11 +141,10 @@ Git has a `-s` command line option to do this automatically: git commit -s -m 'This is my commit message' If you forgot to do this and have not yet pushed your changes to the remote -repository, you can amend your commit with the sign-off by running +repository, you can amend your commit with the sign-off by running git commit --amend -s - ## Pull Request Checklist When you submit your pull request, or you push new commits to it, our automated @@ -159,10 +156,8 @@ before you submit your code: * DCO: Did you sign off your commit * Code of conduct: Did you follow the CNCF code of conduct - ## Reporting issues - ## Documenting Update the documentation if you are creating or changing features. Good documentation is as necessary as the code itself. Documents are written with Markdown. See [Writing on GitHub](https://help.github.com/categories/writing-on-github/) for more details. diff --git a/docs/contribute/source/build_from_src.md b/docs/contribute/source/build_from_src.md index 768656d5d..33f498d8d 100644 --- a/docs/contribute/source/build_from_src.md +++ b/docs/contribute/source/build_from_src.md @@ -76,7 +76,8 @@ Developers can set the CMake options to customize the WasmEdge building. - To build the WASI-NN plug-in with multiple backends, please use `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=,`. 13. `WASMEDGE_PLUGIN_WASI_CRYPTO`: build the WasmEdge WASI-Crypto plug-in (Linux and MacOS platforms only). Default is `OFF`. - This option is useless if the option `WASMEDGE_BUILD_PLUGINS` is set as `OFF`. -14. `WASMEDGE_PLUGIN_WASI_LOGGING`: build the WasmEdge WASI-Logging plug-in (Linux and MacOS platforms only). Default is `OFF`. +14. `WASMEDGE_PLUGIN_WASI_LOGGING`: build the WasmEdge WASI-Logging plug-in (Linux and MacOS platforms only). Default is `ON`. 
+ - In WasmEdge `0.14.1`, the WASI-Logging plug-in is bundled into the WasmEdge library and will not generate the plug-in shared library target. - This option is useless if the option `WASMEDGE_BUILD_PLUGINS` is set as `OFF`. 15. `WASMEDGE_PLUGIN_WASM_BPF`: build the WasmEdge wasm_bpf plugin (Linux platforms only). Default is `OFF`. - This option is useless if the option `WASMEDGE_BUILD_PLUGINS` is set as `OFF`. @@ -91,7 +92,7 @@ Developers can set the CMake options to customize the WasmEdge building. Developers can follow the steps to build WasmEdge with plug-ins from source. -- [WASI-NN (OpenVINO, PyTorch, or TensorFlow-Lite backends)](plugin/wasi_nn.md) +- [WASI-NN (with several backends)](plugin/wasi_nn.md) - [WASI-Crypto](plugin/wasi_crypto.md) - [WasmEdge-Image](plugin/image.md) - [WasmEdge-TensorFlow](plugin/tensorflow.md) diff --git a/docs/contribute/source/plugin/wasi_logging.md b/docs/contribute/source/plugin/wasi_logging.md index 70fa900c1..b12118dcc 100644 --- a/docs/contribute/source/plugin/wasi_logging.md +++ b/docs/contribute/source/plugin/wasi_logging.md @@ -6,6 +6,12 @@ sidebar_position: 1 WASI-Logging allows WebAssembly applications to log messages in a standardized way. This becomes particularly helpful when debugging applications or understanding the flow of execution within them. The WASI-Logging plug-in is designed to be straightforward to use, enabling developers to focus more on their application logic and less on logging mechanics. + +:::note +Since WasmEdge `0.14.1`, this plug-in is bundled into the WasmEdge library and no longer generates a plug-in shared library. +The plug-in building architecture will be refactored in the future. Therefore, we keep this page as documentation for `0.14.0` and earlier versions. +::: + ## Prerequisites The prerequisite of the Wasi-Logging plug-in is the same as the WasmEdge building environment on the [Linux](../os/linux.md) and [MacOS](../os/macos.md) platforms. 
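Taken together, the plug-in options above form a single configure invocation. A minimal sketch that only prints the command, since no WasmEdge source checkout is assumed here; the backend list and the `WASI-Crypto` flag are illustrative choices:

```shell
# Sketch: combine several WASI-NN backends (semicolon-separated) with the
# WASI-Crypto plug-in option. WASI-Logging is ON by default in 0.14.1, so it
# needs no extra flag. The command is printed rather than executed.
BACKENDS="GGML;Whisper;TensorFlowLite"
echo cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \
  -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="${BACKENDS}" \
  -DWASMEDGE_PLUGIN_WASI_CRYPTO=ON
```

Quoting the backend list matters: the semicolons would otherwise be interpreted by the shell as command separators.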
diff --git a/docs/contribute/source/plugin/wasi_nn.md b/docs/contribute/source/plugin/wasi_nn.md index 1f2f3f818..1901f6eda 100644 --- a/docs/contribute/source/plugin/wasi_nn.md +++ b/docs/contribute/source/plugin/wasi_nn.md @@ -2,15 +2,47 @@ sidebar_position: 2 --- -# Build with WASI-nn Plug-in +# Build with WASI-NN Plug-in The WASI-NN plug-in is a proposed WebAssembly System Interface (WASI) API for machine learning. It allows WebAssembly programs to access host-provided machine learning functions. ## Prerequisites -Currently, WasmEdge used OpenVINO™, PyTorch, TensorFlow Lite, or llama.cpp as the WASI-NN backend implementation. For using WASI-NN on WasmEdge, you need to install [OpenVINO™](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_apt.html)(2023), [TensorFlow Lite](https://www.tensorflow.org/install/lang_c), or [PyTorch 1.8.2 LTS](https://pytorch.org/get-started/locally/) for the backend. +Currently, WasmEdge supports the following backends for the WASI-NN proposal: -By default, we don't enable any WASI-NN backend in WasmEdge. Therefore developers should [build the WasmEdge from source](../os/linux.md) with the cmake option `WASMEDGE_PLUGIN_WASI_NN_BACKEND` to enable the backends. 
+| Backend | Dependency | CMake Option | +|---------|------------|--------------| +| [OpenVINO](#build-wasmedge-with-wasi-nn-openvino-backend) | [OpenVINO™ (2023)](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_apt.html) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=OpenVINO` | +| [TensorFlow-Lite](#build-wasmedge-with-wasi-nn-tensorflow-lite-backend) | [TensorFlow Lite](https://www.tensorflow.org/install/lang_c) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=TensorFlowLite` | +| [PyTorch](#build-wasmedge-with-wasi-nn-pytorch-backend) | [PyTorch 1.8.2 LTS](https://pytorch.org/get-started/locally/) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=PyTorch` | +| [GGML](#build-wasmedge-with-wasi-nn-llamacpp-backend) | [llama.cpp](https://github.com/ggerganov/llama.cpp) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=GGML` | +| [Piper](#build-wasmedge-with-wasi-nn-piper-backend) | [Piper](https://github.com/rhasspy/piper) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=Piper` | +| [Whisper](#build-wasmedge-with-wasi-nn-whisper-backend) | [whisper.cpp](https://github.com/ggerganov/whisper.cpp) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=Whisper` | +| [ChatTTS](#build-wasmedge-with-wasi-nn-chattts-backend) | [ChatTTS](https://github.com/2noise/ChatTTS) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ChatTTS` | +| [MLX](#build-wasmedge-with-wasi-nn-mlx-backend) | [MLX](https://github.com/ml-explore/mlx) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=MLX` | + +Developers can [build WasmEdge from source](../os/linux.md) with the cmake option `WASMEDGE_PLUGIN_WASI_NN_BACKEND` to enable the backends. To support multiple backends, developers can assign a semicolon-separated list, such as `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND="GGML;Whisper;TensorFlowLite"`. + +After building, you will have the WASI-NN plug-in shared library with the specified backend(s) under `/plugins/wasi_nn/libwasmedgePluginWasiNN.so` (or with the `.dylib` extension on MacOS). 
+ + +:::note +If the `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. +::: + +For the `Burn.rs` backend, please use the cmake option `WASMEDGE_PLUGIN_WASI_NN_BURNRS_MODEL` to assign the model. + +| Model for `Burn.rs` backend | CMake Option | +|-------|--------------| +| Squeezenet | `-DWASMEDGE_PLUGIN_WASI_NN_BURNRS_MODEL=Squeezenet` | +| Whisper | `-DWASMEDGE_PLUGIN_WASI_NN_BURNRS_MODEL=Whisper` | + +After building, you will have the WASI-NN plug-in shared library for the `Burn.rs` backend under `/plugins/wasi_nn_burnrs/libwasmedgePluginWasiNN.so` (or with the `.dylib` extension on MacOS). + + +:::note +The WASI-NN `Burn.rs` backend cannot be built together with other backends. +::: ## Build WasmEdge with WASI-NN OpenVINO Backend @@ -31,17 +63,8 @@ Then build and install WasmEdge from source: cd cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="OpenVINO" cmake --build build -# For the WASI-NN plug-in, you should install this project. -cmake --install build ``` - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with OpenVINO backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. 
- ## Build WasmEdge with WASI-NN PyTorch Backend For choosing and installing PyTorch on `Ubuntu 20.04` for the backend, we recommend the following commands: @@ -74,17 +97,8 @@ Then build and install WasmEdge from source: cd cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="PyTorch" cmake --build build -# For the WASI-NN plug-in, you should install this project. -cmake --install build ``` - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with PyTorch backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. - ## Build WasmEdge with WASI-NN TensorFlow-Lite Backend You can build and install WasmEdge from source directly (on `Linux x86_64`, `Linux aarch64`, `MacOS x86_64`, or `MacOS arm64` platforms): @@ -93,17 +107,8 @@ You can build and install WasmEdge from source directly (on `Linux x86_64`, `Lin cd cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="TensorflowLite" cmake --build build -# For the WASI-NN plug-in, you should install this project. -cmake --install build ``` - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with TensorFlow-lite backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. 
- Installing the necessary `libtensorflowlite_c.so` and `libtensorflowlite_flex.so` on both `Ubuntu 20.04` and `manylinux2014` for the backend, we recommend the following commands: ```bash @@ -136,7 +141,7 @@ You don't need to install any llama.cpp libraries. WasmEdge will download it dur Due to the acceleration frameworks being various, you will need to use different compilation options to build this plugin. Please make sure you are following the same OS section to do this. -### MacOS +### Build with llama.cpp Backend on MacOS #### Intel Model @@ -151,8 +156,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ -DWASMEDGE_PLUGIN_WASI_NN_GGML_LLAMA_BLAS=OFF \ . cmake --build build -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### Apple Silicon Model @@ -168,11 +171,9 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ -DWASMEDGE_PLUGIN_WASI_NN_GGML_LLAMA_BLAS=OFF \ . cmake --build build -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` -### Linux +### Build with llama.cpp Backend on Linux #### Ubuntu/Debian with CUDA 12 @@ -203,9 +204,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### Ubuntu on NVIDIA Jetson AGX Orin @@ -231,9 +229,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### Ubuntu/Debian with OpenBLAS @@ -256,9 +251,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### General Linux without any acceleration framework @@ -272,21 +264,20 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. 
-cmake --install build ``` -### Windows +### Build with llama.cpp Backend on Windows -#### Install Dependencies +#### Install Dependencies for llama.cpp And Build on Windows + +Developers can follow the steps for installing the requested dependencies. 1. (Optional, skip this deps if you don't need to use GPU) Download and install CUDA toolkit - We use CUDA Toolkit 12 for the release assets - - Link: https://developer.nvidia.com/cuda-downloads?target_os=Windows&target_arch=x86_64&target_version=11&target_type=exe_local + - Link: 2. Download and install Visual Studio 2022 Community Edition - - Link: https://visualstudio.microsoft.com/vs/community/ + - Link: - Select the following components in the installer: - msvc v143 - vs 2022 c++ x64/x86 build tools (latest) - windows 11 sdk (10.0.22621.0) @@ -294,74 +285,67 @@ cmake --install build 3. Download and install cmake - We use cmake 3.29.3 for the release assets - - Link: https://github.com/Kitware/CMake/releases/download/v3.29.3/cmake-3.29.3-windows-x86_64.msi + - Link: -5. Download and install git +4. Download and install git - We use git 2.45.1 - - Link: https://github.com/git-for-windows/git/releases/download/v2.45.1.windows.1/Git-2.45.1-64-bit.exe + - Link: -6. Download and install ninja-build +5. Download and install ninja-build - We use ninja-build 1.12.1 - - Link: https://github.com/ninja-build/ninja/releases/download/v1.12.1/ninja-win.zip + - Link: - Installation: just unzip it to a custom folder -#### Build +Then developers can build by following the steps. 1. Open Developer PowerShell for VS 2022 - Start -> Visual Studio 2022 -> Visual Studio Tools -> Developer PowerShell for VS 2022 2. Inside the PowerShell, use git to download wasmedge repo -```console -cd $HOME -git clone https://github.com/WasmEdge/WasmEdge.git -cd WasmEdge -``` + ```console + cd $HOME + git clone https://github.com/WasmEdge/WasmEdge.git + cd WasmEdge + ``` 3. 
Compile wasmedge with enabling the `wasi_nn_ggml` related options, please use the following commands. To build the plugin, you don't need to enable AOT/LLVM related features, so set them to OFF. -##### CUDA Enable + - If you want to enable CUDA: -```console -# CUDA ENABLE: -& "C:\Program files\CMake\bin\cmake.exe" -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ggml -DWASMEDGE_PLUGIN_WASI_NN_GGML_LLAMA_CUBLAS=ON -DWASMEDGE_USE_LLVM=OFF . -& "\ninja.exe" -C build -``` + ```console + # CUDA ENABLE: + & "C:\Program files\CMake\bin\cmake.exe" -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ggml -DWASMEDGE_PLUGIN_WASI_NN_GGML_LLAMA_CUBLAS=ON -DWASMEDGE_USE_LLVM=OFF . + & "\ninja.exe" -C build + ``` -##### CUDA Disable + - If you want to disable CUDA: -```console -# CUDA DISABLE: -& "C:\Program files\CMake\bin\cmake.exe" -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ggml -DWASMEDGE_USE_LLVM=OFF . -& "\ninja.exe" -C build -``` + ```console + # CUDA DISABLE: + & "C:\Program files\CMake\bin\cmake.exe" -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ggml -DWASMEDGE_USE_LLVM=OFF . + & "\ninja.exe" -C build + ``` -#### Execute the WASI-NN plugin with the llama example +#### Execute the WASI-NN plugin with the llama example on Windows 1. Set the environment variables -```console -$env:PATH += ";$pwd\build\lib\api" -$env:WASMEDGE_PLUGIN_PATH = "$pwd\build\plugins" -``` + ```console + $env:PATH += ";$pwd\build\lib\api" + $env:WASMEDGE_PLUGIN_PATH = "$pwd\build\plugins" + ``` 2. Download the wasm and run -```console -wget https://github.com/second-state/WasmEdge-WASINN-examples/raw/master/wasmedge-ggml/llama/wasmedge-ggml-llama.wasm -wget https://huggingface.co/QuantFactory/Meta-Llama-3-8B-Instruct-GGUF/blob/main/Meta-Llama-3-8B-Instruct.Q5_K_M.gguf -wasmedge --dir .:. 
--env llama3=true --env n_gpu_layers=100 --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct.Q5_K_M.gguf wasmedge-ggml-llama.wasm default -``` - -### Appendix + ```console + wget https://github.com/second-state/WasmEdge-WASINN-examples/raw/master/wasmedge-ggml/llama/wasmedge-ggml-llama.wasm + wget https://huggingface.co/QuantFactory/Meta-Llama-3-8B-Instruct-GGUF/blob/main/Meta-Llama-3-8B-Instruct.Q5_K_M.gguf + wasmedge --dir .:. --env llama3=true --env n_gpu_layers=100 --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct.Q5_K_M.gguf wasmedge-ggml-llama.wasm default + ``` - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plugin, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plugin installation path (such as `/usr/local/lib/wasmedge/` or the built plugin path `build/plugins/wasi_nn/`) to try to fix this issue. -::: +### Appendix for llama.cpp backend - -:::note We also provided the pre-built ggml plugins on the following platforms: - darwin\_x86\_64: Intel Model macOS @@ -375,57 +359,30 @@ We also provided the pre-built ggml plugins on the following platforms: - manylinux2014\_x86\_64: x86\_64 Linux (the glibc is using CentOS 7 one) - manylinux2014\_aarch64: aarch64 Linux (the glibc is using CentOS 7 one) -::: - -## Build WasmEdge with WASI-NN Neural Speed Backend - -The Neural Speed backend relies on Neural Speed, we recommend the following commands to install Neural Speed. 
- -```bash -sudo apt update -sudo apt upgrade -sudo apt install python3-dev -wget https://raw.githubusercontent.com/intel/neural-speed/main/requirements.txt -pip install -r requirements.txt -pip install neural-speed -``` +## Build WasmEdge with WASI-NN Piper Backend -Then build and install WasmEdge from source: +Build and install WasmEdge from source: ```bash cd - -cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="neuralspeed" +cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="Piper" cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with Neural Speed backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. - -## Build WasmEdge with WASI-NN Piper Backend +## Build WasmEdge with WASI-NN Whisper Backend Build and install WasmEdge from source: ```bash cd -cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="Piper" +cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="Whisper" cmake --build build -# For the WASI-NN plug-in, you should install this project. -cmake --install build ``` - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with Piper backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. - ## Build WasmEdge with WASI-NN ChatTTS Backend The ChatTTS backend relies on ChatTTS and Python library, we recommend the following commands to install dependencies. 
+ ```bash sudo apt update sudo apt upgrade @@ -440,38 +397,17 @@ cd cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="chatTTS" cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` - - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with ChatTTS backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. - - ## Build WasmEdge with WASI-NN MLX Backend You can directly build and install WasmEdge from source or custom install mlx and set `CMAKE_INSTALL_PREFIX` variable. Build and install WasmEdge from source: + ``` bash cd cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="mlx" cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` - - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with MLX backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.dylib` after installation. 
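The `WASMEDGE_PLUGIN_PATH` note that this patch consolidates can also be scripted. A minimal sketch that prefers the local build tree over the system install path; both candidate paths are taken from the note itself, and the fallback message is illustrative:

```shell
# Pick the first existing plug-in directory and export it for the wasmedge
# CLI. Falls through silently when neither path exists (e.g. before building).
for dir in "build/plugins/wasi_nn" "/usr/local/lib/wasmedge"; do
  if [ -d "${dir}" ]; then
    export WASMEDGE_PLUGIN_PATH="${dir}"
    break
  fi
done
echo "WASMEDGE_PLUGIN_PATH=${WASMEDGE_PLUGIN_PATH:-unset}"
```

Running this from the WasmEdge source root after a build makes the freshly built plug-in win over any older installed copy.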
diff --git a/docs/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md b/docs/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md index e294a2302..2c783a5c5 100644 --- a/docs/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md +++ b/docs/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md @@ -18,11 +18,9 @@ In the rest of this section, we will explain the steps in detail. Please ensure that you have completed the following steps before proceeding with this setup. - Install the latest version of [Wasmedge](../../../start/install.md) -- Ensure that you have containerd setup following the [instructions here](../../deploy/cri-runtime/containerd-crun.md). +- Ensure that you have containerd setup following the [instructions here](../../deploy/cri-runtime/containerd-crun.md). - Ensure that you have installed and [setup runwasi](../../deploy/cri-runtime/containerd.md) for containerd-shim-wasmedge - - ## Install and start Kubernetes Run the following commands from a terminal window. It sets up Kubernetes for local development. @@ -56,7 +54,6 @@ Local Kubernetes cluster is running. Press Ctrl-C to shut it down. Do NOT close your terminal window. Kubernetes is running! - ## Run and test the Kubernetes Cluster Finally, we can run WebAssembly programs in Kubernetes as containers in pods. In this section, we will start from **another terminal window** and start using the cluster. @@ -87,10 +84,11 @@ CoreDNS is running at https://localhost:6443/api/v1/namespaces/kube-system/servi To further debug and diagnose cluster problems, use 'kubectl cluster-info dump'. ``` -## Configure containerd and Kubernetes for Wasmedge Runtime +## Configure containerd and Kubernetes for Wasmedge Runtime Next we will configure containerd to add support for the containerd-shim-wasmedge. Please ensure that you have [setup runwasi](../../deploy/cri-runtime/containerd.md) to work with WasmEdge container images. 
+ ```bash # Run the following command as root user sudo bash -c "containerd config default > /etc/containerd/config.toml" @@ -118,6 +116,7 @@ sudo cluster/kubectl.sh label nodes 127.0.0.1 runtime=wasm # A successful output from the above command looks like this node/127.0.0.1 labeled ``` + ### A WebAssembly-based HTTP service [A separate article](https://github.com/second-state/wasmedge-containers-examples/blob/main/http_server_wasi_app.md) explains how to compile, package, and publish a simple WebAssembly HTTP service application as a container image to Docker hub. Run the WebAssembly-based image from Docker Hub in the Kubernetes cluster as follows. @@ -134,14 +133,3 @@ echo: name=WasmEdge ``` That's it! - - - - - - - - - - - diff --git a/docs/develop/javascript/networking.md b/docs/develop/javascript/networking.md index 7cea5f556..df1463bf2 100644 --- a/docs/develop/javascript/networking.md +++ b/docs/develop/javascript/networking.md @@ -15,7 +15,7 @@ The networking API in WasmEdge is non-blocking and hence supports asynchronous I ## Prerequisites -[Install WasmEdge](../../start/install.md). To make HTTPS requests, install the [WasmEdge TLS plug-in](../../start/install.md#tls-plug-in). +[Install WasmEdge](../../start/install.md). To make HTTPS requests, install the [WasmEdge TLS plug-in](../../start/install.md#install-wasmedge-with-plug-ins). [Install WasmEdge-QuickJS](./hello_world#prerequisites). Make sure that the `modules` directory is located in your local directory where you want to execute the `wasmedge` command. 
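The renamed anchor above points at installing WasmEdge together with plug-ins. A sketch of the corresponding installer invocation for `0.14.1`, printed rather than executed so the example stays offline; `wasi_nn-ggml` is used here only as an example plug-in identifier:

```shell
# Print the documented install.sh invocation for WasmEdge 0.14.1 with a
# WASI-NN plug-in. Copy the printed line into a terminal to actually install.
VERSION="0.14.1"
PLUGIN="wasi_nn-ggml"
echo "curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- -v ${VERSION} --plugins ${PLUGIN}"
```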
diff --git a/docs/develop/rust/database/postgres_driver.md b/docs/develop/rust/database/postgres_driver.md index 9f8068dc3..2968603db 100644 --- a/docs/develop/rust/database/postgres_driver.md +++ b/docs/develop/rust/database/postgres_driver.md @@ -31,7 +31,7 @@ wasmedge --env "DATABASE_URL=postgres://user:passwd@localhost/testdb" target/was In order to compile the `tokio-postgres` and `tokio` crates, we will need to apply patches to add WasmEdge-specific socket APIs to those crates in `Cargo.toml`. -``` +```toml [patch.crates-io] tokio = { git = "https://github.com/second-state/wasi_tokio.git", branch = "v1.36.x" } socket2 = { git = "https://github.com/second-state/socket2.git", branch = "v0.5.x" } @@ -151,4 +151,3 @@ async fn main() -> Result<(), Error> { Ok(()) } ``` - diff --git a/docs/develop/rust/database/qdrant_driver.md b/docs/develop/rust/database/qdrant_driver.md index 92e554720..f95526df2 100644 --- a/docs/develop/rust/database/qdrant_driver.md +++ b/docs/develop/rust/database/qdrant_driver.md @@ -52,7 +52,7 @@ qdrant_rest_client = "0.1.0" ## Code explanation The following program uses the `qdrant_rest_client` crate to access local Qdrant server through its RESTful API. -It first creates several points (vectors), saves those vectors to the Qdrant database, retrieves some vectors, +It first creates several points (vectors), saves those vectors to the Qdrant database, retrieves some vectors, searches for vectors, and finally deletes them from the database. 
```rust @@ -129,4 +129,3 @@ async fn main() -> Result<(), Box> { Ok(()) } ``` - diff --git a/docs/develop/rust/database/redis_driver.md b/docs/develop/rust/database/redis_driver.md index 126cf3670..b3f268a14 100644 --- a/docs/develop/rust/database/redis_driver.md +++ b/docs/develop/rust/database/redis_driver.md @@ -70,4 +70,3 @@ async fn main() -> Result<()> { Ok(()) } ``` - diff --git a/docs/develop/rust/http_service/server.md b/docs/develop/rust/http_service/server.md index ab2328e89..1f75887fc 100644 --- a/docs/develop/rust/http_service/server.md +++ b/docs/develop/rust/http_service/server.md @@ -6,7 +6,7 @@ sidebar_position: 2 For WasmEdge to become a cloud-native runtime for microservices, it needs to support HTTP servers. By its very nature, the HTTP server is always asynchronous (non-blocking -- so that it can handle concurrent requests). This chapter will cover HTTP servers using popular Rust APIs. -- [The axum API](#the-warp-api) +- [The axum API](#the-axum-api) - [The hyper API](#the-hyper-api) @@ -46,7 +46,7 @@ In your Rust application, you will apply a few patches developed by the WasmEdge POSIX sockets with WasmEdge sockets in standard libraries. With those patches, you can then use the official `tokio` and `axum` crates. -``` +```toml [patch.crates-io] tokio = { git = "https://github.com/second-state/wasi_tokio.git", branch = "v1.36.x" } socket2 = { git = "https://github.com/second-state/socket2.git", branch = "v0.5.x" } diff --git a/docs/develop/rust/setup.md b/docs/develop/rust/setup.md index 2c00df8fc..d04950517 100644 --- a/docs/develop/rust/setup.md +++ b/docs/develop/rust/setup.md @@ -53,7 +53,7 @@ rustflags = ["--cfg", "wasmedge", "--cfg", "tokio_unstable"] Once you have these lines in `.cargo/config.toml`, you can simply use the regular `cargo` command. 
-``` +```bash cargo build --target wasm32-wasi --release ``` diff --git a/docs/develop/rust/wasinn/openvino.md b/docs/develop/rust/wasinn/openvino.md index 76e24107d..844b959a5 100644 --- a/docs/develop/rust/wasinn/openvino.md +++ b/docs/develop/rust/wasinn/openvino.md @@ -1,5 +1,5 @@ --- -sidebar_position: 4 +sidebar_position: 3 --- # OpenVINO Backend diff --git a/docs/develop/rust/wasinn/piper.md b/docs/develop/rust/wasinn/piper.md new file mode 100644 index 000000000..5dad97e03 --- /dev/null +++ b/docs/develop/rust/wasinn/piper.md @@ -0,0 +1,22 @@ +--- +sidebar_position: 6 +--- + +# Piper Backend + +We will use [this example project](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/wasmedge-piper) to show how to make AI inference with a Piper model in WasmEdge and Rust. + +## Prerequisite + +Besides the [regular WasmEdge and Rust requirements](../../rust/setup.md), please make sure that you have the [WASI-NN plugin with Piper installed](../../../start/install.md#install-wasmedge-with-plug-ins). + +## Quick start + +Because the example already includes a compiled WASM file from the Rust code, we can use the WasmEdge CLI to execute the example directly. First, git clone the `WasmEdge-WASINN-examples` repo. + +```bash +git clone https://github.com/second-state/WasmEdge-WASINN-examples.git +cd WasmEdge-WASINN-examples/wasmedge-piper/ +``` + +Please follow the `README.md` to run the example.
diff --git a/docs/develop/rust/wasinn/pytorch.md b/docs/develop/rust/wasinn/pytorch.md index 8a22d704c..c924d636c 100644 --- a/docs/develop/rust/wasinn/pytorch.md +++ b/docs/develop/rust/wasinn/pytorch.md @@ -1,5 +1,5 @@ --- -sidebar_position: 2 +sidebar_position: 5 --- # PyTorch Backend diff --git a/docs/develop/rust/wasinn/tensorflow_lite.md b/docs/develop/rust/wasinn/tensorflow_lite.md index 28b217b94..5e061338b 100644 --- a/docs/develop/rust/wasinn/tensorflow_lite.md +++ b/docs/develop/rust/wasinn/tensorflow_lite.md @@ -1,5 +1,5 @@ --- -sidebar_position: 3 +sidebar_position: 4 --- # TensorFlow Lite Backend diff --git a/docs/develop/rust/wasinn/tf_plugin.md b/docs/develop/rust/wasinn/tf_plugin.md index 09b48ea69..de7144a52 100644 --- a/docs/develop/rust/wasinn/tf_plugin.md +++ b/docs/develop/rust/wasinn/tf_plugin.md @@ -1,8 +1,8 @@ --- -sidebar_position: 5 +sidebar_position: 8 --- -# TensorFlow Plug-in For WasmEdge +# TensorFlow And TensorFlow-Lite Plug-in For WasmEdge Developers can use [WASI-NN](https://github.com/WebAssembly/wasi-nn) to inference the models. However, for the TensorFlow and TensorFlow-Lite users, the WASI-NN APIs could be more friendly to retrieve the input and output tensors. Therefore WasmEdge provides the TensorFlow-related plug-in and rust SDK for inferencing models in WASM. @@ -138,10 +138,3 @@ Please refer to [WasmEdge CLI](../../../start/build-and-run/cli.md) for WASM exe :::info Work in Progress ::: - -## Old WasmEdge TensorFlow extension - - -:::info -Work in Progress -::: diff --git a/docs/develop/rust/wasinn/whisper.md b/docs/develop/rust/wasinn/whisper.md new file mode 100644 index 000000000..aae57e06c --- /dev/null +++ b/docs/develop/rust/wasinn/whisper.md @@ -0,0 +1,22 @@ +--- +sidebar_position: 7 +--- + +# Whisper Backend + +We will use [this example project](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/whisper-basic) to show how to make AI inference with a Whisper model in WasmEdge and Rust. 
+ +## Prerequisite + +Besides the [regular WasmEdge and Rust requirements](../../rust/setup.md), please make sure that you have the [WASI-NN plugin with Whisper installed](../../../start/install.md#install-wasmedge-with-plug-ins). + +## Quick start + +Because the example already includes a compiled WASM file from the Rust code, we can use the WasmEdge CLI to execute the example directly. First, git clone the `WasmEdge-WASINN-examples` repo. + +```bash +git clone https://github.com/second-state/WasmEdge-WASINN-examples.git +cd WasmEdge-WASINN-examples/whisper-basic/ +``` + +Please follow the `README.md` to run the example. diff --git a/docs/embed/c/reference/latest.md b/docs/embed/c/reference/latest.md index 8abc3b951..cc7b45da1 100644 --- a/docs/embed/c/reference/latest.md +++ b/docs/embed/c/reference/latest.md @@ -2,7 +2,7 @@ sidebar_position: 1 --- -# C API 0.14.0 Documentation +# C API 0.14.1 Documentation [WasmEdge C API](https://github.com/WasmEdge/WasmEdge/blob/master/include/api/wasmedge/wasmedge.h) denotes an interface to access the WasmEdge runtime at version `{{ wasmedge_version }}`. The following are the guides to working with the C APIs of WasmEdge. diff --git a/docs/start/faq.md b/docs/start/faq.md index 9532496ec..bbd134129 100644 --- a/docs/start/faq.md +++ b/docs/start/faq.md @@ -39,5 +39,3 @@ WasmEdge provides the WASI (WebAssembly System Interface) API for interacting wi The relationship between WasmEdge and Second State is rooted in the latter contributing their WasmEdge Runtime project to the Cloud Native Computing Foundation (CNCF). Subsequently, Second State became one of the maintainers for WasmEdge. As WasmEdge seeks to broaden its community, it continues to search for additional maintainers. Please remember, this FAQ page is not exhaustive, and the WasmEdge community is always ready to help with any questions or issues you may have. Don't hesitate to reach out if you need assistance in our [Discord server](https://discord.gg/h4KDyB8XTt).
- - diff --git a/docs/start/install.md b/docs/start/install.md index 567e02608..ebf1cf7c1 100644 --- a/docs/start/install.md +++ b/docs/start/install.md @@ -48,6 +48,14 @@ curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/insta Suppose you are interested in the latest builds from the `HEAD` of the `master` branch, which is basically WasmEdge's nightly builds. In that case, you can download the release package directly from our Github Action's CI artifact. [Here is an example](https://github.com/WasmEdge/WasmEdge/actions/runs/2969775464#artifacts). +#### Install via Nix + +For Nix/NixOS users, we also provide a `flake.nix` in the repository, so you can install WasmEdge via: + +```bash +nix profile install github:WasmEdge/WasmEdge +``` + +#### Install WasmEdge with plug-ins WasmEdge plug-ins are pre-built native modules that provide additional functionalities to the WasmEdge Runtime. To install plug-ins with the runtime, you can pass the `--plugins` parameter in the installer. For example, the command below installs the `wasi_nn-ggml` plug-in to enable LLM (Large Language Model) inference. @@ -66,16 +74,40 @@ The installer downloads the plug-in files from the WasmEdge release on GitHub, u :::note -AI plug-ins for WasmEdge, such as the `OpenVINO backend` or `PyTorch backend` for `WASI-NN` plug-ins, have additional dependencies on the `OpenVINO` or `PyTorch` runtime libraries. [See the next section](#install-wasmedge-plug-ins-and-dependencies) for commands to install the plug-in dependencies. -::: +The `WASI-NN` related plug-ins are mutually EXCLUSIVE. Users can install only one of the WASI-NN backends. -#### Install via Nix +Since WasmEdge `0.14.1`, the WASI-Logging plug-in has been bundled into the WasmEdge shared library and no longer needs to be installed separately. -For nix/nixos users, we also provide a `flake.nix` in repository, so you can install WasmEdge via: +Some of the plug-ins need dependencies.
Please follow the guide in the comment column to install the dependencies. +::: -```bash -nix profile install github:WasmEdge/WasmEdge -``` +The following table lists the officially released WasmEdge plug-ins. Users can install them easily via the `--plugins` option of the installer. + +| Plug-in | Parameter | Supported Platforms | Versions | Comment | +|---------|-----------|---------------------|----------|---------| +| WASI-Logging | `wasi_logging` | All | Since `0.13.0` | Bundled into WasmEdge library since `0.14.1`. | +| WASI-Crypto | `wasi_crypto` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.10.1` | | +| WASI-NN OpenVINO backend | `wasi_nn-openvino` | Linux (`x86_64`, Ubuntu only) | Since `0.10.1` | Users should install the [OpenVINO dependency](#openvino-dependencies). | +| WASI-NN PyTorch backend | `wasi_nn-pytorch` | Linux (`x86_64`) | Since `0.11.1` | Users should install the [PyTorch dependency](#pytorch-dependencies). | +| WASI-NN TensorFlow-Lite backend | `wasi_nn-tensorflowlite` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.11.2` | [Dependency](#tensorflow-lite-dependencies) installed automatically by installer. | +| WASI-NN GGML backend | `wasi_nn-ggml` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.13.4` | See the [notes for the dependency](#ggml-dependencies). | +| WASI-NN Piper backend | `wasi_nn-piper` | Linux (`x86_64`, `aarch64`) | Since `0.14.1` | Users should install the [Piper dependency](#piper-dependencies).
| +| WASI-NN Whisper backend | `wasi_nn-whisper` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.14.1` | | +| WASI-NN Burn.rs backend (Squeezenet) | `wasi_nn_burnrs-squeezenet` | Linux (`x86_64`, Ubuntu only) | Since `0.14.1` | | +| WASI-NN Burn.rs backend (Whisper) | `wasi_nn_burnrs-whisper` | Linux (`x86_64`, Ubuntu only) | Since `0.14.1` | | +| Ffmpeg | `wasmedge_ffmpeg` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.14.0` | | +| Image | `wasmedge_image` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.13.0` | | +| LLM | `wasmedge_llmc` | Linux (`x86_64`, `aarch64`) | Since `0.14.1` | | +| OpenCV mini | `wasmedge_opencvmini` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.13.3` | | +| Process | `wasmedge_process` | Linux (`x86_64`, `aarch64`) | Since `0.10.0` | | +| Stable Diffusion | `wasmedge_stablediffusion` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.14.1` | | +| TensorFlow | `wasmedge_tensorflow` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.13.0` | [Dependency](#tensorflow-dependencies) installed automatically by installer. | +| TensorFlow-Lite | `wasmedge_tensorflowlite` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.13.0` | [Dependency](#tensorflow-lite-dependencies) installed automatically by installer. | +| Zlib | `wasmedge_zlib` | Linux (`x86_64`, `aarch64`), MacOS (`x86_64`, `arm64`) | Since `0.13.5` | | +| WASM-eBPF | `wasm_bpf` | Linux (`x86_64`) | Since `0.13.2` | | +| Rust TLS | `wasmedge_rustls` | Linux (`x86_64`) | Since `0.13.0` | Until `0.13.5`. DEPRECATED. | + +For further details of each plug-in, please see the [plug-in page](wasmedge/extensions/plugins.md). ### Windows @@ -127,221 +159,6 @@ You could also change it to `/usr/local` if you did a system-wide install. If you used `winget` to install WasmEdge, the files are located at `C:\Program Files\WasmEdge`.
::: -## Install WasmEdge plug-ins and dependencies - -WasmEdge uses plug-ins to extend its functionality. If you want to use more of WasmEdge's features, you can install WasmEdge along with its plug-ins and extensions as described below: - -### The logging plug-in - -The `wasi_logging` plug-in supports the [log::Log](https://crates.io/crates/log) Rust API. -It allows [log::Log](https://crates.io/crates/log) in Rust code to be compiled to Wasm and to run in WasmEdge. - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasi_logging -``` - -[See more examples](https://github.com/WasmEdge/WasmEdge/tree/master/examples/plugin/wasi-logging) - -### WASI-NN plug-ins - -WasmEdge supports various backends for `WASI-NN`, which provides a standardized API for WasmEdge applications to access AI models for inference. Each backend supports a specific type of AI models. - -- [ggml backend](#wasi-nn-plug-in-with-ggml-backend): supported on `Ubuntu 20.04+` and macOS. -- [PyTorch backend](#wasi-nn-plug-in-with-pytorch-backend): supported on `Ubuntu 20.04+` and `manylinux2014_x86_64`. -- [OpenVINO™ backend](#wasi-nn-plug-in-with-openvino-backend): supported on `Ubuntu 20.04+`. -- [TensorFlow-Lite backend](#wasi-nn-plug-in-with-tensorflow-lite-backend): supported on `Ubuntu 20.04+`, `manylinux2014_x86_64`, and `manylinux2014_aarch64`. - -Noticed that the backends are exclusive. Developers can only choose and install one backend for the `WASI-NN` plug-in. - -#### WASI-NN plug-in with ggml backend - -The WASI-NN plug-in with ggml backend allows WasmEdge to run llama2 inference. To install WasmEdge with WASI-NN ggml backend, please pass the `wasi_nn-ggml` option to the `--plugins` flag when running the installer command. 
- -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasi_nn-ggml -``` - -Please note, the installer from WasmEdge 0.13.5 will detect CUDA automatically. If CUDA is detected, the installer will always attempt to install a CUDA-enabled version of the plug-in. - -If CPU is the only available hardware on your machine, the installer will install OpenBLAS version of plugin instead. - -```bash -apt update && apt install -y libopenblas-dev # You may need sudo if the user is not root. -``` - -Then, go to the [Llama2 inference in Rust chapter](../develop/rust/wasinn/llm_inference) to see how to run AI inference with llama2 series of models. - -#### WASI-NN plug-in with PyTorch backend - -The WASI-NN plug-in with PyTorch backend allows WasmEdge applications to perform PyTorch model inference. To install WasmEdge with WASI-NN PyTorch backend, please pass the `wasi_nn-pytorch` option to the `--plugins` flag when running the installer command. - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasi_nn-pytorch -``` - -The WASI-NN plug-in with PyTorch backend depends on the `libtorch` C++ library to perform AI/ML computations. You need to install the [PyTorch 1.8.2 LTS](https://pytorch.org/get-started/locally/) dependencies for it to work properly. - -```bash -export PYTORCH_VERSION="1.8.2" -# For the Ubuntu 20.04 or above, use the libtorch with cxx11 abi. 
-export PYTORCH_ABI="libtorch-cxx11-abi" -# For the manylinux2014, please use the without cxx11 abi version: -# export PYTORCH_ABI="libtorch" -curl -s -L -O --remote-name-all https://download.pytorch.org/libtorch/lts/1.8/cpu/${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip -unzip -q "${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip" -rm -f "${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip" -export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:$(pwd)/libtorch/lib -``` - - -:::note -For the `Ubuntu 20.04` or above versions, the WasmEdge installer will install the `Ubuntu` version of WasmEdge and its plug-ins. -For other systems, the WasmEdge installer will install the `manylinux2014` version, and you should get the `libtorch` without `cxx11-abi`. -::: - -Then, go to the [WASI-NN PyTorch backend in Rust chapter](../develop/rust/wasinn/pytorch) to see how to run AI inference with `Pytorch`. - -#### WASI-NN plug-in with OpenVINO backend - -The WASI-NN plug-in with the OpenVINO backend allows WasmEdge applications to perform OpenVINO model inference. To install WasmEdge with WASI-NN OpenVINO backend, please pass the `wasi_nn-openvino` option to the `--plugins` flag when running the installer command. - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasi_nn-openvino -``` - -The WASI-NN plug-in with OpenVINO backend depends on the OpenVINO C library to perform AI/ML computations. [OpenVINO](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_apt.html)(2023) dependencies. The following instructions are for Ubuntu 20.04 and above. 
- -```bash -wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB -sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB -echo "deb https://apt.repos.intel.com/openvino/2023 ubuntu20 main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2023.list -sudo apt update -sudo apt-get -y install openvino -ldconfig -``` - -Then, go to the [WASI-NN OpenVINO backend in Rust](../develop/rust/wasinn/openvino) chapter to see how to run AI inference with `OpenVINO. - -#### WASI-NN plug-in with TensorFlow-Lite backend - -The WASI-NN plug-in with Tensorflow-Lite backend allows WasmEdge applications to perform Tensorflow-Lite model inference. To install WasmEdge with WASI-NN Tensorflow-Lite backend, please pass the `wasi_nn-tensorflowlite` option to the `--plugins` flag when running the installer command. - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasi_nn-tensorflowlite -``` - -The WASI-NN plug-in with Tensorflow-Lite backend depends on the `libtensorflowlite_c` shared library to perform AI/ML computations, and it will be installed by the installer automatically. - - -:::note -If you install this plug-in WITHOUT installer, you can [refer to here to install the dependency](#tensorflow-lite-dependencies). -:::note - -Then, go to [WASI-NN TensorFlow-lite backend in Rust chapter](../develop/rust/wasinn/tensorflow_lite) to see how to run AI inference with TensorFlow-Lite. - -### WASI-Crypto Plug-in - -[WASI-crypto](https://github.com/WebAssembly/wasi-crypto) is Cryptography API proposals for WASI. To use WASI-Crypto proposal, please use the `--plugins wasi_crypto` parameter when [running the installer command](#generic-linux-and-macos). 
- -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasi_crypto -``` - -Then, go to [WASI-Crypto in Rust chapter](../develop/rust/wasicrypto.md) to see how to run WASI-crypto functions. - -### WasmEdge OpenCV mini Plug-in - -The WasmEdge OpenCV Mini plug-in supports a subset of OpenCV APIs in a [Rust API](https://github.com/second-state/opencvmini). -It is essential for developing image processing / vision AI applications in WasmEdge. - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasmedge_opencvmini -``` - -[See an example](https://github.com/second-state/opencvmini-example) - -### WasmEdge zlib Plug-in - -The zlib is required for compiling and running many existing C / C++ / Rust apps in Wasm. Most noticeably, it is required for the Python port to Wasm. It supports the standard [zlib.h](https://github.com/madler/zlib/blob/develop/zlib.h) C API. - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasmedge_zlib -``` - -[See an example](https://github.com/WasmEdge/WasmEdge/tree/master/examples/plugin/wasmedge-zlib). - -### WasmEdge Image Plug-in - -The wasmEdge-Image plug-in can help developers to load and decode JPEG and PNG images and convert into tensors. To install this plug-in, please use the `--plugins wasmedge_image` parameter when [running the installer command](#generic-linux-and-macos). - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasmedge_image -``` - -Then, go to [TensorFlow interface (image part) in Rust chapter](../develop/rust/wasinn/tf_plugin.md#image-loading-and-conversion) to see how to run WasmEdge-Image functions. 
- -### WasmEdge TensorFlow Plug-in - -The WasmEdge-TensorFlow plug-in can help developers to perform TensorFlow model inference as the similar API in python. To install this plug-in, please use the `--plugins wasmedge_tensorflow` parameter when [running the installer command](#generic-linux-and-macos). - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasmedge_tensorflow -``` - -The WasmEdge-Tensorflow plug-in depends on the `libtensorflow_cc` shared library. - - -:::note -If you install this plug-in WITHOUT installer, you can [refer to here to install the dependency](#tensorflow-dependencies). -:::note - -Then, go to [TensorFlow interface in Rust chapter](../develop/rust/wasinn/tf_plugin.md) to see how to run `WasmEdge-TensorFlow` functions. - -### TLS plug-in - - -:::note -The WasmEdge TLS plugin is being deprecated from WasmEdge 0.14.0. We now compile TLS functions directly into Wasm for better portability. -:::note - -The WasmEdge TLS plug-in utilizes the native OpenSSL library to support HTTPS and TLS requests from WasmEdge sockets. To install WasmEdge with the TLS plug-in, run the following command. - -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- -v 0.13.5 --plugins wasmedge_rustls -``` - -The HTTPS and TLS demos from 0.13.5 require the TLS plug-in. - -### WasmEdge TensorFlow-Lite Plug-in - - -:::note -The Tensorflow Lite plugin is being deprecated. Please use the [WASI NN TensorflowLite plugin](#wasi-nn-plug-in-with-tensorflow-lite-backend) instead. -:::note - -The wasmEdge-TensorFlowLite plug-in can help developers to perform TensorFlow-Lite model inference. To install this plug-in, please use the `--plugins wasmedge_tensorflowlite` parameter when [running the installer command](#generic-linux-and-macos). 
- -```bash -curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash -s -- --plugins wasmedge_tensorflowlite -``` - -## Install WasmEdge extensions and dependencies - - -:::note -The WasmEdge extensions are deprecated and replaced by the plug-ins since `0.13.0`. The latest version supporting the extensions is `0.12.1`. This chapter will be removed when the `0.12.x` versions are no longer supported by the WasmEdge installer. -:::note - -To install the WasmEdge extensions, please use the `-e` option and assign the WasmEdge version before `0.13.0`. You can also use the `-e all` to install the supported extensions. - -### WasmEdge Image extension - -WasmEdge Image extension (replaced by the [WasmEdge-Image plug-in](#wasmedge-image-plug-in) after `0.13.0`) can help developers to load and decode JPEG and PNG images and convert them into tensors. To install this extension, please use the `-e image` parameter when [running the installer command](#generic-linux-and-macos). - -### WasmEdge Tensorflow and TensorFlow-Lite extension with CLI tool - -WasmEdge Tensorflow extension and the CLI tool (replaced by the [WasmEdge-Tensorflow plug-in](#wasmedge-tensorflow-plug-in) and the [WasmEdge-TensorflowLite plug-in](#wasmedge-tensorflow-lite-plug-in) after `0.13.0`) can help developers to perform `TensorFlow` and `TensorFlow-Lite` model inference as the similar API in python. To install this extension, please use the `-e tensorflow` parameter when [running the installer command](#generic-linux-and-macos). - ## Uninstall To uninstall WasmEdge, you can run the following command: @@ -379,11 +196,21 @@ If you used `winget` to install WasmEdge on Windows, run the following command t winget uninstall wasmedge ``` -## Appendix: Installing the TensorFlow Dependencies +## Appendix: Installing the Dependencies + +### GGML Dependencies + +The installer from WasmEdge 0.13.5 will detect CUDA automatically. 
If CUDA is detected, the installer will always attempt to install a CUDA-enabled version of the WASI-NN GGML plug-in. + +If CPU is the only available hardware on your machine, the installer will install the OpenBLAS version of the plug-in instead. + +```bash +apt update && apt install -y libopenblas-dev # You may need sudo if the user is not root. +``` ### TensorFlow-Lite Dependencies -If you install the WASI NN TensorflowLite plug-in WITHOUT installer, you can download the shared libraries with the following commands: +If you install the WASI-NN TensorflowLite or `WasmEdge-TensorFlowLite` plug-in WITHOUT the installer, you can download the shared libraries with the following commands: ```bash VERSION=TF-2.12.0-CC @@ -457,6 +284,56 @@ ln -s libtensorflow_framework.2.12.0.dylib ~/.wasmedge/lib/libtensorflow_framewo ln -s libtensorflow_framework.2.dylib ~/.wasmedge/lib/libtensorflow_framework.dylib ``` +### OpenVINO Dependencies + +The WASI-NN plug-in with OpenVINO backend depends on the OpenVINO C library to perform AI/ML computations. The following commands are for Ubuntu 20.04 and above to install [OpenVINO](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_apt.html) (2023) dependencies. + +```bash +wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB +sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB +echo "deb https://apt.repos.intel.com/openvino/2023 ubuntu20 main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2023.list +sudo apt update +sudo apt-get -y install openvino +ldconfig +``` + +### PyTorch Dependencies + +The WASI-NN plug-in with PyTorch backend depends on the `libtorch` C++ library to perform AI/ML computations. You need to install the [PyTorch 1.8.2 LTS](https://pytorch.org/get-started/locally/) dependencies for it to work properly. + +```bash +export PYTORCH_VERSION="1.8.2" +# For the Ubuntu 20.04 or above, use the libtorch with cxx11 abi.
+export PYTORCH_ABI="libtorch-cxx11-abi" +# For the manylinux2014, please use the without cxx11 abi version: +# export PYTORCH_ABI="libtorch" +curl -s -L -O --remote-name-all https://download.pytorch.org/libtorch/lts/1.8/cpu/${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip +unzip -q "${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip" +rm -f "${PYTORCH_ABI}-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip" +export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:$(pwd)/libtorch/lib +``` + + +:::note +For the `Ubuntu 20.04` or above versions, the WasmEdge installer will install the `Ubuntu` version of WasmEdge and its plug-ins. +For other systems, the WasmEdge installer will install the `manylinux2014` version, and you should get the `libtorch` without `cxx11-abi`. +::: + +### Piper Dependencies + +The WASI-NN plug-in with Piper backend depends on the ONNX Runtime C++ API. For installation instructions, please refer to the installation table on the [official website](https://onnxruntime.ai/getting-started). + +Example of installing ONNX Runtime 1.14.1 on Ubuntu: + +```bash +curl -LO https://github.com/microsoft/onnxruntime/releases/download/v1.14.1/onnxruntime-linux-x64-1.14.1.tgz +tar zxf onnxruntime-linux-x64-1.14.1.tgz +mv onnxruntime-linux-x64-1.14.1/include/* /usr/local/include/ +mv onnxruntime-linux-x64-1.14.1/lib/* /usr/local/lib/ +rm -rf onnxruntime-linux-x64-1.14.1.tgz onnxruntime-linux-x64-1.14.1 +ldconfig +``` + ## Troubleshooting Some users, especially in China, reported encountering the Connection refused error when trying to download the `install.sh` from the `githubusercontent.com`. 
diff --git a/docs/start/overview.md b/docs/start/overview.md index 9b5792314..09bafe263 100644 --- a/docs/start/overview.md +++ b/docs/start/overview.md @@ -18,7 +18,7 @@ We will cover the following content: - Introduce the [WasmEdge Runtime](/category/what-is-wasmedge) - Usages of [Running WasmEdge](/category/running-with-wasmedge) - [Frequently Asked Question](faq.md) -- [Style Guide](style_guide.md) for new contributors +- [Style Guide](style_guide.md) for new contributors - [Troubleshooting Guide](troubleshooting_guide.md) For advanced programming with WasmEdge, please refer to the guides for [developing WASM apps](../develop/overview.md), [Embedding WasmEdge in your apps](../embed/overview.md), or [contributing](../contribute/overview.md) to WasmEdge. diff --git a/docs/start/wasmedge/extensions/plugins.md b/docs/start/wasmedge/extensions/plugins.md index c93a9b304..d6fcece47 100644 --- a/docs/start/wasmedge/extensions/plugins.md +++ b/docs/start/wasmedge/extensions/plugins.md @@ -14,27 +14,24 @@ The following lists are the WasmEdge official released plug-ins. Users can insta | Plug-in | Description | Platform Support | Guest Language Support | Build From Source | |---------|-------------|------------------|------------------------|-------------------| -| [WasmEdge-Process](../../../contribute/source/plugin/process.md) | Allows WebAssembly programs to execute native commands in the host operating system. It supports passing arguments, environment variables, `STDIN`/`STDOUT` pipes, and security policies for host access. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
(since `0.10.0`) | [Rust](https://crates.io/crates/wasmedge_process_interface) | [Steps](../../../contribute/source/plugin/process.md) | +| [WASI-Logging](https://github.com/WebAssembly/wasi-logging) | Logging API for WebAssembly programs to log messages. | `manylinux2014 (x86_64, aarch64)`<br/>
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.0`) | Rust | [Steps](../../../contribute/source/plugin/wasi_logging.md) | | [WASI-Crypto](https://github.com/WebAssembly/wasi-crypto) | APIs that a runtime can expose to WebAssembly modules in order to perform cryptographic operations and key management. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
(since `0.10.1`) | [Rust](https://crates.io/crates/wasi-crypto) | [Steps](../../../contribute/source/plugin/wasi_crypto.md) | | [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(OpenVINO backend)](../../../develop/rust/wasinn/openvino.md) | AI inference using OpenVINO models. | `ubuntu 20.04 (x86_64)`
(since `0.10.1`) | [Rust](https://crates.io/crates/wasi-nn), JavaScript | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-openvino-backend) | | [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(Pytorch backend)](../../../develop/rust/wasinn/pytorch.md) | AI inference using Pytorch models. | `manylinux2014 (x86_64)`
`ubuntu 20.04 (x86_64)`
(since `0.11.1`) | [Rust](https://crates.io/crates/wasi-nn), JavaScript | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-pytorch-backend) | | [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(TensorFlow-Lite backend)](../../../develop/rust/wasinn/tensorflow_lite.md) | AI inference using TensorFlow-Lite models. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
(since `0.11.2`) | [Rust](https://crates.io/crates/wasi-nn), JavaScript | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-tensorflow-lite-backend) | | [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(Ggml backend)](../../../develop/rust/wasinn/llm_inference.md) | AI inference using LLM interfaces. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.4`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-llamacpp-backend) | -| [WASI-Logging](https://github.com/WebAssembly/wasi-logging) | Logging API for WebAssembly program to log messages. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.0`) | Rust | [Steps](../../../contribute/source/plugin/wasi_logging.md) | +| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(Piper backend)](../../../develop/rust/wasinn/piper.md) | AI inference using Piper models. | `manylinux_2_28 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-piper-backend) | +| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(Whisper backend)](../../../develop/rust/wasinn/whisper.md) | AI inference using Whisper models. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-whisper-backend) | +| [WASI-NN](https://github.com/WebAssembly/wasi-nn) Burn.rs backend (Squeezenet) | AI inference using Squeezenet models in Burn.rs. | `ubuntu 20.04 (x86_64)`
(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | | +| [WASI-NN](https://github.com/WebAssembly/wasi-nn) Burn.rs backend (Whisper) | AI inference using Whisper models in Burn.rs. | `ubuntu 20.04 (x86_64)`
(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | |
+| WasmEdge-ffmpeg | FFmpeg bindings for WebAssembly programs to process audio and video. | `manylinux2014 (x86_64, aarch64)`<br/>
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.14.0`) | | | | [WasmEdge-Image](../../../contribute/source/plugin/image.md) | A native library to manipulate images for AI inference tasks. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.3.0) | [Steps](../../../contribute/source/plugin/image.md) |
+| WasmEdge-LLMC | LLM training functions based on [llm.c](https://github.com/karpathy/llm.c). | `manylinux2014 (x86_64, aarch64)`<br/>
`ubuntu 20.04 (x86_64)`
(since `0.14.1`) | | | +| WasmEdge-OpenCV | Very popular utility functions to process images and videos for AI input/output. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.3`) | | | +| [WasmEdge-Process](../../../contribute/source/plugin/process.md) | Allows WebAssembly programs to execute native commands in the host operating system. It supports passing arguments, environment variables, `STDIN`/`STDOUT` pipes, and security policies for host access. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
(since `0.10.0`) | [Rust](https://crates.io/crates/wasmedge_process_interface) | [Steps](../../../contribute/source/plugin/process.md) |
+| WasmEdge-StableDiffusion | Image generation using Stable Diffusion models. | `manylinux2014 (x86_64, aarch64)`<br/>
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.14.1`) | | | | [WasmEdge-Tensorflow](../../../contribute/source/plugin/tensorflow.md) | A native library for inferring TensorFlow models.| `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.3.0) | [Steps](../../../contribute/source/plugin/tensorflow.md) | | [WasmEdge-TensorflowLite](../../../contribute/source/plugin/tensorflowlite.md)| A native library for inferring TensorFlow-Lite models. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.3.0) | [Steps](../../../contribute/source/plugin/tensorflowlite.md) | -| WasmEdge-OpenCV | Very popular utility functions to process images and videos for AI input/output. | `manylinux2014 (x86_64, aarch64)`
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.3`) | Rust | |
+| WasmEdge-zlib | A compatibility layer exposing the `zlib` compression API to WebAssembly programs. | `manylinux2014 (x86_64, aarch64)`<br/>
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.5`) | | |
| [WasmEdge-eBPF](../../../contribute/source/plugin/ebpf.md) | A native library for running eBPF applications | `manylinux2014 (x86_64, aarch64)`<br/>
`ubuntu 20.04 (x86_64)`
(since `0.13.2`) | Rust | [Steps](../../../contribute/source/plugin/ebpf.md) |
| [WasmEdge-rustls](../../../contribute/source/plugin/rusttls.md) (DEPRECATED) | A native TLS library for Rust programs, based on Rustls | `manylinux2014 (x86_64, aarch64)`<br/>
`ubuntu 20.04 (x86_64)`
`darwin (x86_64, arm64)`
(since `0.13.0`, until `0.13.5`) | [Rust](https://crates.io/crates/wasmedge_rustls_api) | [Steps](../../../contribute/source/plugin/rusttls.md) | - -## Old WasmEdge Extensions - -Besides the plug-ins, WasmEdge provides the extensions before the `0.13.0` versions. Noticed that the extensions are replaced by the corresponding plug-ins after the `0.13.0` version. - -The latest version supporting the extensions is `0.12.1`. This chapter will be deprecated when the `0.12.x` versions are no longer supported by the WasmEdge installer. - -| Extension | Description | Platform Support | Language support | -| --- | --- | --- | --- | -| [Image processing](https://github.com/second-state/WasmEdge-image) | A native library to manipulate images for AI inference tasks. Migrated into the plug-in after WasmEdge `0.13.0`. | `manylinux2014 x86_64`, `manylinux2014 aarch64`, `android aarch64`, `ubuntu 20.04 x86_64`, and `darwin x86_64` | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.2.2) | -| [TensorFlow and Tensorflow-Lite](https://github.com/second-state/WasmEdge-tensorflow) | A native library to inferring TensorFlow and TensorFlow-Lite models. Migrated into the plug-in after WasmEdge `0.13.0`. 
| `manylinux2014 x86_64`, `manylinux2014 aarch64` (TensorFlow-Lite only), `android aarch64` (TensorFlow-Lite only), `ubuntu 20.04 x86_64`, and `darwin x86_64` | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.2.2) | diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/contribute.md b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/contribute.md index 18a2be344..0770f571e 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/contribute.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/contribute.md @@ -2,72 +2,161 @@ sidebar_position: 8 --- -# Contributing Steps +# Contributing Guide -## Setup Development Environment +* [New Contributor Guide](#contributing-guide) + * [Ways to Contribute](#ways-to-contribute) + * [Find an Issue](#find-an-issue) + * [Ask for Help](#ask-for-help) + * [Pull Request Lifecycle](#pull-request-lifecycle) + * [Development Environment Setup](#development-environment-setup) + * [Sign Your Commits](#sign-your-commits) + * [Pull Request Checklist](#pull-request-checklist) -The WasmEdge is developed on Ubuntu 20.04 to take advantage of advanced LLVM features for the AOT compiler. The WasmEdge team also builds and releases statically linked WasmEdge binaries for older Linux distributions. +Welcome! We are glad that you want to contribute to our project! 💖 -Our development environment requires `libLLVM-12` and `>=GLIBCXX_3.4.26`. +As you get started, you are in the best position to give us feedback on areas of +the project that we need help with includes: -If you use an operating system older than Ubuntu 20.04, please use our [special docker image] to build WasmEdge. If you are looking for the pre-built binaries for the older operating system, we also provide several pre-built binaries based on the `manylinux2014` distribution. 
+* Problems found during setting up a new developer environment +* Gaps in our Quickstart Guide or documentation +* Bugs in our automation scripts -To build WasmEdge from the source, please refer to: [Build WasmEdge from source](/category/build-wasmedge-from-source). +If anything doesn't make sense, or doesn't work when you run it, please open a +bug report and let us know! + +## Ways to Contribute + +We welcome many different types of contributions including: + +* New features +* Report a bug +* Builds, CI/CD +* Bug fixes +* Documentation +* Issue Triage +* Answering questions on Slack/Mailing List/GitHub issues +* Web design +* Communications / Social Media / Blog Posts +* Release management + +Not everything happens through a GitHub pull request. Please come to our +[meetings](https://docs.google.com/document/d/1iFlVl7R97Lze4RDykzElJGDjjWYDlkI8Rhf8g4dQ5Rk/edit?usp=sharing) or [contact us](https://groups.google.com/g/wasmedge) and let's discuss how we can work +together. + +### Come to Meetings + +Absolutely everyone is welcome to come to any of our meetings. You never need an +invite to join us. In fact, we want you to join us, even if you don’t have +anything you feel like you want to contribute. Just being there is enough! + +You can find out more about our meetings [here](https://docs.google.com/document/d/1iFlVl7R97Lze4RDykzElJGDjjWYDlkI8Rhf8g4dQ5Rk/edit?usp=sharing). You don’t have to turn on +your video. The first time you come, introducing yourself is more than enough. +Over time, we hope that you feel comfortable voicing your opinions, giving +feedback on others’ ideas, and even sharing your own ideas, and experiences. + +## Find an Issue + +We have good first issues for new contributors and help wanted issues suitable +for any contributor. [good first issue](https://github.com/WasmEdge/WasmEdge/labels/good%20first%20issue) has extra information to +help you make your first contribution. 
[help wanted](https://github.com/WasmEdge/WasmEdge/labels/help%20wanted) are issues
+suitable for someone who isn't a core maintainer and is good to move on to after
+your first pull request.
+
+Sometimes there won’t be any issues with these labels. That’s ok! There is
+likely still something for you to work on. If you want to contribute but
+don’t know where to start or can't find a suitable issue, you can leave a comment on an issue, such as "I'd like to work on this. Could you share more details about XYZ?", or send your questions to our Discord server or Slack channel.
+
+Once you see an issue that you'd like to work on, please post a comment saying
+that you want to work on it. Something like "I want to work on this" is fine.
+
+## Ask for Help
+
+The best way to reach us with a question when contributing is to ask on:
+
+* The original github issue
+* Mailing list: Send an email to [our email list](https://groups.google.com/g/wasmedge)
+* Discord: Join the [WasmEdge Discord server](https://discord.gg/h4KDyB8XTt)
+* Slack: Join the #WasmEdge channel on the [CNCF Slack](https://slack.cncf.io/)
+
+Before opening any issue, please look up the existing [issues](https://github.com/WasmEdge/WasmEdge/issues) to avoid submitting a duplicate. If you find a match, you can "subscribe" to it to get notified of updates. If you have additional helpful information about the issue, please leave a comment.

-## Contribution Workflow
+When reporting issues, always include:
+
+* Version of your system
+* Configuration files of WasmEdge
+
+Because the issues are open to the public, when submitting the log and configuration files, be sure to remove any sensitive information, e.g. user name, password, IP address, and company name. You can replace those parts with "REDACTED" or other strings like "\*\*\*\*". Be sure to include the steps to reproduce the problem if applicable. It can help us understand and fix your issue faster. 
+
+## Pull Request Lifecycle

Pull requests are always welcome, even if they only contain minor fixes like typos or a few lines of code. If there will be a significant effort, please document it as an issue and get a discussion going before starting to work on it.

Please submit a pull request broken down into small changes bit by bit. A pull request consisting of many features and code changes may take a lot of work to review. It is recommended to submit pull requests incrementally.
-
-:::note
-If you split your pull request into small changes, please ensure any changes that go to the main branch will not break anything. Otherwise, it can only be merged once this feature is complete.
-:::
+Generally, once your pull request has been opened, it will be assigned to one or more reviewers. Those reviewers will do a thorough code review, looking for correctness, bugs, opportunities for improvement, documentation and comments, and coding style. If your PR is not ready for review, please mark it as a draft.

-### Fork and Clone the Repository
+The reviewers will give you feedback within three working days.

-Fork [the WasmEdge repository](https://github.com/WasmEdge/WasmEdge) and clone the code to your local workspace
+After the first review is done, the PR contributor is expected to respond and make changes based on the review within five working days.

-### Branches and Commits
+Once you have finished the adjustments, mark the conversations as resolved; the reviewers will then review your PR again within two working days.

-Changes should be made on your own fork in a new branch. Pull requests should be rebased on the top of the main branch.
+If the PR contributor does not respond to the PR within 30 days, the maintainer will close the PR. The original PR contributor is welcome to reopen it.

-The WasmEdge project adopts [DCO](https://www.secondstate.io/articles/dco/) to manage all contributions. 
Please ensure you add your `sign-off-statement` through the `-s` or `--signoff` flag or the GitHub Web UI before committing the pull request message.
+If you can no longer maintain your PR for some reason but still want it to be merged, please allow maintainers to edit the PR.

-### Develop, Build, and Test
+When your PR is merged, your contribution will be included in the next release, and we will credit the contributors' GitHub names in the release notes.

-Write code on the new branch in your fork, and [build from source code](/category/build-wasmedge-from-source) with the option `-DWASMEDGE_BUILD_TESTS=ON`.
+## Development Environment Setup

-Then you can use these tests to verify the correctness of WasmEdge binaries.
+WasmEdge is developed on Ubuntu 20.04 to take advantage of advanced LLVM features for the AOT compiler. The WasmEdge team also builds and releases statically linked WasmEdge binaries for older Linux distributions.

-```bash
-cd 
-LD_LIBRARY_PATH=$(pwd)/lib/api ctest
-```
+Our development environment requires `libLLVM-12` and `>=GLIBCXX_3.4.26`.

-### Push and Create A Pull Request
+If you use an operating system older than Ubuntu 20.04, please use our [special docker image] to build WasmEdge. If you are looking for the pre-built binaries for the older operating system, we also provide several pre-built binaries based on the `manylinux2014` distribution.

-When ready for review, push your branch to your fork repository on github.
+To build WasmEdge from source, please refer to: [Build WasmEdge from source](/category/build-wasmedge-from-source).

-Then visit your fork at and click the `Compare & Pull Request` button next to your branch to create a new pull request. The pull request description should refer to all the issues it addresses. Remember to reference issues (such as Closes #XXX and Fixes #XXX) in the comment so that the issues can be closed when the PR is merged. 
After creating a pull request, please check that the CI passes with your code changes. +## Sign Your Commits -Once your pull request has been opened, it will be assigned to one or more reviewers. Those reviewers will do a thorough code review, looking for correctness, bugs, opportunities for improvement, documentation and comments, and coding style. +### DCO -Commit changes made in response to review comments to the same branch on your fork. +Licensing is important to open source projects. It provides some assurances that +the software will continue to be available based under the terms that the +author(s) desired. We require that contributors sign off on commits submitted to +our project's repositories. The [Developer Certificate of Origin +(DCO)](https://probot.github.io/apps/dco/) is a way to certify that you wrote and +have the right to contribute the code you are submitting to the project. -## Reporting issues +You sign-off by adding the following to your commit messages. Your sign-off must +match the git user and email associated with the commit. -It is a great way to contribute to WasmEdge by reporting an issue. Well-written and complete bug reports are always welcome! Please open an issue on GitHub. + This is my commit message -Before opening any issue, please look up the existing [issues](https://github.com/WasmEdge/WasmEdge/issues) to avoid submitting a duplication. If you find a match, you can "subscribe" to it to get notified of updates. If you have additional helpful information about the issue, please leave a comment. + Signed-off-by: Your Name -When reporting issues, always include: +Git has a `-s` command line option to do this automatically: -- Version of your system -- Configuration files of WasmEdge + git commit -s -m 'This is my commit message' -Because the issues are open to the public, when submitting the log and configuration files, be sure to remove any sensitive information, e.g. user name, password, IP address, and company name. 
You can replace those parts with "REDACTED" or other strings like "\*\*\*\*". Be sure to include the steps to reproduce the problem if applicable. It can help us understand and fix your issue faster. +If you forgot to do this and have not yet pushed your changes to the remote +repository, you can amend your commit with the sign-off by running + + git commit --amend -s + +## Pull Request Checklist + +When you submit your pull request, or you push new commits to it, our automated +systems will run some checks on your new code. We require that your pull request +passes these checks, but we also have more criteria than just that before we can +accept and merge it. We recommend that you check the following things locally +before you submit your code: + +* DCO: Did you sign off your commit +* Code of conduct: Did you follow the CNCF code of conduct + +## Reporting issues ## Documenting @@ -78,3 +167,5 @@ Update the documentation if you are creating or changing features. Good document You can propose new designs for existing WasmEdge features. You can also design new features; please submit a proposal via the GitHub issues. WasmEdge maintainers will review this proposal as soon as possible to ensure the overall architecture is consistent and to avoid duplicated work in the roadmap. + +New features of WasmEdge will be discussed via a GitHub issue or the community meeting. diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/build_from_src.md b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/build_from_src.md index 768656d5d..33f498d8d 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/build_from_src.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/build_from_src.md @@ -76,7 +76,8 @@ Developers can set the CMake options to customize the WasmEdge building. - To build the WASI-NN plug-in with multiple backends, please use `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=,`. 13. 
`WASMEDGE_PLUGIN_WASI_CRYPTO`: build the WasmEdge WASI-Crypto plug-in (Linux and MacOS platforms only). Default is `OFF`. - This option is useless if the option `WASMEDGE_BUILD_PLUGINS` is set as `OFF`. -14. `WASMEDGE_PLUGIN_WASI_LOGGING`: build the WasmEdge WASI-Logging plug-in (Linux and MacOS platforms only). Default is `OFF`. +14. `WASMEDGE_PLUGIN_WASI_LOGGING`: build the WasmEdge WASI-Logging plug-in (Linux and MacOS platforms only). Default is `ON`. + - In WasmEdge `0.14.1`, the WASI-Logging plug-in is bundled into the WasmEdge library and will not generate the plug-in shared library target. - This option is useless if the option `WASMEDGE_BUILD_PLUGINS` is set as `OFF`. 15. `WASMEDGE_PLUGIN_WASM_BPF`: build the WasmEdge wasm_bpf plugin (Linux platforms only). Default is `OFF`. - This option is useless if the option `WASMEDGE_BUILD_PLUGINS` is set as `OFF`. @@ -91,7 +92,7 @@ Developers can set the CMake options to customize the WasmEdge building. Developers can follow the steps to build WasmEdge with plug-ins from source. -- [WASI-NN (OpenVINO, PyTorch, or TensorFlow-Lite backends)](plugin/wasi_nn.md) +- [WASI-NN (with several backends)](plugin/wasi_nn.md) - [WASI-Crypto](plugin/wasi_crypto.md) - [WasmEdge-Image](plugin/image.md) - [WasmEdge-TensorFlow](plugin/tensorflow.md) diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/os/linux.md b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/os/linux.md index 25a8e7766..87173214c 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/os/linux.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/os/linux.md @@ -34,6 +34,8 @@ Please check that these dependencies are satisfied. - LLVM 12.0.0 (>= 10.0.0) - _(Optional)_ GCC 11.1.0 (>= 9.4.0), install it if you prefer to use GCC toolchain. +After `WasmEdge 0.13.0`, the `boost` dependency is not needed. 
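The minimum tool versions listed above can be sanity-checked before configuring the build; this is a plain-shell sketch (the helper name is illustrative, and it relies on GNU `sort -V`):

```shell
# Minimal "version >= floor" check: succeeds when $1 is at least $2.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

version_ge "12.0.0" "10.0.0" && echo "LLVM version OK"   # prints LLVM version OK
version_ge "9.3.0" "9.4.0"  || echo "GCC too old"        # prints GCC too old
```

Feeding it the output of `llvm-config --version` or `gcc -dumpfullversion` gives a quick preflight check before running CMake.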
+ #### For Ubuntu 22.04 ```bash diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/rusttls.md b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/rusttls.md index 6f8986f8c..f7aa4b6fa 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/rusttls.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/rusttls.md @@ -2,7 +2,12 @@ sidebar_position: 8 --- -# Build with Rustls Plug-in +# (DEPRECATED after `0.14.0`) Build with Rustls Plug-in + + +:::note +This plug-in has been deprecated after WasmEdge `0.14.0` because the `rustls` is replaced by [`reqwest`](../../../develop/rust/http_service/client.md#the-reqwest-api). +::: The WasmEdge Rustls plug-in is a replacement for the OpenSSL plug-in in WasmEdge. It provides a Rust-friendly interface to the Rustls library, which is a modern, fast, and more secure alternative to OpenSSL. diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_logging.md b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_logging.md index 70fa900c1..b12118dcc 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_logging.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_logging.md @@ -6,6 +6,12 @@ sidebar_position: 1 WASI-Logging allows WebAssembly applications to log messages in a standardized way. This becomes particularly helpful when debugging applications or understanding the flow of execution within them. The WASI-Logging plug-in is designed to be straightforward to use, enabling developers to focus more on their application logic and less on logging mechanics. + +:::note +In WasmEdge `0.14.1` version, this plug-in is bundled into the WasmEdge library and not generate the plug-in shared library. +The plug-in building architecture will be refactored in the future. 
Therefore, we keep this page as documentation for `0.14.0` and earlier versions.
+:::
+
## Prerequisites

The prerequisite of the Wasi-Logging plug-in is the same as the WasmEdge building environment on the [Linux](../os/linux.md) and [MacOS](../os/macos.md) platforms.
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_nn.md b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_nn.md
index 19ffc95c4..1901f6eda 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_nn.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/source/plugin/wasi_nn.md
@@ -2,15 +2,47 @@
sidebar_position: 2
---

-# Build with WASI-nn Plug-in
+# Build with WASI-NN Plug-in

The WASI-NN plug-in is a proposed WebAssembly System Interface (WASI) API for machine learning. It allows WebAssembly programs to access host-provided machine learning functions.

## Prerequisites

-Currently, WasmEdge used OpenVINO™, PyTorch, TensorFlow Lite, or llama.cpp as the WASI-NN backend implementation. For using WASI-NN on WasmEdge, you need to install [OpenVINO™](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_apt.html)(2023), [TensorFlow Lite](https://www.tensorflow.org/install/lang_c), or [PyTorch 1.8.2 LTS](https://pytorch.org/get-started/locally/) for the backend.
+Currently, WasmEdge supports the following backends for the WASI-NN proposal:

-By default, we don't enable any WASI-NN backend in WasmEdge. Therefore developers should [build the WasmEdge from source](../os/linux.md) with the cmake option `WASMEDGE_PLUGIN_WASI_NN_BACKEND` to enable the backends. 
+
+| Backend | Dependency | CMake Option |
+|---------|------------|--------------|
+| [OpenVINO](#build-wasmedge-with-wasi-nn-openvino-backend) | [OpenVINO™ (2023)](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_apt.html) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=OpenVINO` |
+| [TensorFlow-Lite](#build-wasmedge-with-wasi-nn-tensorflow-lite-backend) | [TensorFlow Lite](https://www.tensorflow.org/install/lang_c) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=TensorFlowLite` |
+| [PyTorch](#build-wasmedge-with-wasi-nn-pytorch-backend) | [PyTorch 1.8.2 LTS](https://pytorch.org/get-started/locally/) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=PyTorch` |
+| [GGML](#build-wasmedge-with-wasi-nn-llamacpp-backend) | [llama.cpp](https://github.com/ggerganov/llama.cpp) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=GGML` |
+| [Piper](#build-wasmedge-with-wasi-nn-piper-backend) | [Piper](https://github.com/rhasspy/piper) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=Piper` |
+| [Whisper](#build-wasmedge-with-wasi-nn-whisper-backend) | [whisper.cpp](https://github.com/ggerganov/whisper.cpp) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=Whisper` |
+| [ChatTTS](#build-wasmedge-with-wasi-nn-chattts-backend) | [ChatTTS](https://github.com/2noise/ChatTTS) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ChatTTS` |
+| [MLX](#build-wasmedge-with-wasi-nn-mlx-backend) | [MLX](https://github.com/ml-explore/mlx) | `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND=MLX` |
+
+Developers can [build WasmEdge from source](../os/linux.md) with the cmake option `WASMEDGE_PLUGIN_WASI_NN_BACKEND` to enable the backends. To support multiple backends, developers can assign the option such as `-DWASMEDGE_PLUGIN_WASI_NN_BACKEND="GGML;Whisper;TensorFlowLite"`.
+
+After building, the WASI-NN plug-in shared library with the specified backend(s) will be located at `/plugins/wasi_nn/libwasmedgePluginWasiNN.so` (or with the `.dylib` extension on MacOS). 
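For scripted builds, the multi-backend value can be assembled from a plain list; a small shell sketch (the backend names must match the CMake option values in the table above):

```shell
# Compose the "A;B;C" value for -DWASMEDGE_PLUGIN_WASI_NN_BACKEND
# from a space-separated backend list.
backends="GGML Whisper TensorFlowLite"
value="$(printf '%s' "$backends" | tr ' ' ';')"
echo "$value"    # prints GGML;Whisper;TensorFlowLite
# Configure with, e.g.:
#   cmake -GNinja -Bbuild -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="$value" .
```

Quoting `"$value"` keeps the semicolons from being interpreted by the shell as command separators when the option is passed to `cmake`.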
+
+
+:::note
+If the `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue.
+:::
+
+For the `Burn.rs` backend, please use the cmake option `WASMEDGE_PLUGIN_WASI_NN_BURNRS_MODEL` to assign the model.
+
+| Model for `Burn.rs` backend | CMake Option |
+|-------|--------------|
+| Squeezenet | `-DWASMEDGE_PLUGIN_WASI_NN_BURNRS_MODEL=Squeezenet` |
+| Whisper | `-DWASMEDGE_PLUGIN_WASI_NN_BURNRS_MODEL=Whisper` |
+
+After building, the WASI-NN plug-in shared library with the specified model will be located at `/plugins/wasi_nn_burnrs/libwasmedgePluginWasiNN.so` (or with the `.dylib` extension on MacOS).
+
+
+:::note
+The `WASI-NN Burn.rs` backend cannot be built together with other backends.
+:::

## Build WasmEdge with WASI-NN OpenVINO Backend

@@ -31,17 +63,8 @@ Then build and install WasmEdge from source:
cd 
cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="OpenVINO"
cmake --build build
-# For the WASI-NN plug-in, you should install this project.
-cmake --install build
```

-
-:::note
-If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue.
-:::
-
-Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with OpenVINO backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. 
- ## Build WasmEdge with WASI-NN PyTorch Backend For choosing and installing PyTorch on `Ubuntu 20.04` for the backend, we recommend the following commands: @@ -74,17 +97,8 @@ Then build and install WasmEdge from source: cd cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="PyTorch" cmake --build build -# For the WASI-NN plug-in, you should install this project. -cmake --install build ``` - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with PyTorch backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. - ## Build WasmEdge with WASI-NN TensorFlow-Lite Backend You can build and install WasmEdge from source directly (on `Linux x86_64`, `Linux aarch64`, `MacOS x86_64`, or `MacOS arm64` platforms): @@ -93,17 +107,8 @@ You can build and install WasmEdge from source directly (on `Linux x86_64`, `Lin cd cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="TensorflowLite" cmake --build build -# For the WASI-NN plug-in, you should install this project. -cmake --install build ``` - -:::note -If the built `wasmedge` CLI tool cannot find the WASI-NN plug-in, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plug-in installation path (such as `/usr/local/lib/wasmedge/`, or the built plug-in path `build/plugins/wasi_nn/`) to try to fix this issue. -::: - -Then you will have an executable `wasmedge` runtime under `/usr/local/bin` and the WASI-NN with TensorFlow-lite backend plug-in under `/usr/local/lib/wasmedge/libwasmedgePluginWasiNN.so` after installation. 
- Installing the necessary `libtensorflowlite_c.so` and `libtensorflowlite_flex.so` on both `Ubuntu 20.04` and `manylinux2014` for the backend, we recommend the following commands: ```bash @@ -136,7 +141,7 @@ You don't need to install any llama.cpp libraries. WasmEdge will download it dur Due to the acceleration frameworks being various, you will need to use different compilation options to build this plugin. Please make sure you are following the same OS section to do this. -### MacOS +### Build with llama.cpp Backend on MacOS #### Intel Model @@ -151,8 +156,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ -DWASMEDGE_PLUGIN_WASI_NN_GGML_LLAMA_BLAS=OFF \ . cmake --build build -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### Apple Silicon Model @@ -168,11 +171,9 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ -DWASMEDGE_PLUGIN_WASI_NN_GGML_LLAMA_BLAS=OFF \ . cmake --build build -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` -### Linux +### Build with llama.cpp Backend on Linux #### Ubuntu/Debian with CUDA 12 @@ -203,9 +204,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### Ubuntu on NVIDIA Jetson AGX Orin @@ -231,9 +229,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### Ubuntu/Debian with OpenBLAS @@ -256,9 +251,6 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. -cmake --install build ``` #### General Linux without any acceleration framework @@ -272,20 +264,88 @@ cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release \ . cmake --build build - -# For the WASI-NN plugin, you should install this project. 
-cmake --install build
 ```
 
-### Appendix
+### Build with llama.cpp Backend on Windows
-
-:::note
-If the built `wasmedge` CLI tool cannot find the WASI-NN plugin, you can set the `WASMEDGE_PLUGIN_PATH` environment variable to the plugin installation path (such as `/usr/local/lib/wasmedge/` or the built plugin path `build/plugins/wasi_nn/`) to try to fix this issue.
-:::
+#### Install Dependencies for llama.cpp And Build on Windows
+
+Developers can follow these steps to install the required dependencies.
+
+1. (Optional; skip this dependency if you don't need to use the GPU) Download and install the CUDA toolkit
+   - We use CUDA Toolkit 12 for the release assets
+   - Link:
+
+2. Download and install Visual Studio 2022 Community Edition
+   - Link:
+   - Select the following components in the installer:
+     - msvc v143 - vs 2022 c++ x64/x86 build tools (latest)
+     - windows 11 sdk (10.0.22621.0)
+     - C++ ATL for v143 build tools (x86 & x64)
+
+3. Download and install cmake
+   - We use cmake 3.29.3 for the release assets
+   - Link:
+
+4. Download and install git
+   - We use git 2.45.1
+   - Link:
+
+5. Download and install ninja-build
+   - We use ninja-build 1.12.1
+   - Link:
+   - Installation: just unzip it to a custom folder
+
+Then developers can build with the following steps.
+
+1. Open Developer PowerShell for VS 2022
+   - Start -> Visual Studio 2022 -> Visual Studio Tools -> Developer PowerShell for VS 2022
+
+2. Inside the PowerShell, use git to clone the WasmEdge repo
+
+   ```console
+   cd $HOME
+   git clone https://github.com/WasmEdge/WasmEdge.git
+   cd WasmEdge
+   ```
+
+3. Compile WasmEdge with the `wasi_nn_ggml` related options enabled, using the following commands. To build the plugin, you don't need to enable the AOT/LLVM related features, so set them to OFF.
+
+   - If you want to enable CUDA:
+
+     ```console
+     # CUDA ENABLE:
+     & "C:\Program Files\CMake\bin\cmake.exe" -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ggml -DWASMEDGE_PLUGIN_WASI_NN_GGML_LLAMA_CUBLAS=ON -DWASMEDGE_USE_LLVM=OFF .
+     & "\ninja.exe" -C build
+     ```
+
+   - If you want to disable CUDA:
+
+     ```console
+     # CUDA DISABLE:
+     & "C:\Program Files\CMake\bin\cmake.exe" -Bbuild -GNinja -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND=ggml -DWASMEDGE_USE_LLVM=OFF .
+     & "\ninja.exe" -C build
+     ```
+
+#### Execute the WASI-NN plugin with the llama example on Windows
+
+1. Set the environment variables
+
+   ```console
+   $env:PATH += ";$pwd\build\lib\api"
+   $env:WASMEDGE_PLUGIN_PATH = "$pwd\build\plugins"
+   ```
+
+2. Download the wasm file and the model, then run the example
+
+   ```console
+   wget https://github.com/second-state/WasmEdge-WASINN-examples/raw/master/wasmedge-ggml/llama/wasmedge-ggml-llama.wasm
+   wget https://huggingface.co/QuantFactory/Meta-Llama-3-8B-Instruct-GGUF/resolve/main/Meta-Llama-3-8B-Instruct.Q5_K_M.gguf
+   wasmedge --dir .:.
--env llama3=true --env n_gpu_layers=100 --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct.Q5_K_M.gguf wasmedge-ggml-llama.wasm default
+   ```
+
+### Appendix for llama.cpp backend
-
-:::note
 We also provided the pre-built ggml plugins on the following platforms:
 
 - darwin\_x86\_64: Intel Model macOS
@@ -299,4 +359,55 @@ We also provided the pre-built ggml plugins on the following platforms:
 - manylinux2014\_x86\_64: x86\_64 Linux (the glibc is using CentOS 7 one)
 - manylinux2014\_aarch64: aarch64 Linux (the glibc is using CentOS 7 one)
-:::
+## Build WasmEdge with WASI-NN Piper Backend
+
+Build and install WasmEdge from source:
+
+```bash
+cd
+cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="Piper"
+cmake --build build
+```
+
+## Build WasmEdge with WASI-NN Whisper Backend
+
+Build and install WasmEdge from source:

+```bash
+cd
+cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="Whisper"
+cmake --build build
+```
+
+## Build WasmEdge with WASI-NN ChatTTS Backend
+
+The ChatTTS backend relies on the ChatTTS Python library. We recommend the following commands to install the dependencies.
+
+```bash
+sudo apt update
+sudo apt upgrade
+sudo apt install python3-dev
+pip install chattts==0.1.1
+```
+
+Then build and install WasmEdge from source:
+
+```bash
+cd
+
+cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="chatTTS"
+cmake --build build
+```
+
+## Build WasmEdge with WASI-NN MLX Backend
+
+You can build and install WasmEdge from source directly, or install MLX yourself and set the `CMAKE_INSTALL_PREFIX` variable.
+ +Build and install WasmEdge from source: + +``` bash +cd + +cmake -GNinja -Bbuild -DCMAKE_BUILD_TYPE=Release -DWASMEDGE_PLUGIN_WASI_NN_BACKEND="mlx" +cmake --build build +``` diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/users.md b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/users.md index 48fff2460..da670ba37 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/contribute/users.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/contribute/users.md @@ -9,6 +9,8 @@ This list is constantly being updated. Please submit a PR to add your own item i | Name | Desciption | PR or Docs | | --- | --- | --- | | WebAssembly Languages Runtime maintained by VMWare | Use WasmEdge to run PHP and python programs | | +| Cocos AI | WasmEdge is used for Confidential Computing inference of AI workloads in a secure TEE enclave | | +| WikiFunctions | Use WasmEdge to execute serverless functions to be embedded in Wikipedia | | | LF Edge eKuiper | Use WasmEdge to process data streamed from IoT devices | | | crun | Use WasmEdge to run WASM containers | | | youki | Use WasmEdge to run WASM containers | | @@ -30,7 +32,7 @@ This list is constantly being updated. Please submit a PR to add your own item i | ByteDance | Use WasmEdge as a Ray node | | | Huawei Cloud | Use WasmEdge to run Serverless functions | Internal use case | | 5miles | Use WasmEdge to run internal microservices | Internal use case | -| Bytetrade | Use WasmEdge to run microservices for automated crypto trading and marketing automation. | Internal use case | +| Bytetrade | Use WasmEdge to run microservices for automated crypto trading and marketing automation | Internal use case | | FutureWei | Use WasmEdge on automobile and OpenHarmony | | | WinSoft | Use WasmEdge to improve IDE’s user experience | | | ParaState | Use WasmEdge to execute smart contracts on the ParaState blockchain | | @@ -41,11 +43,11 @@ This list is constantly being updated. 
Please submit a PR to add your own item i | libsql | Use WasmEdge to support user-defined functions (UDF) in a database | | | Shifu | Use WasmEdge to support user-defined functions (UDF) in data streams from edge devices | | | Lnjoying | Offers WasmEdge as part of its edge cloud service. | Internal use case | -| Red Hat EPEL 9 | Offers support for wasmedge packages for Red Hat Linux 9 users. | | +| Red Hat EPEL 9 | Offers support for wasmedge packages for Red Hat Linux 9 users | | | Open Interpreter| Use WasmEdge as the LLM runtime | | | GaiaNet| Use WasmEdge as LLM runtime | | | MoXin| Use WasmEdge as the LLM runtime | | | CODA Bridge| Use WasmEdge as the container to run microservices | Internal use case | | CloudEvents | Support the Rust SDK to be compiled into Wasm so that a Wasm app can send and receive cloud events | | | Kagome | A C++ implementation of Polkadot host which runs Wasm smart contracts on WasmEdge | | -| Sealos | A Cloud Operating System designed for managing cloud-native applications. It uses WasmEdge to run LLMs locally in its cluster. | | +| Sealos | A Cloud Operating System designed for managing cloud-native applications. It uses WasmEdge to run LLMs locally in its cluster | | diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/cri-runtime/containerd.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/cri-runtime/containerd.md index 7db80ff78..a43919721 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/cri-runtime/containerd.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/cri-runtime/containerd.md @@ -19,7 +19,9 @@ The containerd-shim [runwasi](https://github.com/containerd/runwasi/) project su 3. 
Build and install the wasmedge-containerd-shim

   ```bash
+  # Reference: https://github.com/containerd/runwasi/blob/main/CONTRIBUTING.md#setting-up-your-local-environment
   cd runwasi
+  ./scripts/setup-linux.sh
   make build-wasmedge INSTALL="sudo install" LN="sudo ln -sf"
   make install-wasmedge
   ```
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/_category_.json b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/_category_.json
new file mode 100644
index 000000000..2675be0c2
--- /dev/null
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/_category_.json
@@ -0,0 +1,9 @@
+{
+  "label": "Manage LLM workloads on GPU",
+  "position": 7,
+  "link": {
+    "type": "generated-index",
+    "description": "In this chapter, we will demonstrate how to use container tools to manage LLM WasmEdge workloads on GPU."
+  }
+}
\ No newline at end of file
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/start/build-and-run/docker_wasm_gpu.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/docker_wasm_gpu.md
similarity index 100%
rename from i18n/zh/docusaurus-plugin-content-docs/current/start/build-and-run/docker_wasm_gpu.md
rename to i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/docker_wasm_gpu.md
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/start/build-and-run/podman_wasm_gpu.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/podman_wasm_gpu.md
similarity index 100%
rename from i18n/zh/docusaurus-plugin-content-docs/current/start/build-and-run/podman_wasm_gpu.md
rename to i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/gpu/podman_wasm_gpu.md
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md
index 87ec2fde7..2c783a5c5 100644
---
a/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/kubernetes/kubernetes-containerd-runwasi.md
@@ -4,7 +4,132 @@ sidebar_position: 3
 
 # Kubernetes + Containerd + Runwasi
 
-
-:::info
-Work in Progress
-:::
+## Quick start
+
+The [GitHub repo](https://github.com/second-state/wasmedge-containers-examples/) contains scripts and GitHub Actions for running our example apps on Kubernetes + containerd + runwasi.
+
+- Simple WebAssembly example [Quick start](https://github.com/second-state/wasmedge-containers-examples/blob/main/kubernetes_containerd/README.md) | [Github Actions](https://github.com/second-state/wasmedge-containers-examples/blob/main/.github/workflows/kubernetes-containerd.yml)
+- WebAssembly-based HTTP service [Quick start](https://github.com/second-state/wasmedge-containers-examples/blob/main/kubernetes_containerd/http_server/README.md) | [Github Actions](https://github.com/second-state/wasmedge-containers-examples/blob/main/.github/workflows/kubernetes-containerd-server.yml)
+
+In the rest of this section, we will explain the steps in detail.
+
+## Prerequisites for this setup
+
+Please ensure that you have completed the following steps before proceeding with this setup.
+
+- Install the latest version of [WasmEdge](../../../start/install.md)
+- Ensure that you have set up containerd following the [instructions here](../../deploy/cri-runtime/containerd-crun.md)
+- Ensure that you have installed and [set up runwasi](../../deploy/cri-runtime/containerd.md) for containerd-shim-wasmedge
+
+## Install and start Kubernetes
+
+Run the following commands from a terminal window. They set up Kubernetes for local development.
+
+```bash
+# Install go
+$ wget https://golang.org/dl/go1.17.1.linux-amd64.tar.gz
+$ sudo rm -rf /usr/local/go
+$ sudo tar -C /usr/local -xzf go1.17.1.linux-amd64.tar.gz
+$ source /home/${USER}/.profile
+
+# Clone k8s
+$ git clone https://github.com/kubernetes/kubernetes.git
+$ cd kubernetes
+$ git checkout v1.22.2
+
+# Install etcd with hack script in k8s
+$ sudo CGROUP_DRIVER=systemd CONTAINER_RUNTIME=remote CONTAINER_RUNTIME_ENDPOINT='unix:///var/run/containerd/containerd.sock' ./hack/install-etcd.sh
+$ export PATH="/home/${USER}/kubernetes/third_party/etcd:${PATH}"
+$ sudo cp third_party/etcd/etcd* /usr/local/bin/
+
+# After running the above commands, you can find the following files: /usr/local/bin/etcd /usr/local/bin/etcdctl /usr/local/bin/etcdutl
+
+# Build and run k8s with containerd
+$ sudo apt-get install -y build-essential
+$ sudo CGROUP_DRIVER=systemd CONTAINER_RUNTIME=remote CONTAINER_RUNTIME_ENDPOINT='unix:///var/run/containerd/containerd.sock' ./hack/local-up-cluster.sh
+
+... ...
+Local Kubernetes cluster is running. Press Ctrl-C to shut it down.
+```
+
+Do NOT close your terminal window. Kubernetes is running!
+
+## Run and test the Kubernetes Cluster
+
+Finally, we can run WebAssembly programs in Kubernetes as containers in pods. In this section, we will open **another terminal window** and start using the cluster.
+
+```bash
+export KUBERNETES_PROVIDER=local
+
+sudo cluster/kubectl.sh config set-cluster local --server=https://localhost:6443 --certificate-authority=/var/run/kubernetes/server-ca.crt
+sudo cluster/kubectl.sh config set-credentials myself --client-key=/var/run/kubernetes/client-admin.key --client-certificate=/var/run/kubernetes/client-admin.crt
+sudo cluster/kubectl.sh config set-context local --cluster=local --user=myself
+sudo cluster/kubectl.sh config use-context local
+sudo cluster/kubectl.sh
+```
+
+Let's check the status to make sure that the cluster is running.
+
+```bash
+$ sudo cluster/kubectl.sh cluster-info
+
+# Expected output
+Cluster "local" set.
+User "myself" set.
+Context "local" created.
+Switched to context "local".
+Kubernetes control plane is running at https://localhost:6443
+CoreDNS is running at https://localhost:6443/api/v1/namespaces/kube-system/services/kube-dns:dns/proxy
+
+To further debug and diagnose cluster problems, use 'kubectl cluster-info dump'.
+```
+
+## Configure containerd and Kubernetes for the WasmEdge Runtime
+
+Next, we will configure containerd to add support for containerd-shim-wasmedge.
+Please ensure that you have [set up runwasi](../../deploy/cri-runtime/containerd.md) to work with WasmEdge container images.
+
+```bash
+# Run the following command as root user
+sudo bash -c "containerd config default > /etc/containerd/config.toml"
+echo '[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.wasmedge] runtime_type = "io.containerd.wasmedge.v1"' | sudo tee -a /etc/containerd/config.toml > /dev/null
+sudo systemctl restart containerd
+```
+
+Next, we will create a RuntimeClass in Kubernetes to specify the use of the wasmedge runtime for objects labeled as `runtime=wasm`.
+
+```bash
+sudo cluster/kubectl.sh apply -f - <<< '{"apiVersion":"node.k8s.io/v1","kind":"RuntimeClass","metadata":{"name":"wasm"},"scheduling":{"nodeSelector":{"runtime":"wasm"}},"handler":"wasmedge"}'
+```
+
+Now we will label the Kubernetes node as `runtime=wasm`. Note that the node to label is the one where we changed the containerd configuration.
+
+An example of how we can label the node is given below:
+
+```bash
+sudo cluster/kubectl.sh get nodes
+# Sample output from the command above
+NAME        STATUS   ROLES    AGE    VERSION
+127.0.0.1   Ready             3h4m   v1.22.2
+# Run the following command to label the node
+sudo cluster/kubectl.sh label nodes 127.0.0.1 runtime=wasm
+# A successful output from the above command looks like this
+node/127.0.0.1 labeled
+```
+
+### A WebAssembly-based HTTP service
+
+[A separate article](https://github.com/second-state/wasmedge-containers-examples/blob/main/http_server_wasi_app.md) explains how to compile, package, and publish a simple WebAssembly HTTP service application as a container image to Docker Hub. Run the WebAssembly-based image from Docker Hub in the Kubernetes cluster as follows.
+
+```bash
+sudo cluster/kubectl.sh apply -f - <<< '{"apiVersion":"apps/v1","kind":"Deployment","metadata":{"name":"http-server-deployment"},"spec":{"replicas":1,"selector":{"matchLabels":{"app":"http-server"}},"template":{"metadata":{"labels":{"app":"http-server"}},"spec":{"hostNetwork":true,"runtimeClassName":"wasm","containers":[{"name":"http-server","image":"wasmedge/example-wasi-http:latest","ports":[{"containerPort":1234}]}]}}}}'
+```
+
+Since we are using `hostNetwork` in the deployment spec, the HTTP server image is running on the local network with IP address `127.0.0.1`. Now, you can use the `curl` command to access the HTTP service.
+
+```bash
+$ curl -d "name=WasmEdge" -X POST http://127.0.0.1:1234
+echo: name=WasmEdge
+```
+
+That's it!
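For readability, the one-line JSON manifests applied above can also be kept as YAML files. As a sketch, the RuntimeClass would look like this (the field values mirror the `kubectl.sh apply` command above; the file name is an arbitrary choice):

```yaml
# runtimeclass-wasm.yaml -- YAML equivalent of the inline JSON RuntimeClass
apiVersion: node.k8s.io/v1
kind: RuntimeClass
metadata:
  name: wasm
handler: wasmedge
scheduling:
  nodeSelector:
    runtime: wasm
```

Apply it with `sudo cluster/kubectl.sh apply -f runtimeclass-wasm.yaml`; the deployment manifest can be converted to YAML the same way.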
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/oci-runtime/youki.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/oci-runtime/youki.md index 17712cfca..35d7a9b49 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/oci-runtime/youki.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/deploy/oci-runtime/youki.md @@ -15,25 +15,29 @@ youki is an OCI container runtime written in Rust. youki has WasmEdge baked in. Run the following command line to build and install youki on your machine. ```bash - $ sudo apt-get install \ - pkg-config \ - libsystemd-dev \ - libdbus-glib-1-dev \ - build-essential \ - libelf-dev \ - libseccomp-dev \ - libclang-dev + $ sudo apt-get install \ + curl \ + git \ + pkg-config \ + libsystemd-dev \ + libdbus-glib-1-dev \ + build-essential \ + libelf-dev \ + libzstd-dev \ + libseccomp-dev \ + libclang-dev + + # If you don't have the rust toolchain installed run: + $ curl https://sh.rustup.rs -sSf | sudo sh -s -- -y ``` Next, configure, build, and install a `youki` binary with WasmEdge support. ```bash - git clone https://github.com/containers/youki.git - go into the cloned directory + git clone --recurse-submodules https://github.com/containers/youki.git cd youki - make youki-dev - ./youki -h ./scripts/build.sh -o . -r -f wasm-wasmedge + ./youki -h export LD_LIBRARY_PATH=$HOME/.wasmedge/lib ``` diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/javascript/networking.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/javascript/networking.md index 7cea5f556..df1463bf2 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/javascript/networking.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/javascript/networking.md @@ -15,7 +15,7 @@ The networking API in WasmEdge is non-blocking and hence supports asynchronous I ## Prerequisites -[Install WasmEdge](../../start/install.md). 
To make HTTPS requests, install the [WasmEdge TLS plug-in](../../start/install.md#tls-plug-in). +[Install WasmEdge](../../start/install.md). To make HTTPS requests, install the [WasmEdge TLS plug-in](../../start/install.md#install-wasmedge-with-plug-ins). [Install WasmEdge-QuickJS](./hello_world#prerequisites). Make sure that the `modules` directory is located in your local directory where you want to execute the `wasmedge` command. diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/python/_category_.json b/i18n/zh/docusaurus-plugin-content-docs/current/develop/python/_category_.json index 953fc26e6..bce8d88bf 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/python/_category_.json +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/python/_category_.json @@ -1,6 +1,6 @@ { "label": "Develop WASM Apps in Python", - "position": 9, + "position": 6, "link": { "type": "generated-index", "description": "In this chapter, we will learn how to create WASM apps in Python." diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/_category_.json b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/_category_.json index 5029aa84a..54d18d9ac 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/_category_.json +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/_category_.json @@ -3,6 +3,6 @@ "position": 4, "link": { "type": "generated-index", - "description": "Rust is the first class citizen in WebAssembly ecosystem. In this chapter, we will learn how to create WASM apps in Rust." + "description": "Rust is well supported in the WebAssembly ecosystem. In this chapter, you will learn how to create Wasm apps in Rust." 
} } diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/my_sql_driver.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/my_sql_driver.md index 79c18a54a..7e550d348 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/my_sql_driver.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/my_sql_driver.md @@ -9,7 +9,7 @@ The database connection is necessary for today's enterprise development. WasmEdg :::note Before we start, [you need to have Rust and WasmEdge installed](../setup.md). -Make sure that you read the [special notes on networking apps](../setup#special-notes) especially if you are compiling Rust programs on a Mac. +Make sure that you read the [special notes on networking apps](../setup#special-notes-for-networking-apps) especially if you are compiling Rust programs on a Mac. ::: ## Run the example @@ -45,7 +45,7 @@ wasmedge --env "DATABASE_SSL=1" --env "DATABASE_URL=mysql://user:passwd@mydb.123 In order to compile the `mysql_async` and `tokio` crates, we will need to apply two patches to add WasmEdge-specific socket APIs to those crates. The following example shows that the TLS connection is enabled. -``` +```toml [patch.crates-io] tokio = { git = "https://github.com/second-state/wasi_tokio.git", branch = "v1.36.x" } socket2 = { git = "https://github.com/second-state/socket2.git", branch = "v0.5.x" } @@ -63,7 +63,7 @@ statements. Connect to a MySQL database. -``` +```rust // Below we create a customized connection pool let opts = Opts::from_url(&*get_url()).unwrap(); let mut builder = OptsBuilder::from_opts(opts); @@ -80,7 +80,7 @@ Connect to a MySQL database. Create a table on the connected database. -``` +```rust // create table if no tables exist let result = r"SHOW TABLES LIKE 'orders';" .with(()) @@ -100,7 +100,7 @@ Create a table on the connected database. Insert some records into the MySQL database using SQL. 
-``` +```rust let orders = vec![ Order::new(1, 12, 2, 56.0, 15.0, 2.0, String::from("Mataderos 2312")), Order::new(2, 15, 3, 256.0, 30.0, 16.0, String::from("1234 NW Bobcat")), @@ -128,7 +128,7 @@ Insert some records into the MySQL database using SQL. Query the database. -``` +```rust // query data let loaded_orders = "SELECT * FROM orders" .with(()) @@ -153,8 +153,8 @@ Query the database. Delete some records from the database. -``` - // // delete some data +```rust + // delete some data r"DELETE FROM orders WHERE order_id=4;" .ignore(&mut conn) .await?; @@ -183,8 +183,8 @@ Delete some records from the database. Update records in the MySQL database. -``` - // // update some data +```rust + // update some data r"UPDATE orders SET shipping_address = '8366 Elizabeth St.' WHERE order_id = 2;" @@ -214,8 +214,7 @@ Update records in the MySQL database. Close the database connection. -``` +```rust drop(conn); pool.disconnect().await.unwrap(); ``` - diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/postgres_driver.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/postgres_driver.md index 5fa8d0fa0..2968603db 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/postgres_driver.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/postgres_driver.md @@ -9,7 +9,7 @@ A database connection is necessary for today's enterprise development. WasmEdge :::note Before we start, [you need to have Rust and WasmEdge installed](../setup.md). -Make sure that you read the [special notes on networking apps](../setup#special-notes) especially if you are compiling Rust programs on a Mac. +Make sure that you read the [special notes on networking apps](../setup#special-notes-for-networking-apps) especially if you are compiling Rust programs on a Mac. 
::: ## Run the example @@ -31,7 +31,7 @@ wasmedge --env "DATABASE_URL=postgres://user:passwd@localhost/testdb" target/was In order to compile the `tokio-postgres` and `tokio` crates, we will need to apply patches to add WasmEdge-specific socket APIs to those crates in `Cargo.toml`. -``` +```toml [patch.crates-io] tokio = { git = "https://github.com/second-state/wasi_tokio.git", branch = "v1.36.x" } socket2 = { git = "https://github.com/second-state/socket2.git", branch = "v0.5.x" } @@ -151,4 +151,3 @@ async fn main() -> Result<(), Error> { Ok(()) } ``` - diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/qdrant_driver.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/qdrant_driver.md index bcb3bb27c..f95526df2 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/qdrant_driver.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/qdrant_driver.md @@ -11,7 +11,7 @@ Hence, besides the LLM inference runtime, those LLM applications also need to ma :::note Before we start, [you need to have Rust and WasmEdge installed](../setup.md). -Make sure that you read the [special notes on networking apps](../setup#special-notes) especially if you are compiling Rust programs on a Mac. +Make sure that you read the [special notes on networking apps](../setup#special-notes-for-networking-apps) especially if you are compiling Rust programs on a Mac. ::: ## Run the example @@ -52,7 +52,7 @@ qdrant_rest_client = "0.1.0" ## Code explanation The following program uses the `qdrant_rest_client` crate to access local Qdrant server through its RESTful API. -It first creates several points (vectors), saves those vectors to the Qdrant database, retrieves some vectors, +It first creates several points (vectors), saves those vectors to the Qdrant database, retrieves some vectors, searches for vectors, and finally deletes them from the database. 
```rust @@ -129,4 +129,3 @@ async fn main() -> Result<(), Box> { Ok(()) } ``` - diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/redis_driver.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/redis_driver.md index ce07650d2..b3f268a14 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/redis_driver.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/database/redis_driver.md @@ -9,7 +9,7 @@ WasmEdge provides a Redis driver for Rust developers, enabling developers to bui :::note Before we start, [you need to have Rust and WasmEdge installed](../setup.md). -Make sure that you read the [special notes on networking apps](../setup#special-notes) especially if you are compiling Rust programs on a Mac. +Make sure that you read the [special notes on networking apps](../setup#special-notes-for-networking-apps) especially if you are compiling Rust programs on a Mac. ::: ## Run the example @@ -70,4 +70,3 @@ async fn main() -> Result<()> { Ok(()) } ``` - diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/client.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/client.md index c96bc306a..e58f32d71 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/client.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/client.md @@ -9,7 +9,7 @@ WasmEdge allows Rust developers to use APIs they are already familiar with to ac :::note Before we start, [you need to have Rust and WasmEdge installed](../setup.md). -Make sure that you read the [special notes on networking apps](../setup#special-notes) especially if you are compiling Rust programs on a Mac. +Make sure that you read the [special notes on networking apps](../setup#special-notes-for-networking-apps) especially if you are compiling Rust programs on a Mac. 
::: We will discuss HTTP and HTTPS clients using popular Rust APIs. @@ -110,7 +110,7 @@ wasmedge compile target/wasm32-wasi/release/wasmedge_hyper_client.wasm wasmedge_ wasmedge wasmedge_hyper_client.wasm ``` -In your Rust application, import the [hyper](https://crates.io/crates/hyper) crate, +In your Rust application, import the [hyper](https://crates.io/crates/hyper) crate, and patch it with WasmEdge sockets patches. Just add the following line to your `Cargo.toml`. @@ -139,7 +139,7 @@ wasmedge wasmedge_hyper_client_https.wasm In the HTTPS version of `Cargo.toml`, you just need to import the standard [hyper-rustls](https://crates.io/crates/hyper-rustls), [rustls](https://crates.io/crates/rustls) and [webpki-roots](https://crates.io/crates/webpki-roots) crates with the same patches as above. -``` +```toml [patch.crates-io] tokio = { git = "https://github.com/second-state/wasi_tokio.git", branch = "v1.36.x" } socket2 = { git = "https://github.com/second-state/socket2.git", branch = "v0.5.x" } @@ -157,7 +157,7 @@ pretty_env_logger = "0.4.0" :::note -If you need to compile `rustls` as shown in the `Cargo.toml` above on the MacOS, you will need the [wasi-sdk version of clang](../setup#compile-rust-tls-on-macos). +If you need to compile `rustls` as shown in the `Cargo.toml` above on the MacOS, you will need the [wasi-sdk version of clang](../setup#tls-on-macos). ::: The [Rust example code](https://github.com/WasmEdge/wasmedge_hyper_demo/blob/main/client/src/main.rs) below shows an HTTP GET request. 
@@ -212,4 +212,3 @@ async fn post_url_return_str (url: hyper::Uri, post_body: &'static [u8]) -> Resu Ok(()) } ``` - diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/server.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/server.md index 939478471..1f75887fc 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/server.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/http_service/server.md @@ -6,13 +6,13 @@ sidebar_position: 2 For WasmEdge to become a cloud-native runtime for microservices, it needs to support HTTP servers. By its very nature, the HTTP server is always asynchronous (non-blocking -- so that it can handle concurrent requests). This chapter will cover HTTP servers using popular Rust APIs. -- [The axum API](#the-warp-api) +- [The axum API](#the-axum-api) - [The hyper API](#the-hyper-api) :::note Before we start, [you need to have Rust and WasmEdge installed](../setup.md). -Make sure that you read the [special notes on networking apps](../setup#special-notes) especially if you are compiling Rust programs on a Mac. +Make sure that you read the [special notes on networking apps](../setup#special-notes-for-networking-apps) especially if you are compiling Rust programs on a Mac. ::: ## The axum API @@ -46,7 +46,7 @@ In your Rust application, you will apply a few patches developed by the WasmEdge POSIX sockets with WasmEdge sockets in standard libraries. With those patches, you can then use the official `tokio` and `axum` crates. 
-```
+```toml
[patch.crates-io]
tokio = { git = "https://github.com/second-state/wasi_tokio.git", branch = "v1.36.x" }
socket2 = { git = "https://github.com/second-state/socket2.git", branch = "v0.5.x" }
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/setup.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/setup.md
index 2047432cd..d04950517 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/setup.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/setup.md
@@ -36,17 +36,16 @@ rustup target add wasm32-wasi
### Tokio support
-WasmEdge supports async networking APIs provided by [Tokio](https://tokio.rs/) and related crates. If you have tokio in your `Cargo.toml`, you
-need to add a few config flags to help the Rust compiler choose the correct feature branches in the library source code. Here is an example of `cargo build` command for
-compiling a tokio app to Wasm.
+WasmEdge supports async networking APIs provided by [Tokio](https://tokio.rs/) and related crates. If you have tokio in your `Cargo.toml`, you
+need to add a few config flags to help the Rust compiler choose the correct feature branches in the library source code. Here is an example of the `cargo build` command for compiling a tokio app to Wasm.
-```
+```bash
RUSTFLAGS="--cfg wasmedge --cfg tokio_unstable" cargo build --target wasm32-wasi --release
```
Alternatively, you could add these lines to the `.cargo/config.toml` file.
-```
+```toml
[build]
target = "wasm32-wasi"
rustflags = ["--cfg", "wasmedge", "--cfg", "tokio_unstable"]
@@ -54,7 +53,7 @@ rustflags = ["--cfg", "wasmedge", "--cfg", "tokio_unstable"]
Once you have these lines in `.cargo/config.toml`, you can simply use the regular `cargo` command.
-```
+```bash
cargo build --target wasm32-wasi --release
```
@@ -65,13 +64,12 @@ on MacOS, you need a special version of the Clang tool, released from the offici
> When you compile Rust TLS source code to Wasm on Linux, the result Wasm file is cross-platform and can run correctly on any platform with WasmEdge installed. This section is only applicable when you need to **compile** Rust TLS source code on MacOS.
-[Download the latest wasi-sdk release](https://github.com/WebAssembly/wasi-sdk/releases) for your platform and
+[Download the latest wasi-sdk release](https://github.com/WebAssembly/wasi-sdk/releases) for your platform and expand it into a directory. Point the `WASI_SDK_PATH` variable to this directory and export a `CC` variable for the default Clang.
-```
+```bash
-export WASI_SDK_PATH /path/to/wasi-sdk-22.0
+export WASI_SDK_PATH=/path/to/wasi-sdk-22.0
export CC="${WASI_SDK_PATH}/bin/clang --sysroot=${WASI_SDK_PATH}/share/wasi-sysroot"
```
That's it. Now you can use the `cargo` tools on MacOS to compile tokio libraries with `rust-tls` feature turned on.
-
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/llm_inference.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/llm_inference.md
index 54d443bcf..c2a13b5a3 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/llm_inference.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/llm_inference.md
@@ -2,44 +2,11 @@ sidebar_position: 1
---
-# Llama 2 inference
-
-WasmEdge now supports running llama2 series of models in Rust. We will use [this example project](https://github.com/second-state/LlamaEdge/tree/main/chat) to show how to make AI inferences with the llama2 model in WasmEdge and Rust.
-
-WasmEdge now supports the following models:
-
-1. Llama-2-7B-Chat
-1. Llama-2-13B-Chat
-1. CodeLlama-13B-Instruct
-1. Mistral-7B-Instruct-v0.1
-1. Mistral-7B-Instruct-v0.2
-1. MistralLite-7B
-1. OpenChat-3.5-0106
-1. OpenChat-3.5-1210
-1. OpenChat-3.5
-1.
Wizard-Vicuna-13B-Uncensored-GGUF
-1. TinyLlama-1.1B-Chat-v1.0
-1. Baichuan2-13B-Chat
-1. OpenHermes-2.5-Mistral-7B
-1. Dolphin-2.2-Yi-34B
-1. Dolphin-2.6-Mistral-7B
-1. Samantha-1.2-Mistral-7B
-1. Samantha-1.11-CodeLlama-34B
-1. WizardCoder-Python-7B-V1.0
-1. Zephyr-7B-Alpha
-1. WizardLM-13B-V1.0-Uncensored
-1. Orca-2-13B
-1. Neural-Chat-7B-v3-1
-1. Yi-34B-Chat
-1. Starling-LM-7B-alpha
-1. DeepSeek-Coder-6.7B
-1. DeepSeek-LLM-7B-Chat
-1. SOLAR-10.7B-Instruct-v1.0
-1. Mixtral-8x7B-Instruct-v0.1
-1. Nous-Hermes-2-Mixtral-8x7B-DPO
-1. Nous-Hermes-2-Mixtral-8x7B-SFT
-
-And more, please check [the supported models](https://github.com/second-state/LlamaEdge/blob/main/models.md) for detials.
+# LLM inference
+
+WasmEdge now supports running open-source Large Language Models (LLMs) in Rust. We will use [this example project](https://github.com/second-state/LlamaEdge/tree/main/chat) to show how to make AI inferences with the llama-3.1-8B model in WasmEdge and Rust.
+
+Furthermore, WasmEdge can support any open-source LLM. Please check [the supported models](https://github.com/second-state/LlamaEdge/blob/main/models.md) for details.
## Prerequisite
@@ -55,23 +22,23 @@ First, get the latest llama-chat wasm application
curl -LO https://github.com/LlamaEdge/LlamaEdge/releases/latest/download/llama-chat.wasm
```
-Next, let's get the model. In this example, we are going to use the llama2 7b chat model in GGUF format. You can also use other kinds of llama2 models, check out [here](https://github.com/second-state/llamaedge/blob/main/chat/README.md#get-model).
+Next, let's get the model. In this example, we are going to use the llama-3.1-8B model in GGUF format. You can also use other kinds of LLMs; check out the list [here](https://github.com/second-state/llamaedge/blob/main/chat/README.md#get-model).
```bash -curl -LO https://huggingface.co/wasmedge/llama2/resolve/main/llama-2-7b-chat-q5_k_m.gguf +curl -LO https://huggingface.co/second-state/Meta-Llama-3.1-8B-Instruct-GGUF/resolve/main/Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf ``` Run the inference application in WasmEdge. ```bash -wasmedge --dir .:. --nn-preload default:GGML:AUTO:llama-2-7b-chat-q5_k_m.gguf llama-chat.wasm +wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat ``` After executing the command, you may need to wait a moment for the input prompt to appear. You can enter your question once you see the `[USER]:` prompt: ```bash [USER]: -I have two apples, each costing 5 dollars. What is the total cost of these apple +I have two apples, each costing 5 dollars. What is the total cost of these apples? [ASSISTANT]: The total cost of the two apples is 10 dollars. [USER]: @@ -95,19 +62,26 @@ Second, use `cargo` to build the example project. cargo build --target wasm32-wasi --release ``` -The output WASM file is `target/wasm32-wasi/release/llama-chat.wasm`. Next, use WasmEdge to load the llama-2-7b model and then ask the model to questions. +The output WASM file is `target/wasm32-wasi/release/llama-chat.wasm`. Next, use WasmEdge to load the llama-3.1-8b model and then ask the model questions. ```bash -wasmedge --dir .:. --nn-preload default:GGML:AUTO:llama-2-7b-chat-q5_k_m.gguf llama-chat.wasm +wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat ``` -After executing the command, you may need to wait a moment for the input prompt to appear. You can enter your question once you see the `[USER]:` prompt: +After executing the command, you may need to wait a moment for the input prompt to appear. You can enter your question once you see the `[You]:` prompt: ```bash -[USER]: -Who is Robert Oppenheimer? 
-[ASSISTANT]: -Robert Oppenheimer was an American theoretical physicist and director of the Manhattan Project, which developed the atomic bomb during World War II. He is widely regarded as one of the most important physicists of the 20th century and is known for his contributions to the development of quantum mechanics and the theory of the atomic nucleus. Oppenheimer was also a prominent figure in the post-war nuclear weapons debate and was a strong advocate for international cooperation on nuclear weapons control. +[You]: +Which one is greater? 9.11 or 9.8? + +[Bot]: +9.11 is greater. + +[You]: +why + +[Bot]: +11 is greater than 8. ``` ## Options @@ -118,13 +92,13 @@ You can configure the chat inference application through CLI options. -m, --model-alias Model alias [default: default] -c, --ctx-size - Size of the prompt context [default: 4096] + Size of the prompt context [default: 512] -n, --n-predict Number of tokens to predict [default: 1024] -g, --n-gpu-layers Number of layers to run on the GPU [default: 100] -b, --batch-size - Batch size for prompt processing [default: 4096] + Batch size for prompt processing [default: 512] -r, --reverse-prompt Halt generation at PROMPT, return control. -s, --system-prompt @@ -145,22 +119,13 @@ You can configure the chat inference application through CLI options. The `--prompt-template` option is perhaps the most interesting. It allows the application to support different open source LLM models beyond llama2. Check out more prompt templates [here](https://github.com/LlamaEdge/LlamaEdge/tree/main/api-server/chat-prompts). 
-| Template name | Model | Download |
-| ------------ | ------------------------------ | --- |
-| llama-2-chat | [The standard llama2 chat model](https://ai.meta.com/llama/) | [7b](https://huggingface.co/wasmedge/llama2/resolve/main/llama-2-7b-chat-q5_k_m.gguf) |
-| codellama-instruct | [CodeLlama](https://about.fb.com/news/2023/08/code-llama-ai-for-coding/) | [7b](https://huggingface.co/TheBloke/CodeLlama-7B-Instruct-GGUF/resolve/main/codellama-7b-instruct.Q5_K_M.gguf) |
-| mistral-instruct-v0.1 | [Mistral](https://mistral.ai/) | [7b](https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/resolve/main/mistral-7b-instruct-v0.1.Q5_K_M.gguf) |
-| mistrallite | [Mistral Lite](https://huggingface.co/amazon/MistralLite) | [7b](https://huggingface.co/TheBloke/MistralLite-7B-GGUF/resolve/main/mistrallite.Q5_K_M.gguf) |
-| openchat | [OpenChat](https://github.com/imoneoi/openchat) | [7b](https://huggingface.co/TheBloke/openchat_3.5-GGUF/resolve/main/openchat_3.5.Q5_K_M.gguf) |
-| belle-llama-2-chat | [BELLE](https://github.com/LianjiaTech/BELLE) | [13b](https://huggingface.co/second-state/BELLE-Llama2-13B-Chat-0.4M-GGUF/resolve/main/BELLE-Llama2-13B-Chat-0.4M-ggml-model-q4_0.gguf) |
-| vicuna-chat | [Vicuna](https://lmsys.org/blog/2023-03-30-vicuna/) | [7b](https://huggingface.co/TheBloke/vicuna-7B-v1.5-GGUF/resolve/main/vicuna-7b-v1.5.Q5_K_M.gguf) |
-| chatml | [ChatML](https://huggingface.co/chargoddard/rpguild-chatml-13b) | [13b](https://huggingface.co/TheBloke/rpguild-chatml-13B-GGUF/resolve/main/rpguild-chatml-13b.Q5_K_M.gguf) |
+The `--ctx-size` option specifies the context window size of the application. It is limited by the model's intrinsic context window size.
-Furthermore, the following command tells WasmEdge to print out logs and statistics of the model at runtime.
+The `--log-stat` option tells WasmEdge to print out logs and statistics of the model at runtime.
```bash
-wasmedge --dir .:.
--nn-preload default:GGML:AUTO:llama-2-7b-chat-q5_k_m.gguf \
- llama-chat.wasm --prompt-template llama-2-chat --log-stat
+wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf \
+ llama-chat.wasm --prompt-template llama-3-chat --log-stat
..................................................................................................
llama_new_context_with_model: n_ctx = 512
llama_new_context_with_model: freq_base = 10000.0
@@ -184,20 +149,20 @@ You can make the inference program run faster by AOT compiling the wasm file fir
```bash
wasmedge compile llama-chat.wasm llama-chat.wasm
-wasmedge --dir .:. --nn-preload default:GGML:AUTO:llama-2-7b-chat-q5_k_m.gguf llama-chat.wasm
+wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf llama-chat.wasm -p llama-3-chat
```
## Understand the code
-The [main.rs](https://github.com/second-state/llamaedge/blob/main/chat/src/main.rs) is the full Rust code to create an interactive chatbot using a LLM. The Rust program manages the user input, tracks the conversation history, transforms the text into the llama2 and other model’s chat templates, and runs the inference operations using the WASI NN standard API. The code logic for the chat interaction is somewhat complex. In this section, we will use the [simple example](https://github.com/second-state/llamaedge/tree/main/simple) to explain how to set up and perform one inference round trip. Here is how you use the simple example.
+The [main.rs](https://github.com/second-state/llamaedge/blob/main/chat/src/main.rs) is the full Rust code to create an interactive chatbot using an LLM. The Rust program manages the user input, tracks the conversation history, transforms the text into the model’s chat templates, and runs the inference operations using the WASI NN standard API. The code logic for the chat interaction is somewhat complex.
In this section, we will use the [simple example](https://github.com/second-state/llamaedge/tree/main/simple) to explain how to set up and perform one inference round trip. Here is how you use the simple example. ```bash # Download the compiled simple inference wasm curl -LO https://github.com/second-state/llamaedge/releases/latest/download/llama-simple.wasm # Give it a prompt and ask it to use the model to complete it. -wasmedge --dir .:. --nn-preload default:GGML:AUTO:llama-2-7b-chat-q5_k_m.gguf llama-simple.wasm \ - --prompt 'Robert Oppenheimer most important achievement is ' --ctx-size 4096 +wasmedge --dir .:. --nn-preload default:GGML:AUTO:Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf llama-simple.wasm \ + --prompt 'Robert Oppenheimer most important achievement is ' --ctx-size 512 output: in 1942, when he led the team that developed the first atomic bomb, which was dropped on Hiroshima, Japan in 1945. ``` @@ -306,6 +271,6 @@ println!("\noutput: {}", output); ## Resources -* If you're looking for multi-turn conversations with llama 2 models, please check out the above mentioned chat example source code [here](https://github.com/second-state/llamaedge/tree/main/chat). +* If you're looking for multi-turn conversations with llama models, please check out the above mentioned chat example source code [here](https://github.com/second-state/llamaedge/tree/main/chat). * If you want to construct OpenAI-compatible APIs specifically for your llama2 model, or the Llama2 model itself, please check out the source code [for the API server](https://github.com/second-state/llamaedge/tree/main/api-server). * To learn more, please check out [this article](https://medium.com/stackademic/fast-and-portable-llama2-inference-on-the-heterogeneous-edge-a62508e82359). 
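The chat template handling described above is a pure string transformation. As an illustrative sketch (not LlamaEdge's actual implementation; the special tokens follow the published Llama 3 chat format), a `llama-3-chat` style template could turn conversation history into the single prompt string fed to the model like this:

```rust
// Illustrative sketch of a "llama-3-chat" style prompt template.
// The special tokens follow the published Llama 3 chat format; LlamaEdge's
// real implementation lives in the chat-prompts crate linked above.
fn format_llama3_prompt(system: &str, turns: &[(String, String)], next_user: &str) -> String {
    let mut p = String::from("<|begin_of_text|>");
    // System message first.
    p.push_str(&format!(
        "<|start_header_id|>system<|end_header_id|>\n\n{}<|eot_id|>",
        system
    ));
    // Replay the conversation history as alternating user/assistant turns.
    for (user, assistant) in turns {
        p.push_str(&format!(
            "<|start_header_id|>user<|end_header_id|>\n\n{}<|eot_id|>",
            user
        ));
        p.push_str(&format!(
            "<|start_header_id|>assistant<|end_header_id|>\n\n{}<|eot_id|>",
            assistant
        ));
    }
    // Append the new question and leave the assistant header open so the
    // model generates the reply.
    p.push_str(&format!(
        "<|start_header_id|>user<|end_header_id|>\n\n{}<|eot_id|>",
        next_user
    ));
    p.push_str("<|start_header_id|>assistant<|end_header_id|>\n\n");
    p
}

fn main() {
    let prompt = format_llama3_prompt(
        "You are a helpful assistant.",
        &[("Hi".to_string(), "Hello!".to_string())],
        "Which one is greater? 9.11 or 9.8?",
    );
    println!("{}", prompt);
}
```

This is why the `--prompt-template` option matters: the same conversation history must be serialized differently for each model family.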
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/openvino.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/openvino.md
index 76e24107d..844b959a5 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/openvino.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/openvino.md
@@ -1,5 +1,5 @@
---
-sidebar_position: 4
+sidebar_position: 3
---
# OpenVINO Backend
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/piper.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/piper.md
new file mode 100644
index 000000000..5dad97e03
--- /dev/null
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/piper.md
@@ -0,0 +1,22 @@
+---
+sidebar_position: 6
+---
+
+# Piper Backend
+
+We will use [this example project](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/wasmedge-piper) to show how to make AI inference with a Piper model in WasmEdge and Rust.
+
+## Prerequisite
+
+Besides the [regular WasmEdge and Rust requirements](../../rust/setup.md), please make sure that you have the [WASI-NN plugin with Piper installed](../../../start/install.md#wasi-nn-plug-in-with-piper-backend).
+
+## Quick start
+
+Because the example already includes a compiled WASM file from the Rust code, we can use the WasmEdge CLI to execute the example directly. First, git clone the `WasmEdge-WASINN-examples` repo.
+
+```bash
+git clone https://github.com/second-state/WasmEdge-WASINN-examples.git
+cd WasmEdge-WASINN-examples/wasmedge-piper/
+```
+
+Please follow the `README.md` to run the example.
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/pytorch.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/pytorch.md index 8a22d704c..c924d636c 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/pytorch.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/pytorch.md @@ -1,5 +1,5 @@ --- -sidebar_position: 2 +sidebar_position: 5 --- # PyTorch Backend diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tensorflow_lite.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tensorflow_lite.md index 28b217b94..5e061338b 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tensorflow_lite.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tensorflow_lite.md @@ -1,5 +1,5 @@ --- -sidebar_position: 3 +sidebar_position: 4 --- # TensorFlow Lite Backend diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tf_plugin.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tf_plugin.md index 09b48ea69..de7144a52 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tf_plugin.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/tf_plugin.md @@ -1,8 +1,8 @@ --- -sidebar_position: 5 +sidebar_position: 8 --- -# TensorFlow Plug-in For WasmEdge +# TensorFlow And TensorFlow-Lite Plug-in For WasmEdge Developers can use [WASI-NN](https://github.com/WebAssembly/wasi-nn) to inference the models. However, for the TensorFlow and TensorFlow-Lite users, the WASI-NN APIs could be more friendly to retrieve the input and output tensors. Therefore WasmEdge provides the TensorFlow-related plug-in and rust SDK for inferencing models in WASM. 
@@ -138,10 +138,3 @@ Please refer to [WasmEdge CLI](../../../start/build-and-run/cli.md) for WASM exe
:::info
Work in Progress
:::
-
-## Old WasmEdge TensorFlow extension
-
-
-:::info
-Work in Progress
-:::
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/whisper.md b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/whisper.md
new file mode 100644
index 000000000..aae57e06c
--- /dev/null
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/develop/rust/wasinn/whisper.md
@@ -0,0 +1,22 @@
+---
+sidebar_position: 7
+---
+
+# Whisper Backend
+
+We will use [this example project](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/whisper-basic) to show how to make AI inference with a Whisper model in WasmEdge and Rust.
+
+## Prerequisite
+
+Besides the [regular WasmEdge and Rust requirements](../../rust/setup.md), please make sure that you have the [WASI-NN plugin with Whisper installed](../../../start/install.md#install-wasmedge-with-plug-ins).
+
+## Quick start
+
+Because the example already includes a compiled WASM file from the Rust code, we can use the WasmEdge CLI to execute the example directly. First, git clone the `WasmEdge-WASINN-examples` repo.
+
+```bash
+git clone https://github.com/second-state/WasmEdge-WASINN-examples.git
+cd WasmEdge-WASINN-examples/whisper-basic/
+```
+
+Please follow the `README.md` to run the example.
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/embed/c/reference/latest.md b/i18n/zh/docusaurus-plugin-content-docs/current/embed/c/reference/latest.md
index 8abc3b951..cc7b45da1 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/embed/c/reference/latest.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/embed/c/reference/latest.md
@@ -2,7 +2,7 @@ sidebar_position: 1
---
-# C API 0.14.0 Documentation
+# C API 0.14.1 Documentation
[WasmEdge C API](https://github.com/WasmEdge/WasmEdge/blob/master/include/api/wasmedge/wasmedge.h) denotes an interface to access the WasmEdge runtime at version `{{ wasmedge_version }}`. The following are the guides to working with the C APIs of WasmEdge.
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/start/faq.md b/i18n/zh/docusaurus-plugin-content-docs/current/start/faq.md
index 4038d8953..bbd134129 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/start/faq.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/start/faq.md
@@ -34,4 +34,8 @@ Yes, WasmEdge can use Tensorflow as its [inference](https://wasmedge.org/docs/em
WasmEdge provides the WASI (WebAssembly System Interface) API for interacting with the host system, including file operations. You can use the [WASI API](https://wasmedge.org/docs/embed/go/reference/0.11.x?_highlight=wasi&_highlight=api#preregistrations) to open and read files from the host system.
+## 8. What's the relationship between WasmEdge and Second State?
+
+Second State contributed the WasmEdge Runtime project to the Cloud Native Computing Foundation (CNCF) and subsequently became one of the maintainers of WasmEdge. As WasmEdge seeks to broaden its community, it continues to look for additional maintainers.
+
Please remember, this FAQ page is not exhaustive, and the WasmEdge community is always ready to help with any questions or issues you may have.
Don't hesitate to reach out if you need assistance in our [Discord server](https://discord.gg/h4KDyB8XTt). diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/start/wasmedge/extensions/plugins.md b/i18n/zh/docusaurus-plugin-content-docs/current/start/wasmedge/extensions/plugins.md index 4d09e016f..d6fcece47 100644 --- a/i18n/zh/docusaurus-plugin-content-docs/current/start/wasmedge/extensions/plugins.md +++ b/i18n/zh/docusaurus-plugin-content-docs/current/start/wasmedge/extensions/plugins.md @@ -2,37 +2,36 @@ sidebar_position: 2 --- -# WasmEdge 插件 - -对于那些过于重而难以编译成 WebAssembly的工作负载,将它们构建成本机主机函数是更好的选择。为了满足 WebAssembly 运行时的可移植性,WasmEdge 引入了插件机制,使主机函数可以加载和传输。 - -WasmEdge 的插件机制是一种扩展主机模块的简便方法,用户可以通过插件从由 WasmEdge 官方发布或其他开发人员发布的共享库中加载和实例化主机函数。 - -## 官方插件 - -下面列出了 WasmEdge 官方发布的插件。用户可以通过安装程序轻松安装它们。 - -| 插件 | 描述 | 平台支持 | 语言支持 | -| --- | --- | --- | --- | -| [WasmEdge-Process](../../../contribute/source/plugin/process.md) | 允许 WebAssembly 程序在主机操作系统中执行本机命令。它支持传递参数、环境变量、`STDIN`/`STDOUT` 管道以及主机访问的安全策略。 | `manylinux2014 x86_64`,`manylinux2014 aarch64` 和 `ubuntu 20.04 x86_64`(自`0.10.0`) | [Rust](https://crates.io/crates/wasmedge_process_interface) | -| [WASI-Crypto](https://github.com/WebAssembly/wasi-crypto) | 用于运行时向 WebAssembly 模块公开的 API,以执行加密操作和密钥管理。 | `manylinux2014 x86_64`,`manylinux2014 aarch64` 和 `ubuntu 20.04 x86_64`(自`0.10.1`) | [Rust](https://crates.io/crates/wasi-crypto) | -| [WASI-NN](https://github.com/WebAssembly/wasi-nn)[(OpenVINO 后端)](../../../develop/rust/wasinn/openvino.md) | 使用 OpenVINO 模型进行 AI 推理。 | `ubuntu 20.04 x86_64`(自`0.10.1`) | [Rust](https://crates.io/crates/wasi-nn),JavaScript | -| [WASI-NN](https://github.com/WebAssembly/wasi-nn)[(Pytorch 后端)](../../../develop/rust/wasinn/pytorch.md) | 使用 Pytorch 模型进行 AI 推理。 | `manylinux2014 x86_64` 和 `ubuntu 20.04 x86_64`(自`0.11.1`) | [Rust](https://crates.io/crates/wasi-nn),JavaScript | -| [WASI-NN](https://github.com/WebAssembly/wasi-nn)[(TensorFlow-Lite 后端)](../../../develop/rust/wasinn/tensorflow_lite.md) | 
使用 TensorFlow-Lite 模型进行 AI 推理。 | `manylinux2014 x86_64`,`manylinux2014 aarch64` 和 `ubuntu 20.04 x86_64`(自`0.11.2`) | [Rust](https://crates.io/crates/wasi-nn),JavaScript | -| [WasmEdge-Image](../../../contribute/source/plugin/image.md) | 用于 AI 推理任务中处理图像的本机库。 | `manylinux2014 x86_64`,`manylinux2014 aarch64`,`ubuntu 20.04 x86_64`,`darwin x86_64` 和 `darwin arm64`(自`0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface)(0.3.0) | -| [WasmEdge-Tensorflow](../../../contribute/source/plugin/tensorflow.md) | 用于推理 TensorFlow 模型的本机库。 | `manylinux2014 x86_64`,`manylinux2014 aarch64`,`ubuntu 20.04 x86_64`,`darwin x86_64` 和 `darwin arm64`(自`0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface)(0.3.0) | -| [WasmEdge-TensorflowLite](../../../contribute/source/plugin/tensorflowlite.md) | 用于推理 TensorFlow-Lite 模型的本机库。 | `manylinux2014 x86_64`,`manylinux2014 aarch64`,`ubuntu 20.04 x86_64`,`darwin x86_64` 和 `darwin arm64`(自`0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) | -| WasmEdge-OpenCV | 一个非常流行的常用于处理图像和视频以供 AI 输入/输出函数库。 | 未发布 | Rust | -| WasmEdge-eBPF | 一个用于进行 eBPF 应用推理的原生库 | `manylinux2014 x86_64`, `manylinux2014 aarch64`, `ubuntu 20.04 x86_64`, `darwin x86_64`, and `darwin arm64` (since `0.13.0`) | Rust | -| WasmEdge-rustls | 一个用于进行 Rust 和 TLS 推理的原生库 | `manylinux2014 x86_64`, `manylinux2014 aarch64`, `ubuntu 20.04 x86_64`, `darwin x86_64`, and `darwin arm64` (since `0.13.0`) | [Rust](https://crates.io/crates/wasmedge_rustls_api) | - -## (过去的)WasmEdge 拓展 - -除了插件,WasmEdge 在 `0.13.0` 版本之前还提供了扩展功能。请注意,在 `0.13.0` 版本之后,这些扩展已经被相应的插件所取代。 - -支持这些扩展的最新版本是 `0.12.1`。当 WasmEdge 安装程序不再支持安装 `0.12.x` 版本时,本段将被废弃。 - -| 扩展 | 描述 | 平台支持 | 语言支持 | -| --- | --- | --- | --- | -| [图像处理](https://github.com/second-state/WasmEdge-image) | 用于处理人工智能推推理任务中的图像的本地库。在 WasmEdge `0.13.0` 版本后迁移到插件中。 | `manylinux2014 x86_64`,`manylinux2014 aarch64`,`android aarch64`,`ubuntu 20.04 x86_64` 和 `darwin x86_64` | 
[Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.2.2) |
-| [TensorFlow 和 Tensorflow-Lite](https://github.com/second-state/WasmEdge-tensorflow) | 用于 TensorFlow 和 TensorFlow-Lite 模型推理的本地库。在 WasmEdge `0.13.0` 版本后迁移到插件中。 | `manylinux2014 x86_64`，`manylinux2014 aarch64`（仅限TensorFlow-Lite），`android aarch64`（仅限TensorFlow-Lite），`ubuntu 20.04 x86_64` 和 `darwin x86_64` | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.2.2) |
+# WasmEdge Plug-ins
+
+For workloads that are too heavy to compile into WebAssembly, it is more appropriate to build them as native host functions. To preserve the portability of the WebAssembly runtime, WasmEdge introduced the plug-in mechanism, which makes host functions loadable and portable.
+
+The plug-in mechanism is an easy way to extend WasmEdge's host modules with loadable shared libraries. With plug-ins, users can load and instantiate host functions from shared libraries released officially by WasmEdge or by third-party developers.
+
+## Official Released Plug-ins
+
+The following table lists the officially released WasmEdge plug-ins. Users can install them easily with the installer.
+
+| Plug-in | Description | Platform Support | Guest Language Support | Build From Source |
+|---------|-------------|------------------|------------------------|-------------------|
+| [WASI-Logging](https://github.com/WebAssembly/wasi-logging) | Logging API for WebAssembly programs to log messages. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.0`) | Rust | [Steps](../../../contribute/source/plugin/wasi_logging.md) |
+| [WASI-Crypto](https://github.com/WebAssembly/wasi-crypto) | APIs that a runtime can expose to WebAssembly modules in order to perform cryptographic operations and key management. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>(since `0.10.1`) | [Rust](https://crates.io/crates/wasi-crypto) | [Steps](../../../contribute/source/plugin/wasi_crypto.md) |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(OpenVINO backend)](../../../develop/rust/wasinn/openvino.md) | AI inference using OpenVINO models. | `ubuntu 20.04 (x86_64)`<br/>(since `0.10.1`) | [Rust](https://crates.io/crates/wasi-nn), JavaScript | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-openvino-backend) |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(Pytorch backend)](../../../develop/rust/wasinn/pytorch.md) | AI inference using Pytorch models. | `manylinux2014 (x86_64)`<br/>`ubuntu 20.04 (x86_64)`<br/>(since `0.11.1`) | [Rust](https://crates.io/crates/wasi-nn), JavaScript | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-pytorch-backend) |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(TensorFlow-Lite backend)](../../../develop/rust/wasinn/tensorflow_lite.md) | AI inference using TensorFlow-Lite models. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>(since `0.11.2`) | [Rust](https://crates.io/crates/wasi-nn), JavaScript | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-tensorflow-lite-backend) |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(GGML backend)](../../../develop/rust/wasinn/llm_inference.md) | AI inference for large language models, powered by llama.cpp. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.4`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-llamacpp-backend) |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(Piper backend)](../../../develop/rust/wasinn/piper.md) | AI inference using Piper models. | `manylinux_2_28 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-piper-backend) |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) [(Whisper backend)](../../../develop/rust/wasinn/whisper.md) | AI inference using Whisper models. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | [Steps](../../../contribute/source/plugin/wasi_nn.md#build-wasmedge-with-wasi-nn-whisper-backend) |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) Burn.rs backend (Squeezenet) | AI inference using Squeezenet models in Burn.rs. | `ubuntu 20.04 (x86_64)`<br/>(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | |
+| [WASI-NN](https://github.com/WebAssembly/wasi-nn) Burn.rs backend (Whisper) | AI inference using Whisper models in Burn.rs. | `ubuntu 20.04 (x86_64)`<br/>(since `0.14.1`) | [Rust](https://github.com/second-state/wasmedge-wasi-nn) | |
+| WasmEdge-ffmpeg | A native library providing FFmpeg functions for processing audio and video. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.14.0`) | | |
+| [WasmEdge-Image](../../../contribute/source/plugin/image.md) | A native library to manipulate images for AI inference tasks. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.3.0) | [Steps](../../../contribute/source/plugin/image.md) |
+| WasmEdge-LLMC | A native library for training large language models, based on llm.c. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>(since `0.14.1`) | | |
+| WasmEdge-OpenCV | Very popular utility functions to process images and videos for AI input/output. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.3`) | | |
+| [WasmEdge-Process](../../../contribute/source/plugin/process.md) | Allows WebAssembly programs to execute native commands in the host operating system. It supports passing arguments, environment variables, `STDIN`/`STDOUT` pipes, and security policies for host access. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>(since `0.10.0`) | [Rust](https://crates.io/crates/wasmedge_process_interface) | [Steps](../../../contribute/source/plugin/process.md) |
+| WasmEdge-StableDiffusion | A native library for image generation with Stable Diffusion models. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.14.1`) | | |
+| [WasmEdge-Tensorflow](../../../contribute/source/plugin/tensorflow.md) | A native library for inferring TensorFlow models. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.3.0) | [Steps](../../../contribute/source/plugin/tensorflow.md) |
+| [WasmEdge-TensorflowLite](../../../contribute/source/plugin/tensorflowlite.md) | A native library for inferring TensorFlow-Lite models. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.0`) | [Rust](https://crates.io/crates/wasmedge_tensorflow_interface) (0.3.0) | [Steps](../../../contribute/source/plugin/tensorflowlite.md) |
+| WasmEdge-zlib | A native library implementing the zlib API for WebAssembly programs. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.5`) | | |
+| [WasmEdge-eBPF](../../../contribute/source/plugin/ebpf.md) | A native library for running eBPF applications. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>(since `0.13.2`) | Rust | [Steps](../../../contribute/source/plugin/ebpf.md) |
+| [WasmEdge-rustls](../../../contribute/source/plugin/rusttls.md) (DEPRECATED) | A native library providing TLS functionality based on the Rust `rustls` library. | `manylinux2014 (x86_64, aarch64)`<br/>`ubuntu 20.04 (x86_64)`<br/>`darwin (x86_64, arm64)`<br/>(since `0.13.0`, until `0.13.5`) | [Rust](https://crates.io/crates/wasmedge_rustls_api) | [Steps](../../../contribute/source/plugin/rusttls.md) |