From 20c26a33531783991168ec1a8d1aaf47461085d7 Mon Sep 17 00:00:00 2001 From: Yulong Wang <7679871+fs-eire@users.noreply.github.com> Date: Sun, 18 Feb 2024 21:52:16 -0800 Subject: [PATCH 1/3] re-arrange web docs --- docs/get-started/training-on-device.md | 2 +- docs/get-started/with-javascript.md | 152 ------------------ docs/get-started/with-javascript/index.md | 49 ++++++ docs/get-started/with-javascript/node.md | 42 +++++ .../with-javascript/react-native.md | 46 ++++++ docs/get-started/with-javascript/web.md | 84 ++++++++++ docs/get-started/with-web.md | 21 --- docs/tutorials/web/build-web-app.md | 2 +- .../classify-images-nextjs-github-template.md | 2 +- docs/tutorials/web/excel-addin-bert-js.md | 2 +- docs/tutorials/web/index.md | 2 +- 11 files changed, 226 insertions(+), 178 deletions(-) delete mode 100644 docs/get-started/with-javascript.md create mode 100644 docs/get-started/with-javascript/index.md create mode 100644 docs/get-started/with-javascript/node.md create mode 100644 docs/get-started/with-javascript/react-native.md create mode 100644 docs/get-started/with-javascript/web.md delete mode 100644 docs/get-started/with-web.md diff --git a/docs/get-started/training-on-device.md b/docs/get-started/training-on-device.md index 53e57b5b02626..c22b2749f97be 100644 --- a/docs/get-started/training-on-device.md +++ b/docs/get-started/training-on-device.md @@ -1,7 +1,7 @@ --- title: On-Device Training parent: Get Started -nav_order: 12 +nav_order: 11 --- # On-Device Training with ONNX Runtime diff --git a/docs/get-started/with-javascript.md b/docs/get-started/with-javascript.md deleted file mode 100644 index 77f17520eff88..0000000000000 --- a/docs/get-started/with-javascript.md +++ /dev/null @@ -1,152 +0,0 @@ ---- -title: JavaScript -parent: Get Started -toc: true -nav_order: 6 ---- - -# Get started with ORT for JavaScript -{: .no_toc } - -ONNX Runtime JavaScript API is the unified interface used by [ONNX Runtime Node.js 
binding](https://github.com/microsoft/onnxruntime/tree/main/js/node), [ONNX Runtime Web](https://github.com/microsoft/onnxruntime/tree/main/js/web), and [ONNX Runtime for React Native](https://github.com/microsoft/onnxruntime/tree/main/js/react_native). - -## Contents -{: .no_toc } - -* TOC placeholder -{:toc} - -## ONNX Runtime Node.js binding -ONNX Runtime Node.js binding can be achieved by installing and importing. -### Install - -```bash -# install latest release version -npm install onnxruntime-node -``` - -### Import - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-node'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-node'); -``` - -### Examples - -- Follow the [Quick Start](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-node) instructions for ONNX Runtime Node.js binding. - -### Supported Versions - -ONNX Runtime Node.js binding supports Node.js v12.x+ or Electron v5.x+ - -## ONNX Runtime Web -You can install and import ONNX Runtime Web. -### Install - -```bash -# install latest release version -npm install onnxruntime-web - -# install nightly build dev version -npm install onnxruntime-web@dev -``` - -### Import - - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-web'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-web'); -``` - -If you want to use ONNX Runtime Web with WebGPU support (experimental feature), you need to import as below: - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-web/webgpu'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-web/webgpu'); -``` - -### Examples - -ONNX Runtime Web can also be imported via a script tag in a HTML file, from a CDN server. 
Here are some examples: -- [Quick Start (using bundler)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-bundler) -- [Quick Start (using script tag)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag) -- [ONNX Runtime Web for In Browser Inference](https://youtu.be/0dskvE4IvGM) -- [Inference in Javascript with ONNX Runtime Web](https://youtu.be/vYzWrT3A7wQ) - - -### Supported Versions - - -ONNX Runtime supports mainstream modern browsers/OS on Windows, Ubuntu, macOS, Android, and iOS. You can check the [compatibility](https://github.com/Microsoft/onnxjs#Compatibility) of ONNX Runtime with modern browsers and operating systems for your desktop and mobile platforms. In-browser inference is possible with [ONNX Runtime Web JavaScript](https://cloudblogs.microsoft.com/opensource/2021/09/02/onnx-runtime-web-running-your-machine-learning-model-in-browser/) that can enable cross-platform portability for web-applications. - - - -## ONNX Runtime for React Native -You can install and import ONNX Runtime Web for React Native. -### Install - - -```bash -# install latest release version -npm install onnxruntime-react-native -``` - -### Import - - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-react-native'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-react-native'); -``` - - -#### Enable ONNX Runtime Extensions for React Native -To enable support for [ONNX Runtime Extensions](https://github.com/microsoft/onnxruntime-extensions) in your React Native app, -you need to specify the following configuration as a top-level entry (note: usually where the package `name`and `version`fields are) in your project's root directory `package.json` file. 
- -```js -"onnxruntimeExtensionsEnabled": "true" -``` - - -## Builds - -[Builds](https://onnxruntime.ai/docs/build/web.html) are published to **npm** and can be installed using `npm install` - -| Package | Artifact | Description | Supported Platforms | -|---------|-----------|-------------|---------------------| -|Node.js binding|[onnxruntime-node](https://www.npmjs.com/package/onnxruntime-node)|CPU (Release)| Windows x64 CPU NAPI_v3, Linux x64 CPU NAPI_v3, MacOS x64 CPU NAPI_v3| -|Web|[onnxruntime-web](https://www.npmjs.com/package/onnxruntime-web)|CPU and GPU|Browsers (wasm, webgl), Node.js (wasm)| -|React Native|[onnxruntime-react-native](https://www.npmjs.com/package/onnxruntime-react-native)|CPU|Android, iOS| - -- For Node.js binding, to use on platforms without pre-built binaries, you can [build Node.js binding from source](../build/inferencing.md#apis-and-language-bindings) and consume using `npm install /js/node/`. -- Consider the [options and considerations](https://onnxruntime.ai/docs/reference/build-web-app.html) for building a Web app with ONNX Runtime Web using JavaScript. -- Explore a simple web application to [classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html). - -## API Reference - -See [ONNX Runtime JavaScript API](../api/js/index.html){:target="_blank"} for API reference. Check out the [ONNX Runtime Web demos!](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on. - -See also: - -- [ONNX Runtime JavaScript examples and API Usage](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js). 
- -- Typescript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts) for reference. diff --git a/docs/get-started/with-javascript/index.md b/docs/get-started/with-javascript/index.md new file mode 100644 index 0000000000000..62438b7975a8b --- /dev/null +++ b/docs/get-started/with-javascript/index.md @@ -0,0 +1,49 @@ +--- +title: JavaScript +parent: Get Started +has_children: true +toc: true +nav_order: 6 +--- + +# Get started with ORT for JavaScript +{: .no_toc } + +ONNX Runtime JavaScript API is the unified interface used by [ONNX Runtime Node.js binding](https://github.com/microsoft/onnxruntime/tree/main/js/node), [ONNX Runtime Web](https://github.com/microsoft/onnxruntime/tree/main/js/web), and [ONNX Runtime for React Native](https://github.com/microsoft/onnxruntime/tree/main/js/react_native). + +See [how to choose the right package](../tutorials/web/build-web-app#options-for-deployment-target) for your JavaScript application. + +## Contents +{: .no_toc } + +* Get Started with [ONNX Runtime Web](web.md) +* Get Started with [ONNX Runtime Node.js binding](node.md) +* Get Started with [ONNX Runtime for React Native](react-native.md) +* [Builds](#builds) +* [API Reference](#api-reference) + +## Builds + +[Builds](https://onnxruntime.ai/docs/build/web.html) are published to **npm** and can be installed using `npm install` + +| Package | Artifact | Description | Supported Platforms | +|---------|-----------|-------------|---------------------| +|Node.js binding|[onnxruntime-node](https://www.npmjs.com/package/onnxruntime-node)|CPU and GPU (Release/NAPI_v3)| Windows x64: cpu, dml
Windows arm64: cpu, dml
Linux x64: cpu, cuda
Linux arm64: cpu
MacOS x64: cpu
MacOS arm64: cpu| +|Web|[onnxruntime-web](https://www.npmjs.com/package/onnxruntime-web)|CPU and GPU|Chromium Browsers (Chrome, Edge): wasm, webgl, webgpu, webnn
Safari: wasm, webgl
Other Browsers: wasm
Node.js: wasm| +|React Native|[onnxruntime-react-native](https://www.npmjs.com/package/onnxruntime-react-native)|CPU|Android, iOS| + +- For Web, pre-built binaries are published in NPM package as well as served in CDNs. See [Deploy ONNX Runtime Web](TBD) for more details. If you want to use a custom build, you can [build ONNX Runtime Web from source](../build/web.md). +- For Node.js binding, to use on platforms without pre-built binaries, you can [build Node.js binding from source](../build/inferencing.md#apis-and-language-bindings) and consume using `npm install /js/node/`. +- Explore a simple web application to [classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html). + +## API Reference + +See [ONNX Runtime JavaScript API](../api/js/index.html){:target="_blank"} for API reference. + +See also: + +- [ONNX Runtime JavaScript examples and API Usage](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js). + +- [ONNX Runtime Web demos](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on. + +- Typescript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts) for reference. 
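The image-classification sample linked above ends with the same post-processing step every classifier needs: turning the model's raw output logits into class probabilities and picking the top label. A plain-JavaScript sketch of that step (names are illustrative; it is independent of the ORT API, though the `data` of an output `Tensor` is a `Float32Array` that works the same way):

```javascript
// Numerically stable softmax: subtract the max logit before exponentiating
// so Math.exp never overflows for large logits.
function softmax(logits) {
  const max = Math.max(...logits);
  const exps = Array.from(logits, (v) => Math.exp(v - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((v) => v / sum);
}

// Pick the index of the highest probability (the predicted class).
function top1(probs) {
  let best = 0;
  for (let i = 1; i < probs.length; i++) {
    if (probs[i] > probs[best]) best = i;
  }
  return { index: best, probability: probs[best] };
}

const probs = softmax([1.0, 2.0, 0.5]);
const best = top1(probs); // best.index is 1, the position of the largest logit
```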
diff --git a/docs/get-started/with-javascript/node.md b/docs/get-started/with-javascript/node.md new file mode 100644 index 0000000000000..af1e934d0fde8 --- /dev/null +++ b/docs/get-started/with-javascript/node.md @@ -0,0 +1,42 @@ +--- +title: Node.js binding +parent: JavaScript +grand_parent: Get Started +has_children: false +nav_order: 2 +--- + +# Get started with ONNX Runtime Node.js binding + +## Contents +{: .no_toc } + +* TOC placeholder +{:toc} + +## Install + +```bash +# install latest release version +npm install onnxruntime-node +``` + +## Import + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-node'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-node'); +``` + +## Examples + +- Follow the [Quick Start](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-node) instructions for ONNX Runtime Node.js binding. + +## Supported Versions + +ONNX Runtime Node.js binding supports Node.js v12.x+ or Electron v5.x+ + diff --git a/docs/get-started/with-javascript/react-native.md b/docs/get-started/with-javascript/react-native.md new file mode 100644 index 0000000000000..2e4ec725cd40e --- /dev/null +++ b/docs/get-started/with-javascript/react-native.md @@ -0,0 +1,46 @@ +--- +title: React Native +parent: JavaScript +grand_parent: Get Started +has_children: false +nav_order: 3 +--- + +# Get started with ONNX Runtime for React Native + +## Contents +{: .no_toc } + +* TOC placeholder +{:toc} + + +## Install + + +```bash +# install latest release version +npm install onnxruntime-react-native +``` + +## Import + + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-react-native'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-react-native'); +``` + + +### Enable ONNX Runtime Extensions for React Native +To enable support for [ONNX Runtime 
Extensions](https://github.com/microsoft/onnxruntime-extensions) in your React Native app, +you need to specify the following configuration as a top-level entry (note: usually where the package `name` and `version` fields are) in your project's root directory `package.json` file. + +```js +"onnxruntimeExtensionsEnabled": "true" +``` + diff --git a/docs/get-started/with-javascript/web.md new file mode 100644 index 0000000000000..4ebcc4ffe767f --- /dev/null +++ b/docs/get-started/with-javascript/web.md @@ -0,0 +1,84 @@ +--- +title: Web +parent: JavaScript +grand_parent: Get Started +has_children: false +nav_order: 1 +--- + +# Get started with ONNX Runtime Web + +## Contents +{: .no_toc } + +* TOC placeholder +{:toc} + +## Install + +Use the following command in a shell to install ONNX Runtime Web: + +```bash +# install latest release version +npm install onnxruntime-web + +# install nightly build dev version +npm install onnxruntime-web@dev +``` + +## Import + +Use the following JavaScript code to import ONNX Runtime Web: + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-web'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-web'); +``` + +If you want to use ONNX Runtime Web with WebGPU support (experimental feature), you need to import it as below: + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-web/webgpu'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-web/webgpu'); +``` + +For a complete table for importing, see [Conditional Importing](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/importing_onnxruntime-web#conditional-importing). + +## Documentation + +See [Tutorial: Web](../tutorials/web/index.md) for more details.
Please also check the following links: +- [Tensor](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_tensor) - a demonstration of basic usage of Tensor. +- [Tensor <--> Image conversion](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage-tensor-image) - a demonstration of converting Image elements to and from Tensors. +- [InferenceSession](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_inference-session) - a demonstration of basic usage of InferenceSession. +- [SessionOptions](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_session-options) - a demonstration of how to configure creation of an InferenceSession instance. +- [ort.env flags](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_ort-env-flags) - a demonstration of how to configure a set of global flags. + +See [Training on web demo](https://github.com/microsoft/onnxruntime-training-examples/tree/master/on_device_training/web) for training using onnxruntime-web. + +## Examples + +The following examples describe how to use ONNX Runtime Web in your web applications for model inferencing: +- [Quick Start (using bundler)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-bundler) +- [Quick Start (using script tag)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag) + +The following are E2E examples that use ONNX Runtime Web in web applications: +- [OpenAI Whisper](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/ort-whisper) - demonstrates how to run [whisper tiny.en](https://github.com/openai/whisper) in your browser using onnxruntime-web and the browser's audio interfaces.
+- [Facebook Segment-Anything](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/segment-anything) - demonstrates how to run [segment-anything](https://github.com/facebookresearch/segment-anything) in your browser using onnxruntime-web with webgpu. + +The following are video tutorials that use ONNX Runtime Web in web applications: +- [ONNX Runtime Web for In Browser Inference](https://youtu.be/0dskvE4IvGM) +- [Inference in Javascript with ONNX Runtime Web](https://youtu.be/vYzWrT3A7wQ) + + +## Supported Versions + + +ONNX Runtime supports mainstream modern browsers/OS on Windows, Ubuntu, macOS, Android, and iOS. Specifically, for Chromium-based browsers, ONNX Runtime Web supports wasm, webgl, webgpu, and webnn EPs. For Safari, ONNX Runtime Web supports wasm and webgl EPs. For other browsers or Node.js, ONNX Runtime Web supports wasm EP. diff --git a/docs/get-started/with-web.md b/docs/get-started/with-web.md deleted file mode 100644 index ab757108ba9bd..0000000000000 --- a/docs/get-started/with-web.md +++ /dev/null @@ -1,21 +0,0 @@ ---- -title: Web -description: ONNX Runtime for web-based deployments and considerations and options for building a web application with ONNX Runtime -has_toc: false -nav_order: 11 -parent: Get Started -redirect_from: /docs/reference/build-web-app - ---- - -# Get started with ONNX Runtime Web -ORT Web can be used in your web applications for model inferencing. 
- -{: .no_toc} - -## Reference -* [Install ONNX Runtime Web](./../install/index.md#install-on-web-and-mobile) -* [Build from source](./../build/web.md) -* [Tutorials: Deploy on web](./../tutorials/web/index.md) - * [Guide: Build a web application with ONNX Runtime](./../tutorials/web/build-web-app) -* [Training on web demo](https://github.com/microsoft/onnxruntime-training-examples/tree/master/on_device_training/web) diff --git a/docs/tutorials/web/build-web-app.md b/docs/tutorials/web/build-web-app.md index 6e453bf8a8f7a..6fd14260a9e78 100644 --- a/docs/tutorials/web/build-web-app.md +++ b/docs/tutorials/web/build-web-app.md @@ -1,7 +1,7 @@ --- title: Build a web app with ONNX Runtime description: Considerations and options for building a web application with ONNX Runtime -parent: Deploy on web +parent: Web grand_parent: Tutorials nav_order: 3 redirect_from: /reference/build-web-app diff --git a/docs/tutorials/web/classify-images-nextjs-github-template.md b/docs/tutorials/web/classify-images-nextjs-github-template.md index cdea907742d73..022768f526df6 100644 --- a/docs/tutorials/web/classify-images-nextjs-github-template.md +++ b/docs/tutorials/web/classify-images-nextjs-github-template.md @@ -1,7 +1,7 @@ --- title: Classify images with ONNX Runtime and Next.js description: Classify images in a NextJS web application built from a GitHub template repo -parent: Deploy on web +parent: Web grand_parent: Tutorials has_children: false nav_order: 1 diff --git a/docs/tutorials/web/excel-addin-bert-js.md b/docs/tutorials/web/excel-addin-bert-js.md index 544ba147ea75e..a99e07a3c1291 100644 --- a/docs/tutorials/web/excel-addin-bert-js.md +++ b/docs/tutorials/web/excel-addin-bert-js.md @@ -1,7 +1,7 @@ --- title: Custom Excel Functions for BERT Tasks in JavaScript description: Custom Excel Functions for BERT Tasks in JavaScript -parent: Deploy on web +parent: Web grand_parent: Tutorials has_children: false nav_order: 2 diff --git a/docs/tutorials/web/index.md 
b/docs/tutorials/web/index.md index e36391237e449..32b19c8dbf752 100644 --- a/docs/tutorials/web/index.md +++ b/docs/tutorials/web/index.md @@ -1,5 +1,5 @@ --- -title: Deploy on web +title: Web parent: Tutorials has_children: true nav_order: 7 From 9befc5c0f3d7233c9788faf791c279d57ef6eaad Mon Sep 17 00:00:00 2001 From: Yulong Wang <7679871+fs-eire@users.noreply.github.com> Date: Mon, 19 Feb 2024 15:52:51 -0800 Subject: [PATCH 2/3] fix 404 --- docs/get-started/with-javascript/index.md | 8 ++++---- docs/get-started/with-javascript/web.md | 2 +- docs/tutorials/web/build-web-app.md | 2 +- 3 files changed, 6 insertions(+), 6 deletions(-) diff --git a/docs/get-started/with-javascript/index.md b/docs/get-started/with-javascript/index.md index 62438b7975a8b..c62d8a5c87705 100644 --- a/docs/get-started/with-javascript/index.md +++ b/docs/get-started/with-javascript/index.md @@ -11,7 +11,7 @@ nav_order: 6 ONNX Runtime JavaScript API is the unified interface used by [ONNX Runtime Node.js binding](https://github.com/microsoft/onnxruntime/tree/main/js/node), [ONNX Runtime Web](https://github.com/microsoft/onnxruntime/tree/main/js/web), and [ONNX Runtime for React Native](https://github.com/microsoft/onnxruntime/tree/main/js/react_native). -See [how to choose the right package](../tutorials/web/build-web-app#options-for-deployment-target) for your JavaScript application. +See [how to choose the right package](../../tutorials/web/build-web-app#options-for-deployment-target) for your JavaScript application. ## Contents {: .no_toc } @@ -32,13 +32,13 @@ See [how to choose the right package](../tutorials/web/build-web-app#options-for |Web|[onnxruntime-web](https://www.npmjs.com/package/onnxruntime-web)|CPU and GPU|Chromium Browsers (Chrome, Edge): wasm, webgl, webgpu, webnn
Safari: wasm, webgl
Other Browsers: wasm
Node.js: wasm| |React Native|[onnxruntime-react-native](https://www.npmjs.com/package/onnxruntime-react-native)|CPU|Android, iOS| -- For Web, pre-built binaries are published in NPM package as well as served in CDNs. See [Deploy ONNX Runtime Web](TBD) for more details. If you want to use a custom build, you can [build ONNX Runtime Web from source](../build/web.md). -- For Node.js binding, to use on platforms without pre-built binaries, you can [build Node.js binding from source](../build/inferencing.md#apis-and-language-bindings) and consume using `npm install /js/node/`. +- For Web, pre-built binaries are published in NPM package as well as served in CDNs. If you want to use a custom build, you can [build ONNX Runtime Web from source](../../build/web.md). +- For Node.js binding, to use on platforms without pre-built binaries, you can [build Node.js binding from source](../../build/inferencing.md#apis-and-language-bindings) and consume using `npm install /js/node/`. - Explore a simple web application to [classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html). ## API Reference -See [ONNX Runtime JavaScript API](../api/js/index.html){:target="_blank"} for API reference. +See [ONNX Runtime JavaScript API](../../api/js/index.html){:target="_blank"} for API reference. See also: diff --git a/docs/get-started/with-javascript/web.md b/docs/get-started/with-javascript/web.md index 4ebcc4ffe767f..fd7c8e5638f27 100644 --- a/docs/get-started/with-javascript/web.md +++ b/docs/get-started/with-javascript/web.md @@ -54,7 +54,7 @@ For a complete table for importing, see [Conditional Importing](https://github.c ## Documentation -See [Tutorial: Web](../tutorials/web/index.md) for more details. Please also check the following links: +See [Tutorial: Web](../../tutorials/web/index.md) for more details. 
Please also check the following links: - [Tensor](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_tensor) - a demonstration of basic usage of Tensor. - [Tensor <--> Image conversion](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage-tensor-image) - a demonstration of conversions from Image elements to and from Tensor. - [InferenceSession](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_inference-session) - a demonstration of basic usage of InferenceSession. diff --git a/docs/tutorials/web/build-web-app.md b/docs/tutorials/web/build-web-app.md index 6fd14260a9e78..8dbe1bdca9b33 100644 --- a/docs/tutorials/web/build-web-app.md +++ b/docs/tutorials/web/build-web-app.md @@ -79,7 +79,7 @@ Add "@dev" to the package name to use the nightly build (eg. npm install onnxrun ## Consume onnxruntime-web in your code 1. Import onnxruntime-web - See [import onnxruntime-web](../../get-started/with-javascript.md#import-1) + See [import onnxruntime-web](../../get-started/with-javascript/web.md#import) 2. 
Initialize the inference session See [InferenceSession.create](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/quick-start_onnxruntime-web-bundler/main.js#L14) From 1271da4ce4966bb3b9a513a84c88b58f926eeae8 Mon Sep 17 00:00:00 2001 From: Yulong Wang <7679871+fs-eire@users.noreply.github.com> Date: Mon, 19 Feb 2024 20:34:17 -0800 Subject: [PATCH 3/3] revise index --- docs/get-started/with-javascript/index.md | 30 +---------------------- docs/get-started/with-javascript/node.md | 14 ++++++++++- docs/get-started/with-javascript/web.md | 23 ++++++++++++++--- 3 files changed, 34 insertions(+), 33 deletions(-) diff --git a/docs/get-started/with-javascript/index.md b/docs/get-started/with-javascript/index.md index c62d8a5c87705..26c03b6366f9f 100644 --- a/docs/get-started/with-javascript/index.md +++ b/docs/get-started/with-javascript/index.md @@ -2,7 +2,7 @@ title: JavaScript parent: Get Started has_children: true -toc: true +toc: false nav_order: 6 --- @@ -19,31 +19,3 @@ See [how to choose the right package](../../tutorials/web/build-web-app#options- * Get Started with [ONNX Runtime Web](web.md) * Get Started with [ONNX Runtime Node.js binding](node.md) * Get Started with [ONNX Runtime for React Native](react-native.md) -* [Builds](#builds) -* [API Reference](#api-reference) - -## Builds - -[Builds](https://onnxruntime.ai/docs/build/web.html) are published to **npm** and can be installed using `npm install` - -| Package | Artifact | Description | Supported Platforms | -|---------|-----------|-------------|---------------------| -|Node.js binding|[onnxruntime-node](https://www.npmjs.com/package/onnxruntime-node)|CPU and GPU (Release/NAPI_v3)| Windows x64: cpu, dml
Windows arm64: cpu, dml
Linux x64: cpu, cuda
Linux arm64: cpu
MacOS x64: cpu
MacOS arm64: cpu| -|Web|[onnxruntime-web](https://www.npmjs.com/package/onnxruntime-web)|CPU and GPU|Chromium Browsers (Chrome, Edge): wasm, webgl, webgpu, webnn
Safari: wasm, webgl
Other Browsers: wasm
Node.js: wasm| -|React Native|[onnxruntime-react-native](https://www.npmjs.com/package/onnxruntime-react-native)|CPU|Android, iOS| - -- For Web, pre-built binaries are published in NPM package as well as served in CDNs. If you want to use a custom build, you can [build ONNX Runtime Web from source](../../build/web.md). -- For Node.js binding, to use on platforms without pre-built binaries, you can [build Node.js binding from source](../../build/inferencing.md#apis-and-language-bindings) and consume using `npm install /js/node/`. -- Explore a simple web application to [classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html). - -## API Reference - -See [ONNX Runtime JavaScript API](../../api/js/index.html){:target="_blank"} for API reference. - -See also: - -- [ONNX Runtime JavaScript examples and API Usage](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js). - -- [ONNX Runtime Web demos](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on. - -- Typescript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts) for reference. diff --git a/docs/get-started/with-javascript/node.md b/docs/get-started/with-javascript/node.md index af1e934d0fde8..48d6a370a3852 100644 --- a/docs/get-started/with-javascript/node.md +++ b/docs/get-started/with-javascript/node.md @@ -38,5 +38,17 @@ const ort = require('onnxruntime-node'); ## Supported Versions -ONNX Runtime Node.js binding supports Node.js v12.x+ or Electron v5.x+ +The following table lists the supported versions of ONNX Runtime Node.js binding provided with pre-built binaries. 
+ +| EPs/Platforms | Windows x64 | Windows arm64 | Linux x64 | Linux arm64 | MacOS x64 | MacOS arm64 | +|--------------|--------|---------|--------|------|---|----| +| CPU | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | +| DirectML | ✔️ | ✔️ | ❌ | ❌ | ❌ | ❌ | +| CUDA | ❌ | ❌ | ✔️\[1] | ❌ | ❌ | ❌ | + + +- \[1]: CUDA v11.8. + + +For platforms not on the list, or if you want a custom build, you can [build Node.js binding from source](../../build/inferencing.md#apis-and-language-bindings) and consume using `npm install /js/node/`. diff --git a/docs/get-started/with-javascript/web.md index fd7c8e5638f27..6a8d38da35354 100644 --- a/docs/get-started/with-javascript/web.md +++ b/docs/get-started/with-javascript/web.md @@ -54,7 +54,7 @@ For a complete table for importing, see [Conditional Importing](https://github.c ## Documentation -See [Tutorial: Web](../../tutorials/web/index.md) for more details. Please also check the following links: +See [ONNX Runtime JavaScript API](../../api/js/index.html){:target="_blank"} for API reference. Please also check the following links for API usage examples: - [Tensor](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_tensor) - a demonstration of basic usage of Tensor. - [Tensor <--> Image conversion](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage-tensor-image) - a demonstration of converting Image elements to and from Tensors. - [InferenceSession](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_inference-session) - a demonstration of basic usage of InferenceSession. - [SessionOptions](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_session-options) - a demonstration of how to configure creation of an InferenceSession instance.
- [ort.env flags](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_ort-env-flags) - a demonstration of how to configure a set of global flags. +- See also: Typescript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts). + +See [Tutorial: Web](../../tutorials/web/index.md) for tutorials. + See [Training on web demo](https://github.com/microsoft/onnxruntime-training-examples/tree/master/on_device_training/web) for training using onnxruntime-web. ## Examples The following examples describe how to use ONNX Runtime Web in your web applications for model inferencing: - [Quick Start (using bundler)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-bundler) - [Quick Start (using script tag)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag) The following are E2E examples that use ONNX Runtime Web in web applications: +- [Classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html) - a simple web application using Next.js for image classification. +- [ONNX Runtime Web demos](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on. - [OpenAI Whisper](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/ort-whisper) - demonstrates how to run [whisper tiny.en](https://github.com/openai/whisper) in your browser using onnxruntime-web and the browser's audio interfaces. - [Facebook Segment-Anything](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/segment-anything) - demonstrates how to run [segment-anything](https://github.com/facebookresearch/segment-anything) in your browser using onnxruntime-web with webgpu.
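The Tensor <--> Image conversion demo listed above centers on one transform: reordering interleaved RGBA canvas pixels into the planar NCHW float layout image models expect. A minimal plain-JavaScript sketch of that transform (illustrative only, not the demo's actual code; the function name is made up):

```javascript
// Convert interleaved RGBA pixel data (e.g. from a canvas ImageData buffer)
// into planar NCHW float data normalized to [0, 1]. The alpha channel is
// dropped; the result is suitable as the data of a [1, 3, height, width] tensor.
function rgbaToNchw(rgba, width, height) {
  const plane = width * height;
  const out = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    out[i] = rgba[4 * i] / 255;                 // R plane
    out[plane + i] = rgba[4 * i + 1] / 255;     // G plane
    out[2 * plane + i] = rgba[4 * i + 2] / 255; // B plane
  }
  return out;
}
```

In an app the result would typically be wrapped as `new ort.Tensor('float32', data, [1, 3, height, width])` before being passed to a session.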
+ +The following are video tutorials that use ONNX Runtime Web in web applications: - [ONNX Runtime Web for In Browser Inference](https://youtu.be/0dskvE4IvGM) - [Inference in Javascript with ONNX Runtime Web](https://youtu.be/vYzWrT3A7wQ) + ## Supported Versions - -ONNX Runtime supports mainstream modern browsers/OS on Windows, Ubuntu, macOS, Android, and iOS. Specifically, for Chromium-based browsers, ONNX Runtime Web supports wasm, webgl, webgpu, and webnn EPs. For Safari, ONNX Runtime Web supports wasm and webgl EPs. For other browsers or Node.js, ONNX Runtime Web supports wasm EP. +| EPs/Browsers | Chrome/Edge (Windows) | Chrome/Edge (Android) | Chrome/Edge (MacOS) | Chrome/Edge (iOS) | Safari (MacOS) | Safari (iOS) | Firefox (Windows) | Node.js | +|--------------|--------|---------|--------|------|---|----|------|-----| +| WebAssembly (CPU) | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️\[1] | +| WebGPU | ✔️\[2] | ✔️\[3] | ✔️ | ❌ | ❌ | ❌ | ❌ | ❌ | +| WebGL | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ❌ | +| WebNN | ✔️\[5] | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | + +- \[1]: Node.js only supports the single-threaded `wasm` EP. +- \[2]: WebGPU requires Chromium v113 or later on Windows. Float16 support requires Chrome v121 or later, and Edge v122 or later. +- \[3]: WebGPU requires Chromium v121 or later on Android. +- \[4]: WebGL support is in maintenance mode. It is recommended to use WebGPU for better performance. +- \[5]: Requires launching the browser with the command-line flag `--enable-experimental-web-platform-features`. \ No newline at end of file
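Given the table above, a web app typically asks for `webgpu` where the browser exposes it and falls back to `wasm` everywhere else. A sketch of that choice (the `pickExecutionProviders` helper is hypothetical; `executionProviders` is the real session-option name, and WebGPU availability is detected via `navigator.gpu`):

```javascript
// Build an executionProviders list for InferenceSession.create() based on
// what the current environment supports. `nav` is passed in (normally the
// browser's `navigator`) so the selection logic can be tested outside a browser.
function pickExecutionProviders(nav) {
  const eps = [];
  if (nav && 'gpu' in nav) eps.push('webgpu'); // WebGPU exposed (Chromium-based browsers)
  eps.push('wasm'); // WebAssembly backend works on every platform in the table
  return eps;
}

// Usage in a browser (sketch):
//   const session = await ort.InferenceSession.create('model.onnx',
//     { executionProviders: pickExecutionProviders(navigator) });
```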