diff --git a/docs/get-started/training-on-device.md b/docs/get-started/training-on-device.md index 53e57b5b02626..c22b2749f97be 100644 --- a/docs/get-started/training-on-device.md +++ b/docs/get-started/training-on-device.md @@ -1,7 +1,7 @@ --- title: On-Device Training parent: Get Started -nav_order: 12 +nav_order: 11 --- # On-Device Training with ONNX Runtime diff --git a/docs/get-started/with-javascript.md b/docs/get-started/with-javascript.md deleted file mode 100644 index 77f17520eff88..0000000000000 --- a/docs/get-started/with-javascript.md +++ /dev/null @@ -1,152 +0,0 @@ ---- -title: JavaScript -parent: Get Started -toc: true -nav_order: 6 ---- - -# Get started with ORT for JavaScript -{: .no_toc } - -ONNX Runtime JavaScript API is the unified interface used by [ONNX Runtime Node.js binding](https://github.com/microsoft/onnxruntime/tree/main/js/node), [ONNX Runtime Web](https://github.com/microsoft/onnxruntime/tree/main/js/web), and [ONNX Runtime for React Native](https://github.com/microsoft/onnxruntime/tree/main/js/react_native). - -## Contents -{: .no_toc } - -* TOC placeholder -{:toc} - -## ONNX Runtime Node.js binding -ONNX Runtime Node.js binding can be achieved by installing and importing. -### Install - -```bash -# install latest release version -npm install onnxruntime-node -``` - -### Import - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-node'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-node'); -``` - -### Examples - -- Follow the [Quick Start](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-node) instructions for ONNX Runtime Node.js binding. - -### Supported Versions - -ONNX Runtime Node.js binding supports Node.js v12.x+ or Electron v5.x+ - -## ONNX Runtime Web -You can install and import ONNX Runtime Web. 
-### Install - -```bash -# install latest release version -npm install onnxruntime-web - -# install nightly build dev version -npm install onnxruntime-web@dev -``` - -### Import - - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-web'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-web'); -``` - -If you want to use ONNX Runtime Web with WebGPU support (experimental feature), you need to import as below: - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-web/webgpu'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-web/webgpu'); -``` - -### Examples - -ONNX Runtime Web can also be imported via a script tag in a HTML file, from a CDN server. Here are some examples: -- [Quick Start (using bundler)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-bundler) -- [Quick Start (using script tag)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag) -- [ONNX Runtime Web for In Browser Inference](https://youtu.be/0dskvE4IvGM) -- [Inference in Javascript with ONNX Runtime Web](https://youtu.be/vYzWrT3A7wQ) - - -### Supported Versions - - -ONNX Runtime supports mainstream modern browsers/OS on Windows, Ubuntu, macOS, Android, and iOS. You can check the [compatibility](https://github.com/Microsoft/onnxjs#Compatibility) of ONNX Runtime with modern browsers and operating systems for your desktop and mobile platforms. In-browser inference is possible with [ONNX Runtime Web JavaScript](https://cloudblogs.microsoft.com/opensource/2021/09/02/onnx-runtime-web-running-your-machine-learning-model-in-browser/) that can enable cross-platform portability for web-applications. - - - -## ONNX Runtime for React Native -You can install and import ONNX Runtime Web for React Native. 
-### Install - - -```bash -# install latest release version -npm install onnxruntime-react-native -``` - -### Import - - -```js -// use ES6 style import syntax (recommended) -import * as ort from 'onnxruntime-react-native'; -``` -```js -// or use CommonJS style import syntax -const ort = require('onnxruntime-react-native'); -``` - - -#### Enable ONNX Runtime Extensions for React Native -To enable support for [ONNX Runtime Extensions](https://github.com/microsoft/onnxruntime-extensions) in your React Native app, -you need to specify the following configuration as a top-level entry (note: usually where the package `name`and `version`fields are) in your project's root directory `package.json` file. - -```js -"onnxruntimeExtensionsEnabled": "true" -``` - - -## Builds - -[Builds](https://onnxruntime.ai/docs/build/web.html) are published to **npm** and can be installed using `npm install` - -| Package | Artifact | Description | Supported Platforms | -|---------|-----------|-------------|---------------------| -|Node.js binding|[onnxruntime-node](https://www.npmjs.com/package/onnxruntime-node)|CPU (Release)| Windows x64 CPU NAPI_v3, Linux x64 CPU NAPI_v3, MacOS x64 CPU NAPI_v3| -|Web|[onnxruntime-web](https://www.npmjs.com/package/onnxruntime-web)|CPU and GPU|Browsers (wasm, webgl), Node.js (wasm)| -|React Native|[onnxruntime-react-native](https://www.npmjs.com/package/onnxruntime-react-native)|CPU|Android, iOS| - -- For Node.js binding, to use on platforms without pre-built binaries, you can [build Node.js binding from source](../build/inferencing.md#apis-and-language-bindings) and consume using `npm install /js/node/`. -- Consider the [options and considerations](https://onnxruntime.ai/docs/reference/build-web-app.html) for building a Web app with ONNX Runtime Web using JavaScript. -- Explore a simple web application to [classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html). 
- -## API Reference - -See [ONNX Runtime JavaScript API](../api/js/index.html){:target="_blank"} for API reference. Check out the [ONNX Runtime Web demos!](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on. - -See also: - -- [ONNX Runtime JavaScript examples and API Usage](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js). - -- Typescript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts) for reference. diff --git a/docs/get-started/with-javascript/index.md b/docs/get-started/with-javascript/index.md new file mode 100644 index 0000000000000..26c03b6366f9f --- /dev/null +++ b/docs/get-started/with-javascript/index.md @@ -0,0 +1,21 @@ +--- +title: JavaScript +parent: Get Started +has_children: true +toc: false +nav_order: 6 +--- + +# Get started with ORT for JavaScript +{: .no_toc } + +ONNX Runtime JavaScript API is the unified interface used by [ONNX Runtime Node.js binding](https://github.com/microsoft/onnxruntime/tree/main/js/node), [ONNX Runtime Web](https://github.com/microsoft/onnxruntime/tree/main/js/web), and [ONNX Runtime for React Native](https://github.com/microsoft/onnxruntime/tree/main/js/react_native). + +See [how to choose the right package](../../tutorials/web/build-web-app#options-for-deployment-target) for your JavaScript application. 
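Because the API is unified, inference code can stay platform-agnostic: only the import changes per platform. The sketch below is illustrative only (the `elementCount` and `runAnywhere` helper names are hypothetical, not part of ONNX Runtime):

```javascript
// Hypothetical sketch: the same code runs against whichever module you
// imported (onnxruntime-node, onnxruntime-web, or onnxruntime-react-native),
// since all three expose the same Tensor/InferenceSession surface.

// A tensor's data length must equal the product of its dims.
function elementCount(dims) {
  return dims.reduce((a, b) => a * b, 1);
}

async function runAnywhere(ort, modelPath, data, dims) {
  if (data.length !== elementCount(dims)) {
    throw new Error('data length does not match dims');
  }
  const session = await ort.InferenceSession.create(modelPath);
  const feeds = { [session.inputNames[0]]: new ort.Tensor('float32', data, dims) };
  return session.run(feeds);
}
```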
+ +## Contents +{: .no_toc } + +* Get Started with [ONNX Runtime Web](web.md) +* Get Started with [ONNX Runtime Node.js binding](node.md) +* Get Started with [ONNX Runtime for React Native](react-native.md) diff --git a/docs/get-started/with-javascript/node.md b/docs/get-started/with-javascript/node.md new file mode 100644 index 0000000000000..48d6a370a3852 --- /dev/null +++ b/docs/get-started/with-javascript/node.md @@ -0,0 +1,54 @@ +--- +title: Node.js binding +parent: JavaScript +grand_parent: Get Started +has_children: false +nav_order: 2 +--- + +# Get started with ONNX Runtime Node.js binding + +## Contents +{: .no_toc } + +* TOC placeholder +{:toc} + +## Install + +```bash +# install latest release version +npm install onnxruntime-node +``` + +## Import + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-node'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-node'); +``` + +## Examples + +- Follow the [Quick Start](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-node) instructions for ONNX Runtime Node.js binding. + +## Supported Versions + +The following table lists the supported versions of ONNX Runtime Node.js binding provided with pre-built binaries. + + +| EPs/Platforms | Windows x64 | Windows arm64 | Linux x64 | Linux arm64 | MacOS x64 | MacOS arm64 | +|--------------|--------|---------|--------|------|---|----| +| CPU | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | +| DirectML | ✔️ | ✔️ | ❌ | ❌ | ❌ | ❌ | +| CUDA | ❌ | ❌ | ✔️\[1] | ❌ | ❌ | ❌ | + + +- \[1]: CUDA v11.8. + + +For platforms not listed above, or if you want a custom build, you can [build the Node.js binding from source](../../build/inferencing.md#apis-and-language-bindings) and consume it using `npm install /js/node/`.
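The install and import steps above can be sketched end to end as follows. This is a minimal illustration only: the model path, input shape, and helper names (`toFloat32`, `infer`) are hypothetical, assuming a model with a single float32 input.

```javascript
// Minimal inference sketch with onnxruntime-node. Model path, shape, and
// helper names are hypothetical -- adapt them to your model.

// Plain-JS preprocessing: coerce raw numbers into the typed array a Tensor needs.
function toFloat32(values) {
  return Float32Array.from(values, Number);
}

async function infer(modelPath, values, dims) {
  // Loaded lazily here so the sketch stays self-contained; a top-level
  // require('onnxruntime-node') is the usual form.
  const ort = require('onnxruntime-node');
  const session = await ort.InferenceSession.create(modelPath);
  const input = new ort.Tensor('float32', toFloat32(values), dims);
  const results = await session.run({ [session.inputNames[0]]: input });
  return results[session.outputNames[0]].data;
}
```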
diff --git a/docs/get-started/with-javascript/react-native.md b/docs/get-started/with-javascript/react-native.md new file mode 100644 index 0000000000000..2e4ec725cd40e --- /dev/null +++ b/docs/get-started/with-javascript/react-native.md @@ -0,0 +1,46 @@ +--- +title: React Native +parent: JavaScript +grand_parent: Get Started +has_children: false +nav_order: 3 +--- + +# Get started with ONNX Runtime for React Native + +## Contents +{: .no_toc } + +* TOC placeholder +{:toc} + + +## Install + + +```bash +# install latest release version +npm install onnxruntime-react-native +``` + +## Import + + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-react-native'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-react-native'); +``` + + +### Enable ONNX Runtime Extensions for React Native +To enable support for [ONNX Runtime Extensions](https://github.com/microsoft/onnxruntime-extensions) in your React Native app, +add the following configuration as a top-level entry (note: usually alongside the package `name` and `version` fields) in the `package.json` file in your project's root directory.
+ +```js +"onnxruntimeExtensionsEnabled": "true" +``` + diff --git a/docs/get-started/with-javascript/web.md b/docs/get-started/with-javascript/web.md new file mode 100644 index 0000000000000..6a8d38da35354 --- /dev/null +++ b/docs/get-started/with-javascript/web.md @@ -0,0 +1,101 @@ +--- +title: Web +parent: JavaScript +grand_parent: Get Started +has_children: false +nav_order: 1 +--- + +# Get started with ONNX Runtime Web + +## Contents +{: .no_toc } + +* TOC placeholder +{:toc} + +## Install + +Use the following command in a shell to install ONNX Runtime Web: + +```bash +# install latest release version +npm install onnxruntime-web + +# install nightly build dev version +npm install onnxruntime-web@dev +``` + +## Import + +Use the following JavaScript code to import ONNX Runtime Web: + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-web'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-web'); +``` + +If you want to use ONNX Runtime Web with WebGPU support (an experimental feature), import it as below: + +```js +// use ES6 style import syntax (recommended) +import * as ort from 'onnxruntime-web/webgpu'; +``` +```js +// or use CommonJS style import syntax +const ort = require('onnxruntime-web/webgpu'); +``` + +For a complete table of import options, see [Conditional Importing](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/importing_onnxruntime-web#conditional-importing). + +## Documentation + +See [ONNX Runtime JavaScript API](../../api/js/index.html){:target="_blank"} for API reference. Please also check the following links for API usage examples: +- [Tensor](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_tensor) - a demonstration of basic usage of Tensor.
- [Tensor <--> Image conversion](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage-tensor-image) - a demonstration of converting between Image elements and Tensors. +- [InferenceSession](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_inference-session) - a demonstration of basic usage of InferenceSession. +- [SessionOptions](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_session-options) - a demonstration of how to configure creation of an InferenceSession instance. +- [ort.env flags](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_ort-env-flags) - a demonstration of how to configure a set of global flags. + +- See also: TypeScript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts). + +See [Tutorial: Web](../../tutorials/web/index.md) for tutorials. + +See [Training on web demo](https://github.com/microsoft/onnxruntime-training-examples/tree/master/on_device_training/web) for training using onnxruntime-web.
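The API pieces linked above compose as in the following sketch. The model URL, the `[1, 3, 224, 224]` input shape, and the `softmax`/`classify` helpers are hypothetical, assuming an image classifier with a single float32 input:

```javascript
// Typical classifier post-processing over raw output logits.
function softmax(logits) {
  const max = Math.max(...logits);
  const exps = logits.map((v) => Math.exp(v - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((v) => v / sum);
}

async function classify(modelUrl, pixels) {
  // Dynamic import keeps this sketch self-contained; a static
  // `import * as ort from 'onnxruntime-web'` is the usual form.
  const ort = await import('onnxruntime-web');
  // SessionOptions: select the wasm (CPU) execution provider explicitly.
  const session = await ort.InferenceSession.create(modelUrl, {
    executionProviders: ['wasm'],
  });
  const input = new ort.Tensor('float32', pixels, [1, 3, 224, 224]);
  const results = await session.run({ [session.inputNames[0]]: input });
  return softmax(Array.from(results[session.outputNames[0]].data));
}
```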
+ +## Examples + +The following examples describe how to use ONNX Runtime Web in your web applications for model inferencing: +- [Quick Start (using bundler)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-bundler) +- [Quick Start (using script tag)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag) + +The following are E2E examples that use ONNX Runtime Web in web applications: +- [Classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html) - a simple web application using Next.js for image classification. +- [ONNX Runtime Web demos](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on. +- [OpenAI Whisper](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/ort-whisper) - demonstrates how to run [whisper tiny.en](https://github.com/openai/whisper) in your browser using onnxruntime-web and the browser's audio interfaces. +- [Facebook Segment-Anything](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/segment-anything) - demonstrates how to run [segment-anything](https://github.com/facebookresearch/segment-anything) in your browser using onnxruntime-web with webgpu.
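For examples like Segment-Anything that target WebGPU, a session can opt in to the WebGPU execution provider with a wasm fallback. A sketch under stated assumptions (the function names are illustrative, not part of the ONNX Runtime API):

```javascript
// WebGPU is exposed as navigator.gpu in supporting browsers.
function webgpuAvailable(nav) {
  return typeof nav !== 'undefined' && nav !== null && 'gpu' in nav && !!nav.gpu;
}

async function createSession(modelUrl) {
  const ort = await import('onnxruntime-web/webgpu');
  // Execution providers are tried in order; the first one that initializes wins.
  const eps = webgpuAvailable(globalThis.navigator) ? ['webgpu', 'wasm'] : ['wasm'];
  return ort.InferenceSession.create(modelUrl, { executionProviders: eps });
}
```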
+ + +The following are video tutorials that use ONNX Runtime Web in web applications: +- [ONNX Runtime Web for In Browser Inference](https://youtu.be/0dskvE4IvGM) +- [Inference in Javascript with ONNX Runtime Web](https://youtu.be/vYzWrT3A7wQ) + + +## Supported Versions + +| EPs/Browsers | Chrome/Edge (Windows) | Chrome/Edge (Android) | Chrome/Edge (MacOS) | Chrome/Edge (iOS) | Safari (MacOS) | Safari (iOS) | Firefox (Windows) | Node.js | +|--------------|--------|---------|--------|------|---|----|------|-----| +| WebAssembly (CPU) | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️\[1] | +| WebGPU | ✔️\[2] | ✔️\[3] | ✔️ | ❌ | ❌ | ❌ | ❌ | ❌ | +| WebGL | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ✔️\[4] | ❌ | +| WebNN | ✔️\[5] | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | + +- \[1]: Node.js only supports the single-threaded `wasm` EP. +- \[2]: WebGPU requires Chromium v113 or later on Windows. Float16 support requires Chrome v121 or later, and Edge v122 or later. +- \[3]: WebGPU requires Chromium v121 or later on Android. +- \[4]: WebGL support is in maintenance mode. It is recommended to use WebGPU for better performance. +- \[5]: Requires launching the browser with the command-line flag `--enable-experimental-web-platform-features`. \ No newline at end of file diff --git a/docs/get-started/with-web.md b/docs/get-started/with-web.md deleted file mode 100644 index ab757108ba9bd..0000000000000 --- a/docs/get-started/with-web.md +++ /dev/null @@ -1,21 +0,0 @@ ---- -title: Web -description: ONNX Runtime for web-based deployments and considerations and options for building a web application with ONNX Runtime -has_toc: false -nav_order: 11 -parent: Get Started -redirect_from: /docs/reference/build-web-app - ---- - -# Get started with ONNX Runtime Web -ORT Web can be used in your web applications for model inferencing.
- -{: .no_toc} - -## Reference -* [Install ONNX Runtime Web](./../install/index.md#install-on-web-and-mobile) -* [Build from source](./../build/web.md) -* [Tutorials: Deploy on web](./../tutorials/web/index.md) - * [Guide: Build a web application with ONNX Runtime](./../tutorials/web/build-web-app) -* [Training on web demo](https://github.com/microsoft/onnxruntime-training-examples/tree/master/on_device_training/web) diff --git a/docs/tutorials/web/build-web-app.md b/docs/tutorials/web/build-web-app.md index 6e453bf8a8f7a..8dbe1bdca9b33 100644 --- a/docs/tutorials/web/build-web-app.md +++ b/docs/tutorials/web/build-web-app.md @@ -1,7 +1,7 @@ --- title: Build a web app with ONNX Runtime description: Considerations and options for building a web application with ONNX Runtime -parent: Deploy on web +parent: Web grand_parent: Tutorials nav_order: 3 redirect_from: /reference/build-web-app @@ -79,7 +79,7 @@ Add "@dev" to the package name to use the nightly build (eg. npm install onnxrun ## Consume onnxruntime-web in your code 1. Import onnxruntime-web - See [import onnxruntime-web](../../get-started/with-javascript.md#import-1) + See [import onnxruntime-web](../../get-started/with-javascript/web.md#import) 2. 
Initialize the inference session See [InferenceSession.create](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/quick-start_onnxruntime-web-bundler/main.js#L14) diff --git a/docs/tutorials/web/classify-images-nextjs-github-template.md b/docs/tutorials/web/classify-images-nextjs-github-template.md index cdea907742d73..022768f526df6 100644 --- a/docs/tutorials/web/classify-images-nextjs-github-template.md +++ b/docs/tutorials/web/classify-images-nextjs-github-template.md @@ -1,7 +1,7 @@ --- title: Classify images with ONNX Runtime and Next.js description: Classify images in a NextJS web application built from a GitHub template repo -parent: Deploy on web +parent: Web grand_parent: Tutorials has_children: false nav_order: 1 diff --git a/docs/tutorials/web/excel-addin-bert-js.md b/docs/tutorials/web/excel-addin-bert-js.md index 544ba147ea75e..a99e07a3c1291 100644 --- a/docs/tutorials/web/excel-addin-bert-js.md +++ b/docs/tutorials/web/excel-addin-bert-js.md @@ -1,7 +1,7 @@ --- title: Custom Excel Functions for BERT Tasks in JavaScript description: Custom Excel Functions for BERT Tasks in JavaScript -parent: Deploy on web +parent: Web grand_parent: Tutorials has_children: false nav_order: 2 diff --git a/docs/tutorials/web/index.md b/docs/tutorials/web/index.md index e36391237e449..32b19c8dbf752 100644 --- a/docs/tutorials/web/index.md +++ b/docs/tutorials/web/index.md @@ -1,5 +1,5 @@ --- -title: Deploy on web +title: Web parent: Tutorials has_children: true nav_order: 7