[doc] Update "Get Started" page for ORT web (#19568)
### Description
This PR re-arranges documentation of ORT web.

Current doc: https://onnxruntime.ai/docs/
Change preview: https://fs-eire.github.io/onnxruntime/docs/

- split `Doc -> Get Started -> JavaScript` from a single page into
multiple pages, making it easier to separate the web, node, and
react-native content
- remove `Doc -> Get Started -> with Web`. The page contained only a
few links, which have now been moved to other places
- add content to `Doc -> Get Started -> JavaScript -> Web`. This is now
the main page for the ORT web doc.
- rename `Tutorials -> deploy on Web` to `Tutorials -> Web`, since
"deploy" is not an accurate term for the content and was confusing.

=================================================================


### Discussions:
TBD


=================================================================


### ORT web documentation work item list:

- [ ] Update `Get Started` page for ORT web at onnxruntime.ai **(this PR)**
    - to make it easier for users to navigate
- [ ] Update `Tutorials` page for ORT web at onnxruntime.ai
    - The purpose of this page is unclear. It contains descriptive
information and step-by-step instructions for E2E examples. That makes
a good blog post, but not good documentation, which usually has
well-structured content.
    - The [index page](https://fs-eire.github.io/onnxruntime/docs/tutorials/web/)
duplicates content from its child page ["Build a web application with
ONNX Runtime"](https://fs-eire.github.io/onnxruntime/docs/tutorials/web/build-web-app.html).
    - Some content is outdated.
- [ ] Update [`API Usage -
SessionOptions`](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/api-usage_session-options)
in the onnxruntime-inference-examples repo.
    - Add explanations and examples, where necessary, for all new WebGPU
session options.
- [ ] Update [`API usage - ort.env
flags`](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_ort-env-flags)
in the onnxruntime-inference-examples repo.
    - Add all newly introduced flags.
fs-eire authored Feb 23, 2024
1 parent 154a0be commit cfdb434
Showing 11 changed files with 228 additions and 179 deletions.
2 changes: 1 addition & 1 deletion docs/get-started/training-on-device.md
@@ -1,7 +1,7 @@
---
title: On-Device Training
parent: Get Started
nav_order: 12
nav_order: 11
---

# On-Device Training with ONNX Runtime
Expand Down
152 changes: 0 additions & 152 deletions docs/get-started/with-javascript.md

This file was deleted.

21 changes: 21 additions & 0 deletions docs/get-started/with-javascript/index.md
@@ -0,0 +1,21 @@
---
title: JavaScript
parent: Get Started
has_children: true
toc: false
nav_order: 6
---

# Get started with ORT for JavaScript
{: .no_toc }

ONNX Runtime JavaScript API is the unified interface used by [ONNX Runtime Node.js binding](https://github.com/microsoft/onnxruntime/tree/main/js/node), [ONNX Runtime Web](https://github.com/microsoft/onnxruntime/tree/main/js/web), and [ONNX Runtime for React Native](https://github.com/microsoft/onnxruntime/tree/main/js/react_native).

See [how to choose the right package](../../tutorials/web/build-web-app#options-for-deployment-target) for your JavaScript application.

## Contents
{: .no_toc }

* Get Started with [ONNX Runtime Web](web.md)
* Get Started with [ONNX Runtime Node.js binding](node.md)
* Get Started with [ONNX Runtime for React Native](react-native.md)
54 changes: 54 additions & 0 deletions docs/get-started/with-javascript/node.md
@@ -0,0 +1,54 @@
---
title: Node.js binding
parent: JavaScript
grand_parent: Get Started
has_children: false
nav_order: 2
---

# Get started with ONNX Runtime Node.js binding

## Contents
{: .no_toc }

* TOC placeholder
{:toc}

## Install

```bash
# install latest release version
npm install onnxruntime-node
```

## Import

```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-node';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-node');
```

## Examples

- Follow the [Quick Start](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-node) instructions for ONNX Runtime Node.js binding.

## Supported Versions

The following table lists the supported versions of ONNX Runtime Node.js binding provided with pre-built binaries.


| EPs/Platforms | Windows x64 | Windows arm64 | Linux x64 | Linux arm64 | MacOS x64 | MacOS arm64 |
|--------------|--------|---------|--------|------|---|----|
| CPU | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| DirectML | ✔️ | ✔️ |||||
| CUDA ||| ✔️<sup>\[1]</sup> ||||


- \[1]: CUDA v11.8.


For platforms not on the list, or if you want a custom build, you can [build the Node.js binding from source](../../build/inferencing.md#apis-and-language-bindings) and consume it using `npm install <onnxruntime_repo_root>/js/node/`.
46 changes: 46 additions & 0 deletions docs/get-started/with-javascript/react-native.md
@@ -0,0 +1,46 @@
---
title: React Native
parent: JavaScript
grand_parent: Get Started
has_children: false
nav_order: 3
---

# Get started with ONNX Runtime for React Native

## Contents
{: .no_toc }

* TOC placeholder
{:toc}


## Install


```bash
# install latest release version
npm install onnxruntime-react-native
```

## Import


```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-react-native';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-react-native');
```


### Enable ONNX Runtime Extensions for React Native
To enable support for [ONNX Runtime Extensions](https://github.com/microsoft/onnxruntime-extensions) in your React Native app,
you need to specify the following configuration as a top-level entry (note: usually where the package `name` and `version` fields are) in the `package.json` file in your project's root directory.

```js
"onnxruntimeExtensionsEnabled": "true"
```

101 changes: 101 additions & 0 deletions docs/get-started/with-javascript/web.md
@@ -0,0 +1,101 @@
---
title: Web
parent: JavaScript
grand_parent: Get Started
has_children: false
nav_order: 1
---

# Get started with ONNX Runtime Web

## Contents
{: .no_toc }

* TOC placeholder
{:toc}

## Install

Use the following shell command to install ONNX Runtime Web:

```bash
# install latest release version
npm install onnxruntime-web

# install nightly build dev version
npm install onnxruntime-web@dev
```

## Import

Use the following JavaScript code to import ONNX Runtime Web:

```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-web';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-web');
```

To use ONNX Runtime Web with WebGPU support (an experimental feature), import as below:

```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-web/webgpu';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-web/webgpu');
```

For a complete table for importing, see [Conditional Importing](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/importing_onnxruntime-web#conditional-importing).
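Once imported, the execution provider is selected via the `executionProviders` session option. The following sketch shows options that prefer WebGPU and fall back to the WebAssembly (CPU) EP; `model.onnx` is a placeholder URL:

```javascript
// session options: try the WebGPU EP first, fall back to the wasm (CPU) EP
const sessionOptions = {
  executionProviders: ['webgpu', 'wasm'],
};

// in a browser (after importing ort from 'onnxruntime-web/webgpu'):
//   const session = await ort.InferenceSession.create('model.onnx', sessionOptions);

console.log(sessionOptions.executionProviders);
```

If the first EP in the list cannot be initialized, ONNX Runtime Web falls back to the next one.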

## Documentation

See [ONNX Runtime JavaScript API](../../api/js/index.html){:target="_blank"} for API reference. Please also check the following links for API usage examples:
- [Tensor](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_tensor) - a demonstration of basic usage of Tensor.
- [Tensor <--> Image conversion](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage-tensor-image) - a demonstration of conversions between Image elements and Tensors.
- [InferenceSession](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_inference-session) - a demonstration of basic usage of InferenceSession.
- [SessionOptions](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_session-options) - a demonstration of how to configure creation of an InferenceSession instance.
- [ort.env flags](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_ort-env-flags) - a demonstration of how to configure a set of global flags.

- See also the TypeScript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts).
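The tensor-image conversion listed above can be illustrated with a plain-JavaScript sketch (assuming interleaved RGBA pixel data such as a canvas `ImageData.data` buffer; the function name is illustrative):

```javascript
// convert interleaved RGBA pixel data (HWC, uint8) to planar RGB float data (CHW),
// the layout most image models expect; normalizing to [0, 1] is a common choice
function rgbaToCHW(pixels, width, height) {
  const plane = width * height;
  const out = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    out[i] = pixels[i * 4] / 255;                 // R plane
    out[plane + i] = pixels[i * 4 + 1] / 255;     // G plane
    out[2 * plane + i] = pixels[i * 4 + 2] / 255; // B plane (alpha is dropped)
  }
  return out; // pass to: new ort.Tensor('float32', out, [1, 3, height, width])
}

// tiny 1x2 image: one red pixel, one blue pixel
const chw = rgbaToCHW(Uint8Array.from([255, 0, 0, 255, 0, 0, 255, 255]), 2, 1);
console.log(chw);
```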

See [Tutorial: Web](../../tutorials/web/index.md) for tutorials.

See [Training on web demo](https://github.com/microsoft/onnxruntime-training-examples/tree/master/on_device_training/web) for training using onnxruntime-web.

## Examples

The following examples describe how to use ONNX Runtime Web in your web applications for model inferencing:
- [Quick Start (using bundler)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-bundler)
- [Quick Start (using script tag)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag)

The following E2E examples use ONNX Runtime Web in web applications:
- [Classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html) - a simple web application for image classification, using Next.js.
- [ONNX Runtime Web demos](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on.
- [OpenAI Whisper](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/ort-whisper) - demonstrates how to run [whisper tiny.en](https://github.com/openai/whisper) in your browser using onnxruntime-web and the browser's audio interfaces.
- [Facebook Segment-Anything](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/segment-anything) - demonstrates how to run [segment-anything](https://github.com/facebookresearch/segment-anything) in your browser using onnxruntime-web with webgpu.


The following are video tutorials that use ONNX Runtime Web in web applications:
- [ONNX Runtime Web for In Browser Inference](https://youtu.be/0dskvE4IvGM)
- [Inference in Javascript with ONNX Runtime Web](https://youtu.be/vYzWrT3A7wQ)


## Supported Versions

| EPs/Browsers | Chrome/Edge (Windows) | Chrome/Edge (Android) | Chrome/Edge (MacOS) | Chrome/Edge (iOS) | Safari (MacOS) | Safari (iOS) | Firefox (Windows) | Node.js |
|--------------|--------|---------|--------|------|---|----|------|-----|
| WebAssembly (CPU) | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️<sup>\[1]</sup> |
| WebGPU | ✔️<sup>\[2]</sup> | ✔️<sup>\[3]</sup> | ✔️ ||||||
| WebGL | ✔️<sup>\[4]</sup> | ✔️<sup>\[4]</sup> | ✔️<sup>\[4]</sup> | ✔️<sup>\[4]</sup> | ✔️<sup>\[4]</sup> | ✔️<sup>\[4]</sup> | ✔️<sup>\[4]</sup> ||
| WebNN | ✔️<sup>\[5]</sup> ||||||||

- \[1]: Node.js supports only the single-threaded `wasm` EP.
- \[2]: WebGPU requires Chromium v113 or later on Windows. Float16 support requires Chrome v121 or later, and Edge v122 or later.
- \[3]: WebGPU requires Chromium v121 or later on Android.
- \[4]: WebGL support is in maintenance mode. It is recommended to use WebGPU for better performance.
- \[5]: Requires launching the browser with the command-line flag `--enable-experimental-web-platform-features`.