Commit 20c26a3: re-arrange web docs
fs-eire committed Feb 19, 2024 (parent: 9f8bb73)
Showing 11 changed files with 226 additions and 178 deletions.
2 changes: 1 addition & 1 deletion docs/get-started/training-on-device.md
```diff
@@ -1,7 +1,7 @@
 ---
 title: On-Device Training
 parent: Get Started
-nav_order: 12
+nav_order: 11
 ---

 # On-Device Training with ONNX Runtime
```
152 changes: 0 additions & 152 deletions docs/get-started/with-javascript.md

This file was deleted.

49 changes: 49 additions & 0 deletions docs/get-started/with-javascript/index.md
---
title: JavaScript
parent: Get Started
has_children: true
toc: true
nav_order: 6
---

# Get started with ORT for JavaScript
{: .no_toc }

The ONNX Runtime JavaScript API is the unified interface used by [ONNX Runtime Node.js binding](https://github.com/microsoft/onnxruntime/tree/main/js/node), [ONNX Runtime Web](https://github.com/microsoft/onnxruntime/tree/main/js/web), and [ONNX Runtime for React Native](https://github.com/microsoft/onnxruntime/tree/main/js/react_native).

See [how to choose the right package](../tutorials/web/build-web-app#options-for-deployment-target) for your JavaScript application.

## Contents
{: .no_toc }

* Get Started with [ONNX Runtime Web](web.md)
* Get Started with [ONNX Runtime Node.js binding](node.md)
* Get Started with [ONNX Runtime for React Native](react-native.md)
* [Builds](#builds)
* [API Reference](#api-reference)

## Builds

[Builds](https://onnxruntime.ai/docs/build/web.html) are published to **npm** and can be installed using `npm install`.

| Package | Artifact | Description | Supported Platforms |
|---------|-----------|-------------|---------------------|
|Node.js binding|[onnxruntime-node](https://www.npmjs.com/package/onnxruntime-node)|CPU and GPU (Release/NAPI_v3)| Windows x64: cpu, dml<br/> Windows arm64: cpu, dml<br/> Linux x64: cpu, cuda<br/> Linux arm64: cpu<br/> macOS x64: cpu<br/> macOS arm64: cpu|
|Web|[onnxruntime-web](https://www.npmjs.com/package/onnxruntime-web)|CPU and GPU|Chromium Browsers (Chrome, Edge): wasm, webgl, webgpu, webnn<br/>Safari: wasm, webgl<br/>Other Browsers: wasm<br/> Node.js: wasm|
|React Native|[onnxruntime-react-native](https://www.npmjs.com/package/onnxruntime-react-native)|CPU|Android, iOS|

- For Web, pre-built binaries are published in the NPM package and are also served from CDNs. See [Deploy ONNX Runtime Web](TBD) for more details. If you want to use a custom build, you can [build ONNX Runtime Web from source](../build/web.md).
- For the Node.js binding, to use it on platforms without pre-built binaries, you can [build the Node.js binding from source](../build/inferencing.md#apis-and-language-bindings) and consume it using `npm install <onnxruntime_repo_root>/js/node/`.
- Explore a simple web application to [classify images with ONNX Runtime Web](https://onnxruntime.ai/docs/tutorials/web/classify-images-nextjs-github-template.html).

## API Reference

See [ONNX Runtime JavaScript API](../api/js/index.html){:target="_blank"} for API reference.

See also:

- [ONNX Runtime JavaScript examples and API Usage](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js).

- [ONNX Runtime Web demos](https://microsoft.github.io/onnxruntime-web-demo/#/) for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on.

- Typescript declarations for [Inference Session](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/inference-session.ts), [Tensor](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/tensor.ts), and [Environment Flags](https://github.com/microsoft/onnxruntime/blob/main/js/common/lib/env.ts) for reference.
42 changes: 42 additions & 0 deletions docs/get-started/with-javascript/node.md
---
title: Node.js binding
parent: JavaScript
grand_parent: Get Started
has_children: false
nav_order: 2
---

# Get started with ONNX Runtime Node.js binding

## Contents
{: .no_toc }

* TOC placeholder
{:toc}

## Install

```bash
# install latest release version
npm install onnxruntime-node
```

## Import

```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-node';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-node');
```
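With the package imported, a minimal inference flow looks like the sketch below. The model file name (`model.onnx`), the input name `input`, and the `[1, 4]` shape are placeholder assumptions for illustration, not part of the package; adjust them to match your own model.

```javascript
// Minimal inference sketch for onnxruntime-node.
// Assumption: a local "model.onnx" with a single float32 input named
// "input" of shape [1, 4] -- adjust names and shapes for your model.

// Pure helper: index of the largest score (typical post-processing
// for a classifier output).
function argmax(scores) {
  let best = 0;
  for (let i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  return best;
}

async function run() {
  const ort = require('onnxruntime-node');
  // Load the model and create an inference session.
  const session = await ort.InferenceSession.create('./model.onnx');
  // Wrap input data in a Tensor: type, flat data, dims.
  const input = new ort.Tensor('float32', Float32Array.from([1, 2, 3, 4]), [1, 4]);
  // Run with a feeds object keyed by input name.
  const results = await session.run({ input });
  const output = results[session.outputNames[0]];
  console.log('predicted class:', argmax(output.data));
}

// run().catch(console.error); // uncomment once a model.onnx is in place
```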

## Examples

- Follow the [Quick Start](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-node) instructions for ONNX Runtime Node.js binding.

## Supported Versions

The ONNX Runtime Node.js binding supports Node.js v12.x+ and Electron v5.x+.

46 changes: 46 additions & 0 deletions docs/get-started/with-javascript/react-native.md
---
title: React Native
parent: JavaScript
grand_parent: Get Started
has_children: false
nav_order: 3
---

# Get started with ONNX Runtime for React Native

## Contents
{: .no_toc }

* TOC placeholder
{:toc}


## Install


```bash
# install latest release version
npm install onnxruntime-react-native
```

## Import


```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-react-native';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-react-native');
```


### Enable ONNX Runtime Extensions for React Native
To enable support for [ONNX Runtime Extensions](https://github.com/microsoft/onnxruntime-extensions) in your React Native app,
specify the following configuration as a top-level entry in your project's root `package.json` file (at the same level as the package `name` and `version` fields).

```json
"onnxruntimeExtensionsEnabled": "true"
```
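For context, a hypothetical `package.json` showing where the entry sits (the `my-app` name and the version numbers are placeholders, not from the source):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "onnxruntimeExtensionsEnabled": "true",
  "dependencies": {
    "onnxruntime-react-native": "^1.17.0"
  }
}
```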

84 changes: 84 additions & 0 deletions docs/get-started/with-javascript/web.md
---
title: Web
parent: JavaScript
grand_parent: Get Started
has_children: false
nav_order: 1
---

# Get started with ONNX Runtime Web

## Contents
{: .no_toc }

* TOC placeholder
{:toc}

## Install

Use the following command in your shell to install ONNX Runtime Web:

```bash
# install latest release version
npm install onnxruntime-web

# install nightly build dev version
npm install onnxruntime-web@dev
```

## Import

Use the following JavaScript code to import ONNX Runtime Web:

```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-web';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-web');
```

If you want to use ONNX Runtime Web with WebGPU support (an experimental feature), you need to import it as follows:

```js
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-web/webgpu';
```
```js
// or use CommonJS style import syntax
const ort = require('onnxruntime-web/webgpu');
```

For a complete table for importing, see [Conditional Importing](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/importing_onnxruntime-web#conditional-importing).
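Which backend actually runs is then controlled by the `executionProviders` session option. A sketch of the typical preference list follows; the `./model.onnx` path is a placeholder assumption, and the session-creation call is shown in comments because it requires a served model:

```javascript
// Preferred backends, in fallback order: try WebGPU first, then fall
// back to WASM. ONNX Runtime Web attempts each entry in turn.
const executionProviders = ['webgpu', 'wasm'];

// In a browser module (assumes ./model.onnx is served with the page):
//   import * as ort from 'onnxruntime-web/webgpu';
//   const session = await ort.InferenceSession.create('./model.onnx',
//                                                     { executionProviders });
console.log(executionProviders.join(','));
```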

## Documentation

See [Tutorial: Web](../tutorials/web/index.md) for more details. Please also check the following links:
- [Tensor](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_tensor) - a demonstration of basic usage of Tensor.
- [Tensor <--> Image conversion](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage-tensor-image) - a demonstration of conversions between Image elements and Tensors.
- [InferenceSession](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_inference-session) - a demonstration of basic usage of InferenceSession.
- [SessionOptions](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_session-options) - a demonstration of how to configure creation of an InferenceSession instance.
- [ort.env flags](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/api-usage_ort-env-flags) - a demonstration of how to configure a set of global flags.
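As background for the Tensor and image-conversion links above: canvas pixel data arrives as interleaved RGBA bytes, while most vision models expect planar CHW float data. A pure-JS sketch of that rearrangement (the function name and normalization are illustrative assumptions, not ORT API):

```javascript
// Convert RGBA pixel data (as from canvas getImageData) into planar
// float32 CHW data, scaled to [0, 1]. The result can then be wrapped
// with `new ort.Tensor('float32', chw, [1, 3, height, width])`.
function rgbaToCHW(pixels, width, height) {
  const plane = width * height;
  const chw = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    chw[i] = pixels[4 * i] / 255;                 // R plane
    chw[plane + i] = pixels[4 * i + 1] / 255;     // G plane
    chw[2 * plane + i] = pixels[4 * i + 2] / 255; // B plane (alpha dropped)
  }
  return chw;
}
```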

See [Training on web demo](https://github.com/microsoft/onnxruntime-training-examples/tree/master/on_device_training/web) for training using onnxruntime-web.

## Examples

The following examples describe how to use ONNX Runtime Web in your web applications for model inferencing:
- [Quick Start (using bundler)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-bundler)
- [Quick Start (using script tag)](https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag)

The following are E2E examples that use ONNX Runtime Web in web applications:
- [OpenAI Whisper](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/ort-whisper) - demonstrates how to run [whisper tiny.en](https://github.com/openai/whisper) in your browser using onnxruntime-web and the browser's audio interfaces.
- [Facebook Segment-Anything](https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/segment-anything) - demonstrates how to run [segment-anything](https://github.com/facebookresearch/segment-anything) in your browser using onnxruntime-web with webgpu.

The following are video tutorials that use ONNX Runtime Web in web applications:
- [ONNX Runtime Web for In Browser Inference](https://youtu.be/0dskvE4IvGM)
- [Inference in Javascript with ONNX Runtime Web](https://youtu.be/vYzWrT3A7wQ)


## Supported Versions


ONNX Runtime Web supports mainstream modern browsers and operating systems on Windows, Ubuntu, macOS, Android, and iOS. Specifically, for Chromium-based browsers (Chrome, Edge), it supports the wasm, webgl, webgpu, and webnn EPs. For Safari, it supports the wasm and webgl EPs. For other browsers and Node.js, it supports the wasm EP.
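Given this support matrix, a common pattern is to feature-detect WebGPU and fall back to WASM. The sketch below uses `navigator.gpu`, the standard WebGPU entry point, as the capability check; the helper name is an illustrative assumption, not ORT API:

```javascript
// Choose execution providers based on what the browser exposes.
// `navigator.gpu` is present only where WebGPU is available.
function pickExecutionProviders(nav) {
  return nav && nav.gpu ? ['webgpu', 'wasm'] : ['wasm'];
}

// Browser usage: pass the result as the `executionProviders` option of
// `ort.InferenceSession.create(...)`.
```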