This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

Adding a description of our supported platforms. (#26)
doc: update compatibility and operators supported
liuziyue authored Dec 8, 2018
1 parent d889e77 commit 7652e0c
Showing 3 changed files with 101 additions and 59 deletions.
24 changes: 21 additions & 3 deletions README.md
@@ -4,13 +4,15 @@ ONNX.js is a Javascript library for running ONNX models on browsers and on Node.
ONNX.js has adopted WebAssembly and WebGL technologies for providing an optimized ONNX model inference runtime for both CPUs and GPUs.

### Why ONNX models
The [Open Neural Network Exchange](http://onnx.ai/) (ONNX) is an open standard for representing machine learning models. The biggest advantage of ONNX is that it enables interoperability across different open source AI frameworks, which in turn offers more flexibility when adopting a framework. See [Getting ONNX Models](#Getting-ONNX-models).

### Why ONNX.js
With ONNX.js, web developers can score pre-trained ONNX models directly in the browser. This reduces server-client communication, protects user privacy, and offers an install-free, cross-platform in-browser ML experience.

ONNX.js can run on both CPU and GPU. For the CPU, it adopts [WebAssembly](https://developer.mozilla.org/en-US/docs/WebAssembly) to execute models at near-native speed, and it uses [Web Workers](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers) to provide a "multi-threaded" environment for parallelizing data processing. Empirical evaluation shows promising performance gains on the CPU from taking full advantage of WebAssembly and Web Workers. For the GPU, ONNX.js adopts WebGL, a widely supported standard for accessing GPU capabilities, together with several novel optimization techniques that reduce data transfer between CPU and GPU and cut GPU processing cycles to push performance further.

See [Compatibility](#Compatibility) and [Operators Supported](#Operators) for a list of platforms and operators ONNX.js currently supports.
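In practice, scoring a model in the browser takes only a few lines. Below is a minimal sketch; the CDN script path follows ONNX.js's distribution layout, while `./my-model.onnx` and the `1x3x224x224` input shape are illustrative assumptions:

```html
<html>
  <body>
    <!-- Load the ONNX.js script build -->
    <script src="https://cdn.jsdelivr.net/npm/onnxjs/dist/onnx.min.js"></script>
    <script>
      async function run() {
        // Pick a backend explicitly: 'webgl' (GPU), 'wasm' or 'cpu' (CPU)
        const session = new onnx.InferenceSession({ backendHint: 'webgl' });
        await session.loadModel('./my-model.onnx'); // hypothetical model file
        // Dummy float input; the shape here is an assumption for illustration
        const data = new Float32Array(1 * 3 * 224 * 224);
        const input = new onnx.Tensor(data, 'float32', [1, 3, 224, 224]);
        const outputMap = await session.run([input]);
        console.log(outputMap.values().next().value.data);
      }
      run();
    </script>
  </body>
</html>
```

Omitting `backendHint` lets ONNX.js pick an available backend on its own.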

### Benchmarks

Benchmarks have been run against the most prominent open source solutions in the same market. Below are the results collected for Chrome and Edge browsers on one sample machine (computations run on both CPU and GPU):
@@ -134,8 +136,24 @@ Learn more about ONNX
- [ONNX website](http://onnx.ai/)
- [ONNX on GitHub](https://github.com/onnx/onnx)

### Compatibility
#### Desktop Platforms
| OS/Browser | Chrome | Edge | Firefox | Safari | Opera | Electron | Node.js |
|:----------:|:------:|:----:|:-------:|:------:|:-----:|:-----:|:-----:|
| Windows 10 | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | - | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| macOS | :heavy_check_mark: | - | :heavy_check_mark: | Coming soon | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| Ubuntu LTS 18.04 | :heavy_check_mark: | - | :heavy_check_mark: | - | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |

#### Mobile Platforms
| OS/Browser | Chrome | Edge | Firefox | Safari | Opera |
|:----------:|:------:|:----:|:-------:|:------:|:-----:|
| iOS | Coming soon | - | Coming soon | Coming soon | Coming soon |
| Android | :heavy_check_mark: | :heavy_check_mark: | Coming soon | - | :heavy_check_mark: |

### Operators
ONNX.js currently supports most operators in [ai.onnx](https://github.com/onnx/onnx/blob/rel-1.2.3/docs/Operators.md) operator set v7 (opset v7). See [operators.md](./docs/operators.md) for a complete, detailed list of which ONNX operators are supported by the three available built-in backends (cpu, wasm, and webgl).

Support for [ai.onnx.ml](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md) operators is coming soon. [operators-ml.md](./docs/operators-ml.md) has the most recent status of ai.onnx.ml operators.
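The three built-in backends are selected through the `backendHint` option of `onnx.InferenceSession`. The sketch below shows one possible fallback policy; the `env` feature flags and the ordering are illustrative assumptions, not ONNX.js's own backend-resolution logic:

```javascript
// Pick a backend hint for ONNX.js from environment capabilities.
// Prefer the GPU path (webgl), then near-native CPU (wasm),
// then the plain-JavaScript fallback (cpu).
function pickBackendHint(env) {
  if (env.hasWebGL) return 'webgl';
  if (env.hasWebAssembly) return 'wasm';
  return 'cpu';
}

// The hint would then be passed when creating a session, e.g.:
//   const session = new onnx.InferenceSession({ backendHint: pickBackendHint(env) });
```

A model that uses an operator unsupported by the chosen backend will fail to load, so the operator tables above are worth checking before fixing a backend.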

## Contribute
We’d love to embrace your contribution to ONNX.js. Please refer to [CONTRIBUTING.md](./CONTRIBUTING.md).
24 changes: 24 additions & 0 deletions docs/operators-ml.md
@@ -0,0 +1,24 @@
The following table lists the [ai.onnx.ml](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md) operators supported by each of the available backends.

See [Compatibility](../README.md#Compatibility) for a list of the supported platforms.

| Operator | CPU Backend | WASM Backend | WebGL Backend |
|:------------------------------------------------------------------------------------------------------:|:-----------:|:------------:|:-------------:|
| [ArrayFeatureExtractor](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#ai.onnx.ml.ArrayFeatureExtractor) | | | |
| [Binarizer](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#ai.onnx.ml.Binarizer) | | | |
| [CastMap](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlcastmap) | | | |
| [CategoryMapper](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlcategorymapper) | | | |
| [DictVectorizer](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmldictvectorizer) | | | |
| [FeatureVectorizer](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlfeaturevectorizer) | | | |
| [Imputer](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlimputer) | | | |
| [LabelEncoder](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmllabelencoder) | | | |
| [LinearClassifier](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmllinearclassifier) | | | |
| [LinearRegressor](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmllinearregressor) | | | |
| [Normalizer](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlnormalizer) | | | |
| [OneHotEncoder](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlonehotencoder) | | | |
| [SVMClassifier](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlsvmclassifier) | | | |
| [SVMRegressor](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlsvmregressor) | | | |
| [Scaler](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlscaler) | | | |
| [TreeEnsembleClassifier](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmltreeensembleclassifier) | | | |
| [TreeEnsembleRegressor](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmltreeensembleregressor) | | | |
| [ZipMap](https://github.com/onnx/onnx/blob/master/docs/Operators-ml.md#aionnxmlzipmap) | | | |
