This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

api: add ENV implementation (#27)
fs-eire authored Dec 10, 2018
1 parent 7652e0c commit 61b22c9
Showing 8 changed files with 123 additions and 50 deletions.
65 changes: 37 additions & 28 deletions docs/api.md
@@ -1,53 +1,62 @@
# **API Documentation**

## **Table of Contents**
### 1. [Globals](#Globals)
1. [onnx](#onnx)
2. [onnx.backend](#onnx.backend)
3. [onnx.debug](#onnx.debug)

### 2. [Inference Session](#Inference-Session)
1. [Creating an Inference Session](#Creating-an-Inference-Session)
2. [Run in Inference Session](#Run-in-Inference-Session)
3. [Profile a Session](#Profile-a-Session)

### 3. [Tensor](#Tensor)
1. [Create a Tensor](#Create-a-Tensor)
2. [Tensor Properties](#Tensor-Properties)
3. [Access Tensor Elements](#Access-Tensor-Elements)

## **Globals**
- ### **onnx**
The `onnx` object is available in global context (window.onnx in browser, global.onnx in Node.js) after requiring/importing the 'onnxjs' module, or when imported from a `<script>` tag.

- ### **onnx.backend**
### - [Onnx](#ref-Onnx)
- [Tensor](#ref-Onnx-Tensor)
- [InferenceSession](#ref-Onnx-InferenceSession)
- [backend](#ref-Onnx-backend)
- [ENV](#ref-Onnx-ENV)

### - [Inference Session](#ref-InferenceSession)
- [Creating an Inference Session](#Creating-an-Inference-Session)
- [Run in Inference Session](#Run-in-Inference-Session)
- [Profile a Session](#Profile-a-Session)

### - [Tensor](#ref-Tensor)
- [Create a Tensor](#Create-a-Tensor)
- [Tensor Properties](#Tensor-Properties)
- [Access Tensor Elements](#Access-Tensor-Elements)

## <a name="ref-Onnx"></a>**Onnx**

The `onnx` object is the exported object of the module. It's available in the global context (`window.onnx` in browsers, `global.onnx` in Node.js) after requiring/importing the 'onnxjs' module, or when imported from a `<script>` tag.
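All three entry points resolve to the same object; the module export and the global are identical. A minimal sketch (the script URL is illustrative):

```ts
// Node.js: the module export and the global are the same object
const onnx = require('onnxjs');
console.log(onnx === global.onnx); // true

// Browser: after <script src="onnx.min.js"></script> (illustrative URL),
// the same object is available as window.onnx
```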

- ### <a name="ref-Onnx-Tensor"></a>**onnx.Tensor**
See [Tensor](#ref-Tensor).

- ### <a name="ref-Onnx-InferenceSession"></a>**onnx.InferenceSession**
See [InferenceSession](#ref-InferenceSession).

- ### <a name="ref-Onnx-backend"></a>**onnx.backend**
Customizes settings for all available backends. `ONNX.js` currently supports three backends: *cpu* (pure JavaScript), *webgl* (WebGL), and *wasm* (WebAssembly).

### `onnx.backend.cpu`
### `backend.cpu`
An object specifying CPU backend settings. Available soon.
***
### `onnx.backend.webgl`
### `backend.webgl`
An object specifying WebGL backend settings. Available soon.
***
### `onnx.backend.wasm`
### `backend.wasm`
An object specifying WebAssembly backend settings. The supported member variables are:
- **worker** (`number`)

Optional. Specifies the number of [web workers](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers) to run in background threads. If not set, run with number of `CPU cores - 1` workers.
Optional. Specifies the number of [web workers](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers) to run in background threads. If not set, it defaults to `(CPU cores - 1)` workers.
- **cpuFallback** (`boolean`)

Optional. Determines whether to fall back to the CPU backend if the WebAssembly backend is missing certain ONNX operators. Defaults to `true`.
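For example, a small sketch of tuning the WebAssembly backend before creating any session (the values are illustrative, not recommendations):

```ts
onnx.backend.wasm.worker = 2;          // run 2 background web workers
onnx.backend.wasm.cpuFallback = false; // fail instead of falling back to CPU
```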

- ### **onnx.debug**
- ### <a name="ref-Onnx-ENV"></a>**onnx.ENV**
Represents runtime environment settings and status of `ONNX.js`.
### `ENV.debug`
A global flag to indicate whether to run `ONNX.js` in debug mode.
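For example:

```ts
onnx.ENV.debug = true; // enable debug mode for ONNX.js
```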

## **Inference Session**
## <a name="ref-InferenceSession"></a>**Inference Session**
An `InferenceSession` encapsulates the environment for `ONNX.js` operations to execute. It loads and runs `ONNX` models with the desired configurations.

To configure an `InferenceSession`, use an object with the following parameters:
- **backendHint** (`string`)
Specifies a preferred backend to start an `InferenceSession`. The currently available backend hints are:
- `'cpu'` : CPU backend
- `'cpu'`: CPU backend
- `'wasm'`: WebAssembly backend
- `'webgl'`: WebGL backend
If not set, the backend will be determined by the platform and environment.
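A minimal sketch of creating a session with a hint (the model path is a placeholder):

```ts
// inside an async function:
const session = new onnx.InferenceSession({backendHint: 'webgl'});
await session.loadModel('./model.onnx'); // placeholder path
```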
@@ -181,7 +190,7 @@ To configure an `InferenceSession`, use an object with the following parameters:
```


## **Tensor**
## <a name="ref-Tensor"></a>**Tensor**
Tensor is a representation of vectors, matrices, and n-dimensional data in `ONNX.js`. Tensors are used in `InferenceSession` as inputs for models to run.

- ### **Create a Tensor**
17 changes: 17 additions & 0 deletions lib/api/env-impl.ts
@@ -0,0 +1,17 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT license.

import {env} from '../env';

import {Environment} from './env';

class ENV implements Environment {
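// forwards reads and writes of 'debug' to the shared internal 'env'
// singleton, so the setting takes effect engine-wide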
public set debug(value: boolean) {
env.debug = value;
}
public get debug(): boolean {
return env.debug;
}
}

export const envImpl = new ENV();
12 changes: 12 additions & 0 deletions lib/api/env.ts
@@ -0,0 +1,12 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT license.

/**
* represent runtime environment settings and status of ONNX.js
*/
export interface Environment {
/**
* a global flag to indicate whether to run ONNX.js in debug mode
*/
debug: boolean;
}
6 changes: 3 additions & 3 deletions lib/api/index.ts
@@ -2,15 +2,15 @@
// Licensed under the MIT license.

import {Onnx} from './onnx';
import * as OnnxImpl from './onnx-impl';
import * as onnxImpl from './onnx-impl';

// get or create the onnx object in the global context
const onnx: Onnx = OnnxImpl;
const onnxGlobal = ((typeof window !== 'undefined') ? window : global) as {onnx?: Onnx};
const onnx: Onnx = onnxImpl;
onnxGlobal.onnx = onnx;

// set module exported object to global.onnx
export = OnnxImpl;
export = onnxImpl;

// declaration of object global.onnx
declare global {
5 changes: 4 additions & 1 deletion lib/api/onnx-impl.ts
Expand Up @@ -5,8 +5,11 @@ import {CpuBackend} from '../backends/backend-cpu';
import {WasmBackend} from '../backends/backend-wasm';
import {WebGLBackend} from '../backends/backend-webgl';

import {Environment} from './env';
import {envImpl} from './env-impl';
import {Backend} from './onnx';

export * from './env';
export * from './onnx';
export * from './tensor';
export * from './inference-session';
@@ -17,4 +20,4 @@ export const backend: Backend = {
webgl: new WebGLBackend()
};

export let debug = false;
export const ENV: Environment = envImpl;
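This replaces the previous top-level `debug` export with the `ENV` object. A before/after sketch for consumers (assuming the package is imported as `onnxjs`):

```ts
import * as onnx from 'onnxjs';

// before this commit:
// onnx.debug = true;

// after this commit:
onnx.ENV.debug = true;
```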
17 changes: 6 additions & 11 deletions lib/api/onnx.ts
@@ -1,6 +1,7 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT license.

import {Environment} from './env';
import {InferenceSessionConstructor} from './inference-session';
import {TensorConstructor} from './tensor';

@@ -54,17 +55,7 @@ export interface Backend {

//#endregion Backends

/**
* represent runtime environment settings and status of ONNX.js
*/
export interface Environment {
/**
* a global flag to indicate whether to run ONNX.js in debug mode
*/
debug: boolean;
}

export interface Onnx extends Environment {
export interface Onnx {
/**
* represent a tensor with specified dimensions and data type.
*/
@@ -77,4 +68,8 @@ export interface Onnx extends Environment {
* represent all available backends and their settings
*/
readonly backend: Backend;
/**
* represent runtime environment settings and status of ONNX.js
*/
readonly ENV: Environment;
}
20 changes: 20 additions & 0 deletions lib/env.ts
@@ -0,0 +1,20 @@
import * as platform from 'platform';

import * as onnx from './api';
import {Backend, Environment, Onnx} from './api';

interface ENV extends Environment {
readonly onnx: Onnx;
readonly backend: Backend;
readonly platform: Platform;
}

class EnvironmentImpl implements ENV {
public readonly onnx = onnx;
public readonly backend = onnx.backend;
public readonly platform = platform;

public debug = false;
}

export const env: ENV = new EnvironmentImpl();
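Internal modules can now read settings and platform information from this singleton. A minimal sketch of a hypothetical call site:

```ts
import {env} from './env';

if (env.debug) {
  // 'name' and 'version' are properties exposed by the platform.js library
  console.log(`running on ${env.platform.name} ${env.platform.version}`);
}
```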
31 changes: 24 additions & 7 deletions test/unittests/api/onnx.ts
@@ -7,7 +7,7 @@ import {expect} from 'chai';
const apiRequireIndex = require('../../../lib/api');
const onnxImpl = require('../../../lib/api/onnx-impl');

import {InferenceSession, Tensor, backend} from '../../../lib/api';
import {InferenceSession, Tensor, backend, ENV} from '../../../lib/api';

describe('#UnitTest# - API - Check Globals and Imports', () => {
it('Compare Global onnx and Imported onnx', () => {
@@ -18,16 +18,18 @@ describe('#UnitTest# - API - Check Globals and Imports', () => {
expect(apiRequireIndex).is.equal(onnxImpl);
});

it('Compare Global Tensor and Imported Tensor', () => {
it('Compare Global and Imported variables', () => {
expect(onnx.Tensor).is.equal(Tensor);
});

it('Compare Global InferenceSession and Imported InferenceSession', () => {
expect(onnx.InferenceSession).is.equal(InferenceSession);
expect(onnx.backend).is.equal(backend);
expect(onnx.ENV).is.equal(ENV);
});

it('Compare Global backend and Imported backend', () => {
expect(onnx.backend).is.equal(backend);
it('Check type members', () => {
expect(backend).to.have.property('cpu');
expect(backend).to.have.property('webgl');
expect(backend).to.have.property('wasm');
expect(ENV).to.have.property('debug');
});

it('Ensure no value exported from interface file', () => {
@@ -36,15 +38,26 @@ describe('#UnitTest# - API - Check Globals and Imports', () => {
const onnxExportedValues = onnxPropertyNames.filter(name => name !== '__esModule');
expect(onnxExportedValues).to.have.lengthOf(0);

const env = require('../../../lib/api/env');
const envPropertyNames = Object.getOwnPropertyNames(env);
const envExportedValues = envPropertyNames.filter(name => name !== '__esModule');
expect(envExportedValues).to.have.lengthOf(0);

const tensor = require('../../../lib/api/tensor');
const tensorPropertyNames = Object.getOwnPropertyNames(tensor);
const tensorExportedValues = tensorPropertyNames.filter(name => name !== '__esModule');
// this module should only contain 'Tensor'
// this is because we need to put all definitions in one file to allow typescript to merge declarations of
// interface, namespace and variable
expect(tensorExportedValues).to.have.lengthOf(1);
expect(tensorExportedValues).to.contain('Tensor');

const inferenceSession = require('../../../lib/api/inference-session');
const inferenceSessionPropertyNames = Object.getOwnPropertyNames(inferenceSession);
const inferenceSessionExportedValues = inferenceSessionPropertyNames.filter(name => name !== '__esModule');
// this module should only contain 'InferenceSession'
// this is because we need to put all definitions in one file to allow typescript to merge declarations of
// interface, namespace and variable
expect(inferenceSessionExportedValues).to.have.lengthOf(1);
expect(inferenceSessionExportedValues).to.contain('InferenceSession');
});
@@ -53,6 +66,10 @@ describe('#UnitTest# - API - Check Globals and Imports', () => {
const onnxImplPropertyNames = Object.getOwnPropertyNames(onnxImpl);
expect(onnxImplPropertyNames).to.have.lengthOf.at.least(1);

const envImpl = require('../../../lib/api/env-impl');
const envImplPropertyNames = Object.getOwnPropertyNames(envImpl);
expect(envImplPropertyNames).to.contain('envImpl');

const tensorImpl = require('../../../lib/api/tensor-impl');
const tensorImplPropertyNames = Object.getOwnPropertyNames(tensorImpl);
expect(tensorImplPropertyNames).to.contain('Tensor');
