This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

Support super resolution models and benchmarking #291

Status: Open. Wants to merge 7 commits into base branch `master`.
3 changes: 3 additions & 0 deletions .gitignore
@@ -14,6 +14,9 @@
/tools/**/*.js
/tools/**/*.js.map

/benchmark/**/node_modules/
/benchmark/**/dist/

npm-debug.log
.DS_Store
yarn-error.log
5 files renamed without changes.
5 changes: 5 additions & 0 deletions benchmark/super_resolution_model_zoo/LICENSE
@@ -0,0 +1,5 @@
# Licenses
> **Member:** should this be license or readme?

> **Contributor (author):** Since we put tfjs in our benchmark, we need to specify tfjs's license here to state that we are using another project. I think I included a README in the same folder.


TensorFlow.js:
https://github.com/tensorflow/tfjs/blob/master/LICENSE

38 changes: 38 additions & 0 deletions benchmark/super_resolution_model_zoo/README.md
@@ -0,0 +1,38 @@
# Benchmarks
> **Member (@duli2012, Apr 27, 2021):** could we generalize this subproject to other models besides SR? That would be a very useful feature.

> **Contributor (author):** Yes, I put a TODO in the main index.js file to improve this benchmark to take arbitrary models and configs. For that, we would need a benchmark driver similar to our test-runner-cli. I will do a follow-up change just for the benchmarking tool.
This subproject benchmarks the model zoo's super-resolution model and compares ONNX.js performance against other leading in-browser AI inference frameworks.

## Frameworks
- TensorFlow.js
- ONNX.js

## Backends
- WebGL

## Browsers
(not all framework/backend combinations are supported by all browsers)
- Chrome (WebGL 2)
- Edge (WebGL 1)

## Instructions

1. Ensure that the ONNX.js project (the parent) is already installed and built:
```bash
npm ci
npm run build
```
2. Change to the `benchmark/super_resolution_model_zoo` subfolder, then install and build:
```bash
cd benchmark/super_resolution_model_zoo
npm ci
npm run build
```
3. Run tests (Chrome)
```bash
npm run test
```

The test command supports enabling pack mode through an environment variable:
```bash
PACK=1 npm run test
```
This enables WebGL texture packing for both ONNX.js and TensorFlow.js.
68 changes: 68 additions & 0 deletions benchmark/super_resolution_model_zoo/karma.conf.js
@@ -0,0 +1,68 @@
// Karma configuration
const path = require('path');

function getMachineIpAddress() {
  const os = require('os');
  const ifaces = os.networkInterfaces();

  for (const ifname in ifaces) {
    for (const iface of ifaces[ifname]) {
      if (iface.family !== 'IPv4' || iface.internal !== false) {
        // skip over internal (i.e. 127.0.0.1) and non-IPv4 addresses
        continue;
      }

      // return the first available IP address
      return iface.address;
    }
  }

  // if no IP address is available, fall back to "localhost".
  return 'localhost';
}

module.exports = function(config) {
  config.set({
    basePath: './',
    frameworks: ['mocha'],
    files: [
      {pattern: 'dist/main.js'},
      {pattern: 'dist/onnx-wasm.wasm', included: false},
      {pattern: 'dist/onnx-worker.js', included: false},
      {pattern: 'data/**/*', watched: false, included: false, served: true, nocache: true}
    ],
    proxies: {
      '/onnx-wasm.wasm': '/base/dist/onnx-wasm.wasm',
      '/onnx-worker.js': '/base/dist/onnx-worker.js',
    },
    exclude: [],
    // available preprocessors: https://npmjs.org/browse/keyword/karma-preprocessor
    preprocessors: {},
    reporters: ['mocha'],
    captureTimeout: 120000,
    reportSlowerThan: 100,
    browserDisconnectTimeout: 600000,
    browserNoActivityTimeout: 300000,
    browserDisconnectTolerance: 0,
    browserSocketTimeout: 60000,
    logLevel: config.LOG_VERBOSE,
    hostname: getMachineIpAddress(),
    customLaunchers: {
      ChromeTest: {base: 'Chrome', flags: ['--window-size=1,1']},
      ChromeDebug: {debug: true, base: 'Chrome', flags: ['--remote-debugging-port=9333']}
    },
    client: {
      captureConsole: true,
      mocha: {expose: ['body'], timeout: 3000000},
      browser: config.browsers,
      printMatches: false,
      // To enable pack, run 'PACK=1 npm run test'
      usePackedGlTexture: config.usePackedGlTexture == 1,
      runIteration: config.runIteration ? config.runIteration : 10,
      profile: config.profile
    },
    browsers: ['ChromeTest', 'ChromeDebug', 'Edge', 'Safari'],
    browserConsoleLogOptions: {level: 'debug', format: '%b %T: %m', terminal: true},
  });
};
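Karma exposes the `client` options above (`usePackedGlTexture`, `runIteration`, `profile`) to the in-browser test code through the `__karma__.config` object. A minimal sketch of how a test harness might normalize them, with defaults mirroring the config above (the `readBenchmarkOptions` helper is hypothetical, not part of this PR):

```javascript
// Hypothetical helper: normalizes the Karma client options that
// karma.conf.js passes through to the browser.
function readBenchmarkOptions(karmaConfig) {
  const cfg = karmaConfig || {};
  return {
    usePackedGlTexture: cfg.usePackedGlTexture === true,
    runIteration: typeof cfg.runIteration === 'number' ? cfg.runIteration : 10,
    profile: !!cfg.profile,
  };
}

// In the browser this would be called as:
//   const opts = readBenchmarkOptions(window.__karma__.config);
module.exports = { readBenchmarkOptions };
```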