
[Bug] [onnxruntime-node] Error: no available backend found. ERR: [wasm] backend not found. #21813

Open · kabyanil opened this issue Aug 21, 2024 · 4 comments
Labels: platform:web (issues related to ONNX Runtime web; typically submitted using template)

@kabyanil

I am creating an Electron app where I need to run inference on some ONNX models through the Node.js backend. I have installed "onnxruntime-node": "^1.19.0" as a dependency. I am initializing the models with:

const ort = require('onnxruntime-node');

const sessionOptions = {
  executionProviders: ['wasm'],
};

const session = {
  model_1: await ort.InferenceSession.create("./assets/models/model_1.onnx", sessionOptions),
};

When I run electron ., it throws the following error:

Error: no available backend found. ERR: [wasm] backend not found.
    at resolveBackendAndExecutionProviders (~/my-electron-app/node_modules/onnxruntime-common/dist/cjs/backend-impl.js:124:15)
    at async InferenceSession.create (~/my-electron-app/node_modules/onnxruntime-common/dist/cjs/inference-session-impl.js:183:52)

I am on Intel x64 macOS 12.7.5 with Node.js version 20.16.0.

This issue is very urgent for me, as I cannot write the inference code without first loading the model.

What is the issue here, and how can I resolve it?

github-actions bot added the platform:web label on Aug 21, 2024
@guschmue (Contributor) commented Aug 21, 2024

Try replacing 'wasm' with 'cpu'; onnxruntime-node does not include the WebAssembly flavor.
If you want WebAssembly, you can replace onnxruntime-node with onnxruntime-web. It exposes the same API, but it supports the 'wasm' execution provider.
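
For example, something like this should work with onnxruntime-node (untested sketch, reusing the model path from the original post):

const ort = require('onnxruntime-node');

// 'cpu' is the execution provider bundled with onnxruntime-node.
const sessionOptions = {
  executionProviders: ['cpu'],
};

const session = await ort.InferenceSession.create(
  "./assets/models/model_1.onnx",
  sessionOptions,
);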

@kabyanil (Author)

Thanks, I tried 'cpu' right after creating this issue, and it worked! By the way, which one is faster in your opinion?

@fs-eire (Contributor) commented Aug 29, 2024

Usually cpu is faster than wasm in Node.js. However, the best answer is to try both and check which one is faster in your application.
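
For example, here is a rough way to measure it in your application (sketch only; feeds is a placeholder for an object mapping your model's input names to ort.Tensor values):

const { performance } = require('node:perf_hooks');
// Swap the require and the execution provider between
// 'onnxruntime-node' + 'cpu' and 'onnxruntime-web' + 'wasm'.
const ort = require('onnxruntime-node');

async function benchmark(modelPath, feeds) {
  const t0 = performance.now();
  const session = await ort.InferenceSession.create(modelPath, {
    executionProviders: ['cpu'],
  });
  const t1 = performance.now();
  await session.run(feeds);
  const t2 = performance.now();
  console.log(`load ${(t1 - t0).toFixed(0)} ms, inference ${(t2 - t1).toFixed(0)} ms`);
}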

@kabyanil (Author)

@fs-eire I found the cpu backend to be decently fast, with the models loading in ~1.x seconds and inference taking ~200 milliseconds. I couldn't test wasm, though, as it wouldn't load in Node.js.
