[Feature request] Deno webgpu support #22309
Comments
Thank you for reporting this issue.
@fs-eire Try the following:

```ts
import { InferenceSession } from "npm:onnxruntime-web";

await InferenceSession.create(
  "https://huggingface.co/briaai/RMBG-1.4/resolve/main/onnx/model_quantized.onnx",
  {
    executionProviders: ["webgpu"],
  }
);
```

I'm using Deno 2.0.0-rc.10, but the Deno stable version gets the same error.
I am not sure about this "npm:onnxruntime-web" syntax. In a Node.js-based web app project I used "onnxruntime-web/webgpu" ("webgpu" is the export name defined in package.json). If I just import "onnxruntime-web", it doesn't load WebGPU support.
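For readers unfamiliar with subpath exports: the "onnxruntime-web/webgpu" specifier resolves through the `exports` map in the package's package.json to a different bundle than the default entry point. A minimal sketch of that resolution, where the exports map and file names below are hypothetical placeholders, not the real onnxruntime-web package.json:

```typescript
// Hypothetical simplified "exports" map, illustrating the mechanism only.
const exportsMap: Record<string, string> = {
  ".": "./dist/ort.min.mjs",          // default entry, no WebGPU backend
  "./webgpu": "./dist/ort.webgpu.min.mjs", // WebGPU-enabled bundle
};

// Resolve a bare specifier like "onnxruntime-web/webgpu" against the map.
function resolveSubpath(specifier: string): string | undefined {
  const pkg = "onnxruntime-web";
  const subpath = specifier === pkg ? "." : "." + specifier.slice(pkg.length);
  return exportsMap[subpath];
}

console.log(resolveSubpath("onnxruntime-web"));        // "./dist/ort.min.mjs"
console.log(resolveSubpath("onnxruntime-web/webgpu")); // "./dist/ort.webgpu.min.mjs"
```

Because the two subpaths resolve to different bundles, importing the default entry never registers the WebGPU backend.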
@fs-eire I understand.

The release candidate version of Deno throws the same error:
It seems that Emscripten does not support Deno well. The generated JavaScript glue file doesn't work well in Deno:
Update: OK, I hacked the Emscripten glue JS and finally got into Deno's WebGPU. Now I am blocked by a bug in Deno's WebGPU implementation: denoland/deno#22029
I don't know why […]

Switched to another PC, and the value of […]

According to the WebGPU spec, […]
In Google Chrome (129.0.6668.100):

```
> adapter.limits.maxBufferSize
2147483648
> typeof adapter.limits.maxBufferSize
'number'
```
I am not sure about the details, but at least the implementation of Deno's WebGPU is incorrect.
@fs-eire Which Deno version did you test earlier?

```ts
const adapter = await navigator.gpu.requestAdapter();
console.log(adapter?.limits.maxBufferSize); // 2147483647

const device = await adapter?.requestDevice({
  requiredLimits: { maxBufferSize: adapter.limits.maxBufferSize! },
});
console.log(device); // GPUDevice with no error
```
I tried both 1.46.3 and 2.0.0
The blocking issue denoland/deno#22029 is fixed upstream. I will check whether it works in the latest version of Deno.
Seems the fix is not yet in the latest (canary) version.
@fs-eire I think the fix is in the latest canary now.
Fix has been released in Deno 2.1.2 |
@fs-eire @ry

```ts
import { InferenceSession } from "npm:onnxruntime-web";

await InferenceSession.create(
  "https://huggingface.co/briaai/RMBG-1.4/resolve/main/onnx/model_quantized.onnx",
  {
    executionProviders: ["webgpu"],
  }
);
```

```
error: Uncaught (in promise) Error: no available backend found. ERR: [webgpu] backend not found.
    at resolveBackendAndExecutionProviders (file:///run/media/jlucaso/secondary/projects/webgpu/node_modules/.deno/[email protected]/node_modules/onnxruntime-common/dist/esm/backend-impl.js:120:15)
    at async Function.create (file:///run/media/jlucaso/secondary/projects/webgpu/node_modules/.deno/[email protected]/node_modules/onnxruntime-common/dist/esm/inference-session-impl.js:180:52)
    at async file:///run/media/jlucaso/secondary/projects/webgpu/main.ts:3:1
```
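To make this failure mode concrete: an earlier comment in the thread notes that the plain "onnxruntime-web" entry point does not register the WebGPU backend. A minimal simulation of that registry behavior (the function names below are hypothetical, not the real onnxruntime-common API) reproduces the same error shape:

```typescript
// Hypothetical sketch of a backend registry, mimicking the resolution
// failure above. Real onnxruntime-common internals differ.
const registeredBackends = new Set<string>();

function registerBackend(name: string): void {
  registeredBackends.add(name);
}

// Pick the first requested execution provider that has been registered.
function resolveBackend(requested: string[]): string {
  for (const name of requested) {
    if (registeredBackends.has(name)) return name;
  }
  throw new Error(
    `no available backend found. ERR: [${requested[0]}] backend not found.`,
  );
}

// Importing the default entry point registers only wasm-style backends...
registerBackend("wasm");

// ...so requesting only "webgpu" fails, matching the error above.
// Importing the "/webgpu" subpath would additionally register "webgpu".
```

If this model is right, switching the import to the "webgpu" export (as described earlier) is the thing to try before concluding Deno itself is at fault.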
Describe the feature request

Support for running ONNX models with WebGPU in Deno.

Describe scenario use case

Currently not working because of this error:

```
error: Uncaught (in promise) Error: no available backend found. ERR: [webgpu] backend not found.
```

Debug

```ts
console.log(await navigator.gpu.requestAdapter());
```