
[Web] Undescriptive error when calling run() on a model #17964

Closed
nezaBacar opened this issue Oct 16, 2023 · 3 comments

Describe the issue

Hello, I am trying to load and run an ORT UNet model (1.16 GB) using onnxruntime-web. I am able to create the InferenceSession, but when I call run() with the required inputs, a non-descriptive error in the form of a large number is thrown (for example: 2097581304), and I don't know what it means.

To reproduce

  1. Load the ORT model from https://drive.google.com/drive/folders/1-Gxvogz9kN7r4vscOd8UCgTMQOIVOH65?usp=sharing, and also download imported_latents and imported_prompt_embeds.
  2. Create the session and pass the inputs; latents_ (imported_latents.js) and prompt_embeds_ (imported_prompt_embeds.js) should be imported from the downloaded files.

export const OPTIONS_CPU = { executionProviders: ['wasm'], graphOptimizationLevel: 'all' };
let UNET = 'path_to_unet';
let unet_session = await ort.InferenceSession.create(UNET, OPTIONS_CPU);
const HEIGHT = 512;
const WIDTH = 512;
let latents = new ort.Tensor('float32', latents_, [1, 4, HEIGHT/8, WIDTH/8]);
let prompt_embeds = new ort.Tensor('float32', prompt_embeds_, [1, 77, 768]);
let time_step = 999;
let noise_prediction = await unet_session.run({ "sample": latents, "timestep": new ort.Tensor('float32', [time_step]), "encoder_hidden_states": prompt_embeds });
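
The error only shows up as a bare number. As a minimal sketch of how it surfaces (a hypothetical wrapper around the run() call above):

try {
  noise_prediction = await unet_session.run({ "sample": latents, "timestep": new ort.Tensor('float32', [time_step]), "encoder_hidden_states": prompt_embeds });
} catch (e) {
  // e is just a large integer here, e.g. 2097581304, with no message attached
  console.error('run() failed with:', e);
}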

I also tried to improve performance by applying the session options below:
export const OPTIONS_CPU = {
  executionProviders: ["wasm"],
  graphOptimizationLevel: 'all',
  enableMemPattern: false,
  enableCpuMemArena: false,
  extra: {
    session: {
      disable_prepacking: "1",
      use_device_allocator_for_initializers: "0",
      use_ort_model_bytes_directly: "1",
      use_ort_model_bytes_for_initializers: "1"
    }
  }
};

Applying use_ort_model_bytes_for_initializers actually allowed the inference to execute, but the output values were all NaN.
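
For completeness, a minimal sketch of how the outputs can be checked for NaN (assuming noise_prediction is the result map returned by run(); the output names are whatever the model defines):

for (const [name, tensor] of Object.entries(noise_prediction)) {
  let nanCount = 0;
  // tensor.data is a Float32Array for float32 outputs
  for (const v of tensor.data) if (Number.isNaN(v)) nanCount++;
  console.log(name + ': ' + nanCount + ' of ' + tensor.data.length + ' values are NaN');
}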

I also tried this with the ONNX version of the model, but then I get this error when calling run():
RangeError: offset is out of bounds
at Uint8Array.set (<anonymous>)
at t.run (wasm-core-impl.ts:219:1)
at t.run (proxy-wrapper.ts:212:1)
at t.OnnxruntimeWebAssemblySessionHandler.run (session-handler.ts:83:1)
at i.run (inference-session-impl.js:94:1)
at Pipeline.predict_noise (pipeline.js:57:1)
at Object.testUnet (localStore.js:70:1)

Urgency

It is somewhat urgent.

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.16

Execution Provider

'wasm'/'cpu' (WebAssembly CPU)

nezaBacar added the platform:web label on Oct 16, 2023
nezaBacar changed the title from "[Web] Error while calling run() on a model" to "[Web] Undescriptive error when calling run() on a model" on Oct 16, 2023

dakenf (Contributor) commented Oct 17, 2023

This is either an out-of-memory issue or an error caused by an old Emscripten compiler version that returned negative values for memory addresses above 2 GB. The first cannot be fixed without 64-bit wasm support (release date unknown); the second might be fixed in the next release.

You can try the webgpu execution provider, since it uses less RAM by allocating the weights on the GPU.
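
A minimal sketch of what that could look like (assuming a build of onnxruntime-web with WebGPU support and a browser that exposes WebGPU; UNET is the same model path as in the repro above):

// ask for the webgpu execution provider first, falling back to wasm if it is unavailable
const OPTIONS_GPU = { executionProviders: ['webgpu', 'wasm'], graphOptimizationLevel: 'all' };
let unet_session = await ort.InferenceSession.create(UNET, OPTIONS_GPU);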

nezaBacar (Author) commented

Thank you for your answer; I will test again when the next release is out. Sadly, I cannot use webgpu, since some of the operators my model needs are not supported.

github-actions bot commented

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

github-actions bot added the stale label on Jan 11, 2024