
[Web] Running ORT model results in NaN values output #19491

Open

nezaBacar opened this issue Feb 11, 2024 · 4 comments
Labels
platform:web — issues related to ONNX Runtime web; typically submitted using template
stale — issues that have not been addressed in a while; categorized by a bot

Comments

@nezaBacar

Describe the issue

Hi, when I attempt to run inference with a .ort model, I turn on the flags recommended in this issue: #13445 (comment).

const sessionOption = {
    executionProviders: ["wasm"],
    enableMemPattern: false,
    enableCpuMemArena: false,
    extra: {
      session: {
        disable_prepacking: "1",
        use_device_allocator_for_initializers: "0",
        use_ort_model_bytes_directly: "1",
        use_ort_model_bytes_for_initializers: "1"
      }
    }
};

Leaving these flags off results in an error, while turning them on produces an output consisting entirely of NaN values.
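For completeness, here is a sketch of how these options would be passed when creating the session. It assumes onnxruntime-web is loaded as `ort` and that `"model.ort"` is the model path; both names are illustrative assumptions, not details from the report.

```javascript
// Sketch only: the session options described above, assuming onnxruntime-web
// is available as `ort` and "model.ort" is the model path (both assumed).
const sessionOption = {
  executionProviders: ["wasm"],
  enableMemPattern: false,
  enableCpuMemArena: false,
  extra: {
    session: {
      disable_prepacking: "1",
      use_device_allocator_for_initializers: "0",
      use_ort_model_bytes_directly: "1",
      use_ort_model_bytes_for_initializers: "1",
    },
  },
};

// const session = await ort.InferenceSession.create("model.ort", sessionOption);
```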

To reproduce

Run an inference session with this model: https://drive.google.com/drive/folders/12tOtPWpANIlCPrsDMHSqkgxdhM0gwW88?usp=sharing and these flags:

const sessionOption = {
    executionProviders: ["wasm"],
    enableMemPattern: false,
    enableCpuMemArena: false,
    extra: {
      session: {
        disable_prepacking: "1",
        use_device_allocator_for_initializers: "0",
        use_ort_model_bytes_directly: "1",
        use_ort_model_bytes_for_initializers: "1"
      }
    }
};

Urgency

It is somewhat urgent

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.16

Execution Provider

'wasm'/'cpu' (WebAssembly CPU)

@nezaBacar nezaBacar added the platform:web issues related to ONNX Runtime web; typically submitted using template label Feb 11, 2024
@nezaBacar nezaBacar changed the title [Web] Running ORT model results in NaN values after running inference [Web] Running ORT model results in NaN values output Feb 11, 2024
@fs-eire
Contributor

fs-eire commented Feb 16, 2024

Could you share the exact input data that you feed into the model, so that I can try to reproduce the problem?

@nezaBacar
Author

Thanks for the fast response! I've added you to the demo repository. Just drop the model into the /demo/models folder and make sure that line 20 in index.html points to model.ort. I noticed that if I run the .ort model with these flags:

let sessionOptions = {
    executionProviders: ["wasm"],
    graphOptimizationLevel: "all",
};

the outputs are OK. Should I maybe do it this way?
(I thought this is not the right way because of this comment: github.com//issues/13445#issuecomment-1430153341)

@fs-eire
Contributor

fs-eire commented Feb 21, 2024

I can reproduce the problem, and I checked the session options. If I remove use_ort_model_bytes_for_initializers: "1", the output values are correct.

It took me a while to figure out why. From the comments in the source code, specifying use_ort_model_bytes_for_initializers requires the model buffer to remain valid for the whole life cycle of the inference session. Unfortunately, this is not how onnxruntime-web manages the model data: onnxruntime-web frees the model data after the inference session is initialized. So using onnxruntime-web with the use_ort_model_bytes_for_initializers config does not work.
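As a quick sanity check on the caller's side, a tiny helper can count how many NaN entries an output buffer contains. This is a hypothetical helper, not part of onnxruntime-web; it assumes the tensor data is exposed as a Float32Array (e.g. via the tensor's `data` property after running the session).

```javascript
// Hypothetical helper (not part of onnxruntime-web): count NaN entries in a
// tensor's backing buffer, e.g. the Float32Array returned by session.run().
function countNaNs(data) {
  let count = 0;
  for (const v of data) {
    if (Number.isNaN(v)) count += 1;
  }
  return count;
}

// Example with a buffer containing two NaN entries:
const sample = new Float32Array([0.5, NaN, 1.2, NaN]);
console.log(countNaNs(sample)); // → 2
```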


This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale issues that have not been addressed in a while; categorized by a bot label Mar 22, 2024