[Mobile] onnxruntime-react-native app crashes during session.run #17623
Comments
The code crashes in session.run; it was tested on a POCO phone.
Just to confirm, are you using onnxruntime-react-native 1.16.0 (released a couple of hours ago)? Also, does the crash only happen on a real device, or does it happen in the simulator as well?
We did not test in the simulator, only on the POCO phone. We tried 1.14, 1.15, and 1.16 (released about five hours ago), and all of them crashed.
What input shape does the model require? Most often I've seen MNIST models take a 4D input with shape {1, 1, 28, 28}.
We did not check the exact input shape of the MNIST model, but other models whose input shapes we did know, e.g. (1, 3, 224, 224), also crashed in session.run. We also confirmed that if the input shape is wrong, the model raises an input-shape error instead.
When the input shape is wrong, the app does not crash; it only reports the shape mismatch error.
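For reference, here is a minimal plain-JavaScript sketch of building a zero-filled input buffer for a given shape, as described above. The shapes are the ones mentioned in this thread; the tensor construction via onnxruntime-react-native's `Tensor` class is shown only in a comment, since it requires the native module:

```javascript
// Sketch: build a zero-filled Float32Array matching a model's input shape.
// Shapes below are the ones discussed in this thread: an MNIST-style
// {1, 1, 28, 28} and an ImageNet-style {1, 3, 224, 224}.
function zeroInput(shape) {
  const size = shape.reduce((acc, dim) => acc * dim, 1);
  return new Float32Array(size); // Float32Array is zero-initialized
}

const mnistInput = zeroInput([1, 1, 28, 28]);      // 784 elements
const imagenetInput = zeroInput([1, 3, 224, 224]); // 150528 elements

// With onnxruntime-react-native, this buffer would then be wrapped, e.g.:
//   const tensor = new Tensor('float32', mnistInput, [1, 1, 28, 28]);
//   const results = await session.run({ input: tensor });
console.log(mnistInput.length, imagenetInput.length);
```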
I'm uploading the library in case it's a library problem.
I have the same issue as mentioned in #17541. The input shape is not the problem. I think there is a bug in the native code, as the crash comes from C.
I hope the problem is solved quickly! |
Can you try using the XNNPACK execution provider to see if that avoids the issue? It requires onnxruntime 1.16. Details on how to enable it are in this comment: #17541 (comment)
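A hedged sketch of what selecting the XNNPACK execution provider might look like. The `executionProviders` session option follows the approach in the linked comment; the model path and the commented-out session creation are placeholders, since they need the native onnxruntime-react-native module:

```javascript
// Session options selecting the XNNPACK execution provider (onnxruntime >= 1.16).
// Building the options object is plain JS; the actual session creation
// (commented out below) requires the native onnxruntime-react-native module.
const sessionOptions = {
  executionProviders: ['xnnpack'],
};

// import { InferenceSession } from 'onnxruntime-react-native';
// const session = await InferenceSession.create(modelPath, sessionOptions);

console.log(sessionOptions.executionProviders[0]);
```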
Describe the issue
I am using onnxruntime-react-native. The app crashes during session.run.
This happens consistently with any ORT model and across multiple versions.
We created a blank (zero-filled) array matching the model's input shape and tried many of the models released in the ORT format.
The problem is that the app shuts down as soon as inference runs, without reporting any error.
I want to know the solution.
To reproduce
Any ORT inference code triggers the crash.
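A sketch of the reproduction pattern described in this report (the input name and the commented-out ORT calls are placeholders). Note that a crash in the native layer terminates the process before any JavaScript `catch` can fire, which matches the "app turns off without errors" behavior:

```javascript
// Reproduction pattern (hypothetical names). A crash in the native C/C++
// layer kills the process outright, so the catch block never runs --
// matching the "app shuts down without errors" behavior in this report.
async function reproduce(session, shape) {
  const size = shape.reduce((a, b) => a * b, 1);
  const data = new Float32Array(size); // blank (zero-filled) input
  try {
    // const input = new Tensor('float32', data, shape);   // onnxruntime-react-native
    // const output = await session.run({ input });        // crashes here per the report
  } catch (e) {
    // Never reached when the crash happens in native code.
    console.error(e);
  }
  return data.length;
}
```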
Urgency
No response
Platform
Android
OS Version
1.16
ONNX Runtime Installation
Built from Source
Compiler Version (if 'Built from Source')
No response
Package Name (if 'Released Package')
None
ONNX Runtime Version or Commit ID
1.16
ONNX Runtime API
JavaScript
Architecture
ARM64
Execution Provider
Default CPU
Execution Provider Library Version
No response