First of all, thanks for the great work, I appreciate it!
I am asking for general guidance on debugging a problem that I cannot reproduce with an MCVE.
We have an ONNX model which runs fine on Intel (Debian 11, a C7I instance to be exact), but when moving to Graviton (Debian 11, ARM64, a C7G instance to be exact), the model fails with: ORT_FAIL - message: Type Error: Type parameter (T) of Optype (Min) bound to different types (tensor(float) and tensor(int64) in node (model_2/safe_relu/clip_by_value/Minimum).
Unfortunately, my attempts so far to reproduce it (minimal code in a JAR, running the JAR on a C7G instance) have failed.
I haven't found a way to debug models at runtime (such as accessing the deserialized graph).
What is the usual way to investigate this kind of issue? Is a minimal reproducer the only way to isolate the bug, or are there tools to investigate at runtime?
To reproduce
Unfortunately, I have not managed to build a reproducer (yet).
I am looking for general guidance here.
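One runtime investigation tool worth mentioning here: ONNX Runtime's own logging. In the Java API the environment can be created with a verbose log level, which makes graph transformations and type/shape inference steps visible during session creation. A minimal sketch, assuming the `ai.onnxruntime` package and a hypothetical model path `model.onnx`:

```java
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtException;
import ai.onnxruntime.OrtLoggingLevel;
import ai.onnxruntime.OrtSession;

public class DebugLogging {
    public static void main(String[] args) throws OrtException {
        // The environment-level log level applies to everything ORT prints;
        // VERBOSE shows what runs during graph resolution and optimization.
        OrtEnvironment env = OrtEnvironment.getEnvironment(
                OrtLoggingLevel.ORT_LOGGING_LEVEL_VERBOSE, "type-debug");
        OrtSession.SessionOptions opts = new OrtSession.SessionOptions();
        // If the type mismatch occurs at graph resolve time, the failure
        // should surface here rather than at the first inference call.
        try (OrtSession session = env.createSession("model.onnx", opts)) {
            System.out.println("Inputs:  " + session.getInputInfo());
            System.out.println("Outputs: " + session.getOutputInfo());
        }
    }
}
```

Capturing this output on both the Intel and the Graviton machine and diffing the two logs can narrow down where the graphs diverge.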
You can use Netron to visualize the ONNX model, find the node by its name (model_2/safe_relu/clip_by_value/Minimum), and see where its inputs come from. That helps locate where the model fails. If it works on Intel and fails on Graviton, it looks like a bug in onnxruntime. Based on the error message, it fails in the method Graph::InferAndVerifyTypeMatch in core/graph/graph.cc. Operator Min expects its inputs to have the same type, but for some reason they don't: one of its inputs has a different type on Graviton than on Intel (where it works). You can enable the logs to see whether any warnings pop up and whether they differ between the two machines.
Hello Xavier,
thank you very much for your answer.
After a few more iterations, I found that the bug only appears when graph optimization is enabled (at least BASIC_OPT); it does not occur with NO_OPT.
I will iterate a few more times to extract a minimal reproducer and post it in a new issue.
From what I understand, the model indeed fails at inference time and not at load time, right?
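Given that the failure depends on the optimization level, one way to localize it is to pin the level per session and dump the optimized graph to disk, so the Intel and Graviton versions can be compared in Netron. This is a sketch, not from the thread: `setOptimizedModelFilePath`, the `OptLevel` values, and the paths `model.onnx`/`optimized.onnx` are my assumptions about the `ai.onnxruntime` API rather than something confirmed above.

```java
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtException;
import ai.onnxruntime.OrtSession;

public class CompareOptimizedGraphs {
    public static void main(String[] args) throws OrtException {
        OrtEnvironment env = OrtEnvironment.getEnvironment();
        OrtSession.SessionOptions opts = new OrtSession.SessionOptions();
        // BASIC_OPT is the level that reproduces the failure here;
        // NO_OPT is the known-good baseline to compare against.
        opts.setOptimizationLevel(OrtSession.SessionOptions.OptLevel.BASIC_OPT);
        // Write the post-optimization graph to disk. Running this on both
        // architectures and diffing the two files (e.g. in Netron) should
        // show which transformer rewrote the inputs of the Min node.
        opts.setOptimizedModelFilePath("optimized.onnx");
        try (OrtSession session = env.createSession("model.onnx", opts)) {
            System.out.println("Session created; optimized graph written.");
        }
    }
}
```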
You can enable the logs to see whether any warnings pop up and whether they differ between the two machines.
Sorry for the newbie question here, but do you mean ORT_LOGGING_LEVEL_WARNING?
And again, thank you for your answer, it has been helpful!
Feel free to close this if you wish; I can open another issue once I find a reproducer.
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
Urgency
No response
Platform
Linux
OS Version
Debian 11
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
ef2b2169d30994c5160504e3c7b16f2bb031150c1ae9f52f22d7afdc6911a538
ONNX Runtime API
Java
Architecture
ARM64
Execution Provider
Default CPU
Execution Provider Library Version
No response