Describe the issue
I want to attach an extra attribute to each node of a simple MNIST ONNX model.
I'm able to add the custom attributes and save the model. However, when loading the model for inference, I get this error:
**Unrecognized attribute: CustomAttr for operator Conv**
I've attached a link to the model for reference.
https://drive.google.com/file/d/1votf5NdwU2YHamhCsyrEDi3nkSCBPBjZ/view?usp=sharing
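The attribute was added with something along these lines (a minimal sketch using the standard onnx Python helpers; the actual script isn't included here, so the file names and the attribute value are placeholders):

```python
import onnx
from onnx import helper

# Load the original MNIST model (path is a placeholder).
model = onnx.load("mnist.onnx")

# Attach a custom attribute to every node in the graph.
for node in model.graph.node:
    node.attribute.append(helper.make_attribute("CustomAttr", "some-value"))

# Saving succeeds, but ONNX Runtime rejects the model at session creation
# because "CustomAttr" is not part of any operator's schema.
onnx.save(model, "model.onnx")
```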
To reproduce
Load the model with:

```python
import onnxruntime

onnxruntime.InferenceSession("model.onnx")
```
Urgency
No response
Platform
Linux
OS Version
4.18.0
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.16.1
ONNX Runtime API
Python
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response
The short answer is that this can't be done, as it violates the ONNX spec/interface for the operators found in the MNIST model. The spec/interface for each operator can be found in the ONNX operator documentation. The error you see while loading the mutated model is complaining about exactly this: it doesn't recognize the new attribute because it isn't supported by the spec. Is there any reason why you need to add a custom attribute to every node in the model?
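To illustrate (a sketch, not code from this thread; it assumes the onnx Python package and placeholder file names), each operator's schema lists the only attributes it accepts, and spec-compliant places to carry extra per-node information are free-form fields such as `NodeProto.doc_string` or the model-level `metadata_props`:

```python
import onnx
from onnx import defs

# Conv's schema defines the only attributes ONNX recognizes for it;
# "CustomAttr" is not among them, which is why loading fails.
conv_schema = defs.get_schema("Conv")
print(sorted(conv_schema.attributes.keys()))
# e.g. ['auto_pad', 'dilations', 'group', 'kernel_shape', 'pads', 'strides']

# If the goal is only to carry extra per-node information, free-form fields
# that the spec already provides are not rejected at load time.
model = onnx.load("mnist.onnx")
for node in model.graph.node:
    node.doc_string = "CustomAttr=some-value"   # per-node annotation

entry = model.metadata_props.add()              # model-level key/value pair
entry.key = "CustomAttr"
entry.value = "some-value"

onnx.save(model, "model_annotated.onnx")
```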