Hello, I'm working with two separate models: an encoder and a decoder. Individually, exporting either model works without any issues. However, I encountered a problem when trying to export a unified model that integrates both the encoder and the decoder. Below is the structure of my combined model:
class Model(nn.Module):
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder

    def forward(self, x):
        tmp = self.encoder(x)
        out = self.decoder(tmp)
        return out
Traceback (most recent call last):
File "onnx/test_save_all.py", line 66, in <module>
onnx_model = ort.InferenceSession("qcnet.onnx",
File "/home/shy/mydiskBig/miniforge3/envs/onnx/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/home/shy/mydiskBig/miniforge3/envs/onnx/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.NotImplemented: [ONNXRuntimeError] : 9 : NOT_IMPLEMENTED : Could not find an implementation for Cos(7) node with name '/encoder/Cos'
I have checked the ONNX model in Netron, and it seems OK. The PyTorch code is here:
The error means ONNX Runtime cannot find an implementation for the operator Cos at opset 7, which is the most recent version of this operator. Maybe the input type of the Cos operator falls outside the supported list. Are you running it on CPU, CUDA, ...? Can you share the instructions you used to convert your model to ONNX?
Hi, @xadupre
Sorry for the late reply. I have tried several different opset versions, but none of them worked. I am now running it on the CPU, and here are the instructions. By the way, my model includes two parts: an encoder and a decoder. The problem occurs with the cos operator in the encoder. When I change every cos to sin, the problem does not occur. The input types are all fp32.
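Since switching to sin avoids the error, one possible workaround (a sketch, not a real fix for the missing kernel) is the identity cos(x) = sin(x + π/2), so the exported graph contains `Sin` and `Add` nodes instead of `Cos`:

```python
import math
import torch
import torch.nn as nn

class CosViaSin(nn.Module):
    # cos(x) == sin(x + pi/2); exports as Add + Sin instead of Cos
    def forward(self, x):
        return torch.sin(x + math.pi / 2)

x = torch.randn(4)
print(torch.allclose(CosViaSin()(x), torch.cos(x), atol=1e-6))
```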
Describe the issue
To reproduce
Urgency
No response
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.16.3
PyTorch Version
2.1.1+cu118
Execution Provider
Default CPU
Execution Provider Library Version
cu118