Inference session crashes using ONNX runtime. #20043
Comments
Could you please also share the code snippet that uses onnxruntime-web? It may be related to the error.
""" Must be run on an x86 host. Args:
import os import onnxruntime as rt os.environ["TIDL_RT_PERFSTATS"] = "1" if name == "main":
Sorry. I thought it was for ORT Web... Removing the 'web' tag.
Okay, can this issue be taken over?
Describe the issue
I have been customizing YOLOv8 in order to port it to hardware. I customized the SPPF block in the YOLOv8 architecture, changing the maxpool kernel size from 5×5 to 3×3, and then exported the model to ONNX. When I tried to create an inference session, I got the error below:
```
Warning : Couldn't find corresponding ioBuf tensor for onnx tensor with matching name
2024-03-22 12:05:15.362450533 [W:onnxruntime:, execution_frame.cc:835 VerifyOutputSizes] Expected shape from model of {1,5,8400} does not match actual shape of {} for output output0
```
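One possible cause worth checking (an assumption, not confirmed by the log): a stride-1 maxpool keeps its spatial size only when the padding matches the kernel, `pad = k // 2`. If the kernel was changed from 5×5 to 3×3 but the old padding was kept, downstream shapes no longer match. A minimal sketch of the standard output-size formula, with `pool_out` as a hypothetical helper name:

```python
# Output spatial size of a pooling layer:
#   out = floor((in + 2*pad - k) / stride) + 1
def pool_out(in_size: int, k: int, pad: int, stride: int = 1) -> int:
    return (in_size + 2 * pad - k) // stride + 1

# YOLOv8's SPPF pads with k // 2, so the spatial size is preserved:
print(pool_out(20, k=5, pad=5 // 2))  # 20 -- original 5x5 pool
print(pool_out(20, k=3, pad=3 // 2))  # 20 -- 3x3 pool with matching pad

# Keeping the old pad=2 with a 3x3 kernel changes the shape:
print(pool_out(20, k=3, pad=2))       # 22 -- mismatch for later layers
```

If the exported 3×3 maxpool still carries `pads` of 2, that would explain shape disagreements inside the graph.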
Onnx model link: https://drive.google.com/file/d/1wpfgRop6tscEQcmDTO9dfvagiv3GJrge/view?usp=sharing
ONNX Opset version 11
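To illustrate why a padding mismatch in SPPF is fatal at session time: SPPF concatenates the input with three successive pool outputs along the channel axis, so all four tensors must share the same H and W. A small simulation under the same assumption as above (hypothetical helper names, not code from the issue):

```python
# Spatial size after one stride-1 pool: out = in + 2*pad - k + 1
def pool_hw(hw, k, pad):
    return tuple(s + 2 * pad - k + 1 for s in hw)

# SPPF applies the same pool three times and concatenates all four tensors.
def sppf_shapes(hw, k, pad):
    shapes = [hw]
    for _ in range(3):
        hw = pool_hw(hw, k, pad)
        shapes.append(hw)
    return shapes

# 3x3 kernel but padding left at the 5x5 value of 2:
shapes = sppf_shapes((20, 20), k=3, pad=2)
print(shapes)                  # spatial sizes drift apart
print(len(set(shapes)) == 1)   # False -- Concat would fail at runtime
```

A runtime failure inside Concat could plausibly leave `output0` unproduced, matching the reported actual shape of `{}`.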