Error inferring YOLOv8 (node_arg.cpp failure) with Vitis AI provider #17534
Labels
ep:VitisAI (issues related to Vitis AI execution provider)
quantization (issues related to quantization)
stale (issues that have not been addressed in a while; categorized by a bot)
Describe the issue
Hello,
I have a quantized YOLOv8 model that I want to run on a Kria KV260 target with the DPUCZDX8G-B4096 DPU using the Vitis AI execution provider. During quantization, I excluded the node types that are not supported by the DPU architecture: Reshape, Resize, Slice, Split, Div, and Sub. The model runs successfully with ONNX Runtime on the CPU of my local laptop. However, when I try to run it on the target, it fails with: node_arg.cpp:329, unknown type:2, check failure stack trace, Aborted. I switched the logging to verbose to get more information and have attached the log output. I also tried to inspect node_arg.cpp for clues, but could not find anything useful.
I would appreciate help debugging this. I have attached the quantized model and the inference script for reference.
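For context, here is a minimal sketch of how the unsupported nodes were excluded during quantization, assuming the standard onnxruntime.quantization API; the attached script and the actual Vitis AI quantizer flow may differ, and the node names, input name, and input shape below are placeholders.

```python
# Minimal sketch: static quantization with DPU-unsupported nodes excluded.
# Assumes onnxruntime.quantization.quantize_static; node names are hypothetical.
import numpy as np
from onnxruntime.quantization import (
    CalibrationDataReader, QuantFormat, QuantType, quantize_static,
)

class DummyDataReader(CalibrationDataReader):
    """Feeds a few random images as calibration data (placeholder only)."""
    def __init__(self, input_name="images", count=8):
        self._batches = iter(
            [{input_name: np.random.rand(1, 3, 640, 640).astype(np.float32)}
             for _ in range(count)]
        )

    def get_next(self):
        return next(self._batches, None)

quantize_static(
    model_input="yolov8.onnx",
    model_output="yolov8_quant.onnx",
    calibration_data_reader=DummyDataReader(),
    quant_format=QuantFormat.QDQ,
    activation_type=QuantType.QUInt8,
    weight_type=QuantType.QInt8,
    # Keep node instances the DPU cannot execute in float so they fall back to CPU.
    nodes_to_exclude=["/model.22/Reshape", "/model.22/Split"],  # hypothetical names
)
```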
To reproduce
1. Move the quantized model to the kit.
2. Use the attached script to create a session (a minimal sketch of the session setup is shown below the attachment).
Yolov8.zip
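The sketch below illustrates session creation with the Vitis AI execution provider in Python-style pseudocode; the attached script may differ, and the model path, input shape, and config_file location are assumptions (config_file is typically the vaip_config.json shipped with the Vitis AI 3.0 board image).

```python
# Minimal sketch of session creation with the Vitis AI execution provider.
# Model path, input shape, and the config_file provider option are assumptions.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "yolov8_quant.onnx",
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"config_file": "/etc/vaip_config.json"}, {}],
)

# Run a dummy inference to trigger graph partitioning onto the DPU.
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```

The crash reported above occurs during session creation on the target, before any inference runs.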
Urgency
I am currently working on a project assessing the performance of FPGAs versus GPUs, and this issue is blocking me: the aim of this task is to estimate how the FPGA scales compared to the GPU.
Platform
Linux
OS Version
PetaLinux, Vitis AI 3.0
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.14.0
ONNX Runtime API
C++
Architecture
ARM64
Execution Provider
Vitis AI
Execution Provider Library Version
No response