Models with multiple outputs produce different results when the order of irrelevant lines are changed #18081
Labels
core runtime — issues related to core runtime
ep:tvm — issues related to TVM execution provider
stale — issues that have not been addressed in a while; categorized by a bot

Comments
A similar case, which also contains the operation
hariharans29 added the core runtime label and removed the ep:tvm label — Oct 26, 2023
Azyka changed the title from "Models with multiple outputs produce incorrect results when handling fp16 data" to "Models with multiple outputs produce different results when the order of irrelevant lines are changed" — Nov 1, 2023
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

github-actions bot added the stale label — Jan 4, 2024
Describe the issue
On opset version 14, when the nodes feeding multiple outputs are organized in a different order, models that should be equivalent produce different outputs in onnxruntime. The result is correct when only one of the outputs is kept, but goes wrong when both outputs are present. The divergence between the multiple outputs was observed when processing fp16 data, and it was not seen in TVM.
Generally, the bug seems to need several conditions:
To reproduce
Test models in ort with and without optimization:
I simply changed the order of these lines in the models from:
to:
and nothing should change in execution.
However, I got the following output:
Urgency
This is incorrect functional behavior. It may cause severe bugs in systems built on top of ORT.
Platform
Linux
OS Version
Ubuntu 22.04.3 LTS (x86_64)
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.15.1
ONNX Runtime API
Python
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response