
[Graph optimization] INVALID_GRAPH error returned by InferenceSession with offline optimization #23022

Open
jenchaine opened this issue Dec 5, 2024 · 2 comments
Labels
more info needed issues that cannot be triaged until more information is submitted by the original user

Comments

@jenchaine

Describe the issue

We tried to use offline optimization to reduce session startup time, as described here:
https://onnxruntime.ai/docs/performance/model-optimizations/graph-optimizations.html

However, creating an InferenceSession from the offline-optimized model fails with an INVALID_GRAPH error, and onnx.checker.check_model reports the same error.
Online optimization (applying ORT_ENABLE_ALL at session creation time) works fine.

To reproduce

  1. Serialize the optimized model:

import onnxruntime

options = onnxruntime.SessionOptions()
options.optimized_model_filepath = "optimized.onnx"
options.graph_optimization_level = onnxruntime.GraphOptimizationLevel.ORT_ENABLE_ALL

sess = onnxruntime.InferenceSession("model.onnx", providers=["CPUExecutionProvider"], sess_options=options)

  2. Load the optimized model with optimization disabled:

options = onnxruntime.SessionOptions()
options.graph_optimization_level = onnxruntime.GraphOptimizationLevel.ORT_DISABLE_ALL

sess = onnxruntime.InferenceSession("optimized.onnx", providers=["CPUExecutionProvider"], sess_options=options)

The following error will be returned:

10 : INVALID_GRAPH : Load model from optimized.onnx failed:This is an invalid model. In Node, ("", If, "", -1) : ("if_node": tensor(bool),) -> ("if_out",) , Error Graph must be in single static assignment (SSA) form, however '_inlfunc_SequenceMap_token_0_SequenceMap_out_sequence_0_seqempty' has been used as output names multiple times.
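
For reference, a minimal sketch of how the checker was invoked (assuming the "optimized.onnx" file written in step 1; per the report above, it raises the same SSA error):

import onnx

# Load the offline-optimized model and validate it with the ONNX checker.
# According to the report above, this fails with the same
# "single static assignment (SSA)" message as InferenceSession.
model = onnx.load("optimized.onnx")
onnx.checker.check_model(model)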

Urgency

We plan to deploy the model in Q2 2025.
With online graph optimization, it takes about 5 seconds to initialize the session, which is quite slow if we need to create the session frequently.

Platform

Windows

OS Version

10

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.19.2

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

@tianleiwu tianleiwu added the more info needed issues that cannot be triaged until more information is submitted by the original user label Dec 10, 2024
@tianleiwu
Contributor

@jenchaine, please share the ONNX model so we can reproduce the issue.

@jenchaine
Author

Hello @tianleiwu,
Sorry, I cannot share the model due to confidentiality constraints.
I will avoid using SequenceMap (see issue #23024), so I should no longer hit this problem.
I have also noticed some strange behavior: the InferenceSession initialization is faster with optimization on, about 5 seconds (on) vs. 7 seconds (off).
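
For context, a minimal sketch of how such startup timings could be measured (the timing helper below is illustrative and not from the original report; "model.onnx" is the input model from the repro steps):

import time
import onnxruntime

def time_session_init(level):
    # Measure how long InferenceSession creation takes at a given optimization level.
    opts = onnxruntime.SessionOptions()
    opts.graph_optimization_level = level
    start = time.perf_counter()
    onnxruntime.InferenceSession("model.onnx",
                                 providers=["CPUExecutionProvider"],
                                 sess_options=opts)
    return time.perf_counter() - start

print("ORT_ENABLE_ALL :", time_session_init(onnxruntime.GraphOptimizationLevel.ORT_ENABLE_ALL))
print("ORT_DISABLE_ALL:", time_session_init(onnxruntime.GraphOptimizationLevel.ORT_DISABLE_ALL))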
