
[Inference] Could not find an implementation for groupnorm #20661

Closed
neonarc4 opened this issue May 12, 2024 · 3 comments
Labels
ep:oneDNN questions/issues related to DNNL EP ep:ROCm questions/issues related to ROCm execution provider stale issues that have not been addressed in a while; categorized by a bot

Comments


neonarc4 commented May 12, 2024

Describe the issue

I don't know what's going on here; I can't even run this simple example.

To reproduce

from optimum.onnxruntime import ORTStableDiffusionXLPipeline
pipeline = ORTStableDiffusionXLPipeline.from_pretrained("greentree/SDXL-olive-optimized")
2024-05-12 22:48:45.277336: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2024-05-12 22:48:46.160987: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From Z:\software\python11\Lib\site-packages\tf_keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.

Traceback (most recent call last):
  File "X:\sad\practice\test\aii\neo.py", line 259, in <module>
    pipeline = ORTStableDiffusionXLPipeline.from_pretrained("greentree/SDXL-olive-optimized")
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\optimum\onnxruntime\modeling_ort.py", line 669, in from_pretrained
    return super().from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\optimum\modeling_base.py", line 402, in from_pretrained
    return from_pretrained_method(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\optimum\onnxruntime\modeling_diffusion.py", line 337, in _from_pretrained
    vae_decoder, text_encoder, unet, vae_encoder, text_encoder_2 = cls.load_model(
                                                                   ^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\optimum\onnxruntime\modeling_diffusion.py", line 214, in load_model
    vae_decoder = ORTModel.load_model(vae_decoder_path, provider, session_options, provider_options)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\optimum\onnxruntime\modeling_ort.py", line 375, in load_model
    return ort.InferenceSession(
           ^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "Z:\software\python11\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.NotImplemented: [ONNXRuntimeError] : 9 : NOT_IMPLEMENTED : Could not find an implementation for GroupNorm(1) node with name 'GroupNorm_0'

Urgency

I don't know what to say.

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.17.3

PyTorch Version

2.3.0.dev20240122+cpu

Execution Provider

Other / Unknown

Execution Provider Library Version

amd

@neonarc4 neonarc4 added the training issues related to ONNX Runtime training; typically submitted using template label May 12, 2024
@github-actions github-actions bot added the ep:oneDNN questions/issues related to DNNL EP label May 12, 2024
@kshama-msft kshama-msft added ep:ROCm questions/issues related to ROCm execution provider and removed training issues related to ONNX Runtime training; typically submitted using template labels May 23, 2024
mindest (Contributor) commented May 24, 2024

Which execution provider are you using? Is it CPUExecutionProvider instead of ROCMExecutionProvider?

@mindest mindest changed the title [Training] Could not find an implementation for groupnorm [Inference] Could not find an implementation for groupnorm May 24, 2024
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale issues that have not been addressed in a while; categorized by a bot label Jun 24, 2024
mindest (Contributor) commented Jun 25, 2024

The default execution provider is CPUExecutionProvider. If you want to run on ROCM EP, try

from optimum.onnxruntime import ORTStableDiffusionXLPipeline
pipeline = ORTStableDiffusionXLPipeline.from_pretrained(
    "greentree/SDXL-olive-optimized", provider="ROCMExecutionProvider")

Please reopen if the issue persists.

@mindest mindest closed this as completed Jun 25, 2024