
Onnxruntime LoadLibrary failed with error 126 #21501

Open
Sumphy-ai opened this issue Jul 25, 2024 · 7 comments
Labels
ep:CUDA: issues related to the CUDA execution provider
stale: issues that have not been addressed in a while; categorized by a bot

Comments

@Sumphy-ai

Describe the issue

When I'm using the Stable Diffusion Auto1111 WebUI with the ControlNet IP-Adapter and ip-adapter-faceid-plus v2 (created by h94 on Hugging Face), I keep getting the following error message:

[E:onnxruntime:Default, provider_bridge_ort.cc:1745 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\sd.webui\system\python\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

PS: I'm not a dev and not a native English speaker; I followed tutorials and the solutions proposed here, but I may have made mistakes.
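A quick way to narrow this down outside the WebUI is to check, in the same Python environment the WebUI uses (here presumably the one under C:\sd.webui\system\python), whether the installed onnxruntime build even ships the CUDA provider. This is a minimal sketch; note that `get_available_providers()` lists the providers the build was compiled with, while the LoadLibrary error itself only surfaces once a session is actually created with the CUDA provider:

```python
def cuda_provider_status():
    """Report whether the installed onnxruntime build includes the CUDA
    execution provider at all, without involving the WebUI."""
    try:
        import onnxruntime as ort
    except ImportError:
        return "not-installed"
    if "CUDAExecutionProvider" in ort.get_available_providers():
        return "cuda-available"  # GPU build installed; DLLs still must load
    return "cpu-only"

print(cuda_provider_status())
```

If this prints "cpu-only", the CPU-only `onnxruntime` package is installed instead of `onnxruntime-gpu`, which is a different problem than missing CUDA DLLs.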

To reproduce

Launch the Stable Diffusion Auto1111 WebUI and generate an 832x1216 px image using the JuggernautX model and the DPM++ 2M Karras sampler, with CFG 4.5 and 30 steps.
Activate the ControlNet IP-Adapter and load any preprocessor together with ip-adapter-faceid-plusv2_sdxl.
Start the generation.

Urgency

The issue is not that urgent since no client is involved, but it is blocking my project's development.

Platform

Windows

OS Version

10

ONNX Runtime Installation

Other / Unknown

ONNX Runtime Version or Commit ID

CUDA 12

ONNX Runtime API

Python

Architecture

Other / Unknown

Execution Provider

CUDA

Execution Provider Library Version

No response

@github-actions github-actions bot added ep:CUDA issues related to the CUDA execution provider platform:windows issues related to the Windows platform labels Jul 25, 2024
@skottmckay
Contributor

Follow the installation instructions for CUDA and cuDNN very very precisely. Make sure the libraries from both are available in the PATH that is set when the load failure occurs.

You MUST use the instructions for the specific version of CUDA and cuDNN. e.g. some versions of cuDNN require zlibwapi.dll to be installed.
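One concrete way to verify the PATH point is to scan every PATH entry for the DLLs the CUDA provider needs. A minimal sketch; the DLL names listed are typical examples for a CUDA 12 / cuDNN 9 setup and may differ for your versions and onnxruntime build:

```python
import os

def find_dlls_on_path(dll_names, search_path=None):
    """Map each DLL name to the first PATH directory containing it,
    or None if no directory on the search path has it."""
    dirs = (search_path if search_path is not None
            else os.environ.get("PATH", "")).split(os.pathsep)
    found = {name: None for name in dll_names}
    for d in dirs:
        try:
            entries = {e.lower() for e in os.listdir(d)}
        except OSError:
            continue  # skip empty, unreadable, or nonexistent PATH entries
        for name in dll_names:
            if found[name] is None and name.lower() in entries:
                found[name] = d
    return found

# Example DLL names only -- the exact set depends on your CUDA/cuDNN
# versions and on which onnxruntime-gpu build you installed.
required = ["cudart64_12.dll", "cublas64_12.dll", "cudnn64_9.dll"]
for name, where in find_dlls_on_path(required).items():
    print(f"{name}: {where or 'NOT FOUND on PATH'}")
```

Any "NOT FOUND" line points at a library whose directory needs to be added to the PATH seen by the process that fails.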

@Sumphy-ai
Author


Thank you for your quick answer; I'm going to reinstall them very carefully.

About the PATH: is it related to the parameters in the .bat file that launches the app, or should I just check whether the required files are present in the folder indicated in the error message?

@Sumphy-ai
Author

Here is some feedback: I followed the instructions to install compatible versions of CUDA and cuDNN, and even TensorRT, since some people said it was required and I didn't have it.

But the problem remains, and I now get this message:

D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:891 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

So I guess I installed the wrong versions (which I thought were the right ones according to the requirements doc...).

How can I be sure which versions go with my program?
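A first step toward answering that is to collect the two facts the requirements page compares: the installed onnxruntime version and the CUDA toolkit that CUDA_PATH points at. A small sketch:

```python
import os

def report_environment():
    """Gather the version facts needed to compare against the table on
    the ONNX Runtime CUDA execution provider requirements page."""
    info = {"CUDA_PATH": os.environ.get("CUDA_PATH")}
    try:
        import onnxruntime as ort
        info["onnxruntime"] = ort.__version__
    except ImportError:
        info["onnxruntime"] = None  # onnxruntime not installed in this env
    return info

for key, value in report_environment().items():
    print(f"{key}: {value}")
```

Running this inside the WebUI's own Python environment rules out the common case where the versions were checked in a different environment than the one that actually fails.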

@sophies927 sophies927 added model:transformer issues related to a transformer model: BERT, GPT2, Hugging Face, Longformer, T5, etc. and removed platform:windows issues related to the Windows platform model:transformer issues related to a transformer model: BERT, GPT2, Hugging Face, Longformer, T5, etc. labels Jul 25, 2024
@Noor-Nizar

Noor-Nizar commented Jul 27, 2024


I'm also facing the same issue #21527

@GilbertPan97

I encountered the same issue, but strangely, the same environment works fine on a Linux system. Is there any way to force it to use the GPU for inference, so I can determine which DLL is missing?
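Pinning the session to the CUDA provider only should do exactly that: onnxruntime then raises the load error instead of silently falling back to CPU, and the exception text usually carries the missing-DLL detail. A sketch, where model.onnx stands in for any ONNX model file you have:

```python
def try_cuda_session(model_path):
    """Create a session pinned to the CUDA provider only, so the real
    load failure surfaces instead of a silent fallback to CPU."""
    try:
        import onnxruntime as ort
    except ImportError:
        return "onnxruntime not installed"
    try:
        ort.InferenceSession(model_path,
                             providers=["CUDAExecutionProvider"])
        return "cuda session created"
    except Exception as exc:
        # The exception text carries the LoadLibrary / missing-DLL detail.
        return f"failed: {exc}"

print(try_cuda_session("model.onnx"))  # model.onnx is a placeholder path
```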



This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale issues that have not been addressed in a while; categorized by a bot label Aug 27, 2024
@Sola-AIGithub

Sorry, I'm having the same problem. I installed the following software versions:

Microsoft.ML.OnnxRuntime.Gpu: v1.20.0
CUDA: 11.8
cuDNN: 8.9.7.29_cuda11-archive


Is there any update on a solution here?
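For that combination specifically: as I read the CUDA execution provider requirements page, the default 1.20 GPU packages target CUDA 12.x and cuDNN 9.x, so CUDA 11.8 with cuDNN 8.9 would be the mismatch here. A small sketch of that comparison; the expected pairings below are my reading of the page and should be verified there, not treated as authoritative:

```python
def parse_major_minor(version):
    """Reduce a version string like '1.20.0' to '1.20'."""
    return ".".join(version.split(".")[:2])

# Expected pairings per onnxruntime release, as I read the CUDA
# execution provider requirements page -- assumptions to verify there.
EXPECTED = {
    "1.20": {"cuda": "12", "cudnn": "9"},
    "1.18": {"cuda": "11", "cudnn": "8"},
}

def check(ort_version, cuda_version, cudnn_version):
    """Return a list of mismatches, or a short OK message if none."""
    expected = EXPECTED.get(parse_major_minor(ort_version))
    if expected is None:
        return "version not in table; check the requirements page"
    problems = []
    if not cuda_version.startswith(expected["cuda"]):
        problems.append(f"CUDA {cuda_version} (expected {expected['cuda']}.x)")
    if not cudnn_version.startswith(expected["cudnn"]):
        problems.append(f"cuDNN {cudnn_version} (expected {expected['cudnn']}.x)")
    return problems or "looks compatible"

# The combination reported above:
print(check("1.20.0", "11.8", "8.9.7"))
```

On those assumptions, the fix would be either upgrading to CUDA 12.x + cuDNN 9.x, or downgrading to an onnxruntime release built against CUDA 11.8.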

6 participants