CUDA wasn't able to be loaded when trying to use CUDA 12.4 and onnxruntime-gpu 11.8 #21049
Labels: ep:CUDA (issues related to the CUDA execution provider), platform:windows (issues related to the Windows platform)
Describe the issue
Simply calling
`onnxruntime.InferenceSession(self.__graph_path, providers=['CUDAExecutionProvider'])`
fails. I get these errors:
```
2024-06-15 03:51:08.7326342 [E:onnxruntime:Default, provider_bridge_ort.cc:1744 onnxruntime::TryGetProviderInfo_CUDA] C:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
*************** EP Error ***************
EP Error C:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:866 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
when using ['CUDAExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
2024-06-15 03:51:08.7616051 [E:onnxruntime:Default, provider_bridge_ort.cc:1744 onnxruntime::TryGetProviderInfo_CUDA] C:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
Traceback (most recent call last):
File "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: C:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:866 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\usr\Desktop\TopTech!WOWS_W.O.L.F\OCR\test.py", line 6, in
ocr = ddddocr.DdddOcr(show_ad=False,use_gpu=True) # 切换为第二套ocr模型
File "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\ddddocr_init_.py", line 2419, in init
self.__ort_session = onnxruntime.InferenceSession(self.__graph_path, providers=self.__providers)
File "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in init
raise fallback_error from e
File "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 427, in init
self._create_inference_session(self._fallback_providers, None)
File "c:\Users\usr\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: C:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:866 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
```
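LoadLibrary error 126 ("the specified module could not be found") can be raised even when onnxruntime_providers_cuda.dll itself exists, if one of its dependencies (cuDNN or a CUDA runtime DLL) cannot be resolved. A minimal diagnostic sketch, assuming the same Python 3.10 install, to confirm the provider DLL is actually on disk:

```python
# Hedged diagnostic sketch: verify the CUDA provider DLL is present.
# If it exists but still fails with error 126, a dependency (e.g. cuDNN)
# is most likely the missing piece.
import os
import onnxruntime

capi_dir = os.path.join(os.path.dirname(onnxruntime.__file__), "capi")
dll_path = os.path.join(capi_dir, "onnxruntime_providers_cuda.dll")
print(dll_path, "exists" if os.path.isfile(dll_path) else "MISSING")
```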
CUDA working (`nvcc --version`):
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Wed_Nov_22_10:30:42_Pacific_Standard_Time_2023
Cuda compilation tools, release 12.3, V12.3.107
Build cuda_12.3.r12.3/compiler.33567101_0
Drivers working:
```
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 546.12 Driver Version: 546.12 CUDA Version: 12.3 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name TCC/WDDM | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA GeForce RTX 3080 ... WDDM | 00000000:01:00.0 Off | N/A |
| N/A 59C P3 24W / 115W | 1425MiB / 16384MiB | 53% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| 0 N/A N/A 10668 C+G ...s\System32\Kinect\KinectService.exe N/A |
| 0 N/A N/A 22068 C+G ...8517795\bin64\WorldOfWarships64.exe N/A |
| 0 N/A N/A 23188 C+G ...in\8517795\bin64\cef_subprocess.exe N/A |
+---------------------------------------------------------------------------------------+
```
The available providers also look good:
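For reference, a minimal sketch (assuming the same environment) that prints what the installed onnxruntime build reports before any session is created:

```python
# Hedged sketch: confirm the installed build and the providers it advertises.
import onnxruntime

print(onnxruntime.__version__)                # installed onnxruntime-gpu version
print(onnxruntime.get_device())               # "GPU" for the GPU-enabled build
print(onnxruntime.get_available_providers())  # should include 'CUDAExecutionProvider'
```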
To reproduce
Install onnxruntime-gpu with
`pip install onnxruntime-gpu --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/`
as described at https://onnxruntime.ai/docs/install/, together with CUDA 12.4, then execute:
`onnxruntime.InferenceSession(<graph>, providers=['CUDAExecutionProvider'])`
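If the provider DLL exists but its dependencies are not on the DLL search path, explicitly registering the CUDA and cuDNN directories before creating the session can help. This is a workaround sketch, not a confirmed fix; the paths below are assumptions and must be adjusted to the actual install locations, and "model.onnx" stands in for the real graph path:

```python
# Hedged workaround sketch (paths and "model.onnx" are illustrative assumptions):
# make the CUDA / cuDNN DLL directories visible before onnxruntime tries to
# load onnxruntime_providers_cuda.dll.
import os

for d in (
    r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin",  # assumed CUDA bin
    r"C:\Program Files\NVIDIA\CUDNN\v9\bin",                          # assumed cuDNN bin
):
    if os.path.isdir(d):
        os.add_dll_directory(d)  # Python 3.8+ DLL search path on Windows
        os.environ["PATH"] = d + os.pathsep + os.environ.get("PATH", "")

import onnxruntime

sess = onnxruntime.InferenceSession("model.onnx", providers=["CUDAExecutionProvider"])
print(sess.get_providers())  # check whether the CUDA EP actually loaded
```

Which cuDNN major version the build expects depends on the installed onnxruntime-gpu release; the requirements page linked in the error message lists the supported combinations.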
Urgency
No response
Platform
Windows
OS Version
Windows 11 21H2
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
onnxruntime-gpu 11.8
ONNX Runtime API
Python
Architecture
X64
Execution Provider
CUDA
Execution Provider Library Version
CUDA 12.4