
Does onnxruntime-gpu support CUDA 12.1? #19292

Closed
zhongqiu1245 opened this issue Jan 27, 2024 · 10 comments
Labels
ep:CUDA issues related to the CUDA execution provider

Comments

@zhongqiu1245

Describe the issue

Hello!
I use onnxruntime to run PTQ (static quantization), but my CPU processes keep getting killed.
So I want to use the GPU.
But when I set providers=['CUDAExecutionProvider'], I get the error 'Failed to create cuda provider'.
Does onnxruntime-gpu support CUDA 12.1 (pip install onnxruntime)?

To reproduce

sess = ort.InferenceSession('xxxxxx.onnx', providers=['CUDAExecutionProvider'])

'Failed to create CUDAExecutionProvider. Please reference https://onnxruntime.ai/docs/reference/execution-providers/CUDA-ExecutionProvider.html#requirements to ensure all dependencies are met.'
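As a workaround while the CUDA EP fails to load, the session can fall back to CPU instead of erroring out. A minimal sketch, assuming onnxruntime is installed; the `pick_providers` helper is hypothetical, not part of the onnxruntime API:

```python
# Hypothetical helper: keep only the preferred execution providers that are
# actually available, falling back to CPU if none of them are.
def pick_providers(preferred, available):
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]

# Usage with onnxruntime (commented out; assumes the package is installed):
# import onnxruntime as ort
# providers = pick_providers(
#     ["CUDAExecutionProvider", "CPUExecutionProvider"],
#     ort.get_available_providers(),
# )
# sess = ort.InferenceSession("xxxxxx.onnx", providers=providers)

if __name__ == "__main__":
    # If only the CPU provider is available, the helper falls back to it.
    print(pick_providers(["CUDAExecutionProvider"], ["CPUExecutionProvider"]))
```

This avoids the hard failure, at the cost of silently running on CPU when CUDA is misconfigured, so logging the chosen providers is a good idea.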

Urgency

Not really

Platform

Linux

OS Version

Ubuntu 22.04

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

onnxruntime 1.16.2

ONNX Runtime API

Python

Architecture

X64

Execution Provider

CUDA

Execution Provider Library Version

CUDA 12.1

@github-actions github-actions bot added the ep:CUDA issues related to the CUDA execution provider label Jan 27, 2024
@tianleiwu
Contributor

You will need to wait for the 1.17 release, which will have a CUDA 12 package.

You can test the release candidate build here:
ort-nightly-gpu: 1.17.0.dev20240118002
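For reference, a sketch of installing that nightly build; the ORT-Nightly feed URL is taken from the ONNX Runtime install docs and may change, so treat it as an assumption rather than a guaranteed endpoint:

```shell
# Remove any existing stable packages first to avoid mixing builds.
pip uninstall -y onnxruntime onnxruntime-gpu

# Install the release-candidate nightly GPU build from the ORT-Nightly feed
# (version pin is the build mentioned above).
pip install ort-nightly-gpu==1.17.0.dev20240118002 \
    --index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/
```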

@snnn
Member

snnn commented Jan 27, 2024

See also: #19137

@zhongqiu1245
Author

Thank you, guys!

@jarredou

1.17 still generates the same error with CUDA 12.2, while the nightly release was working fine.

@tianleiwu
Contributor

See https://onnxruntime.ai/docs/install/#requirements if you encounter issues with CUDA 12.x.

@jarredou

jarredou commented Mar 17, 2024

See https://onnxruntime.ai/docs/install/#requirements if you encounter issues with CUDA 12.x.

No. That "stable" release has never worked on Google Colab and generates:

2024-03-17 16:17:28.444178554 [E:onnxruntime:Default, provider_bridge_ort.cc:1548 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1209 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcublasLt.so.11: cannot open shared object file: No such file or directory

2024-03-17 16:17:28.444221332 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:861 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Please reference https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements to ensure all dependencies are met.

Installing the nightly release fixes that issue instantly.
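The `libcublasLt.so.11` in that log indicates the installed wheel was built against CUDA 11, which cannot load on a CUDA 12-only system. One way to check which CUDA libraries the CUDA EP actually links against is a diagnostic sketch like the following (package paths vary by environment; this assumes onnxruntime is importable):

```shell
# Locate the installed onnxruntime package directory.
ORT_DIR=$(python -c "import onnxruntime, os; print(os.path.dirname(onnxruntime.__file__))")

# List the CUDA/cuDNN libraries the CUDA execution provider depends on.
# A "not found" next to libcublasLt.so.11 reproduces the Colab failure above.
ldd "$ORT_DIR/capi/libonnxruntime_providers_cuda.so" | grep -E 'cublas|cudnn|cudart'
```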

@JohnHerry

Which version supports CUDA 12.1 + cuDNN 8.9?

@tianleiwu
Contributor

tianleiwu commented Sep 24, 2024

@JohnHerry,

1.18.0 supports CUDA 12 and cuDNN 8. See https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#cuda-12x

You can install that version like:

pip install onnxruntime-gpu==1.18.0 --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/

@JohnHerry

https://aiinfra.pkgs.visualstudio.com/PublicPackages/_p

Thank you very much for the help. By the way, I need to install from a local package. Where can I download this package?

@tianleiwu
Contributor

https://aiinfra.pkgs.visualstudio.com/PublicPackages/_p

Thank you very much for the help. By the way, I need to install from a local package. Where can I download this package?

Yes, you can download the .whl file (see https://onnxruntime.ai/docs/install/#python-installs for the wheel location), then pip install *.whl
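One way to do the offline install with pip's standard download/install flow, sketched below; the feed URL is the CUDA 12 index from the earlier comment, and the exact wheel filename depends on your platform and Python version:

```shell
# On a machine with internet access: fetch just the onnxruntime-gpu wheel
# (no dependencies) into a local directory.
pip download onnxruntime-gpu==1.18.0 --no-deps -d ./wheels \
    --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/

# On the target machine: install from the local wheel file.
pip install ./wheels/onnxruntime_gpu-1.18.0-*.whl
```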
