Detecting CUDA but not using it #412

Comments
Your CUDA version is currently not supported; please see #400
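To see which CUDA version is actually installed (and what the driver supports), a quick check like the following can help; these commands assume the NVIDIA driver and, optionally, the CUDA toolkit are installed:

```shell
# Toolkit version, if the CUDA toolkit is installed
nvcc --version

# Driver version and the highest CUDA version the driver supports
nvidia-smi
```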
Will converting these ONNX models to a newer opset work?

```python
from pathlib import Path

import onnx
from onnx import version_converter

model_path = Path("models")
dest_path = Path("converted_models")

for model_ in model_path.iterdir():
    if str(model_).endswith("onnx"):
        dest_path_model = dest_path / model_.name
        print(dest_path_model)
        model = onnx.load(model_)
        # Convert each model to opset 11 and save it to the destination folder
        converted_model = version_converter.convert_version(model, 11)
        onnx.save(converted_model, dest_path_model)
```
I doubt it. The problem isn't the model but the runtime, which isn't up to date.
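A quick way to confirm whether the runtime (rather than the model) is the issue is to ask ONNX Runtime which execution providers it can actually use. This is a sketch assuming `onnxruntime` (or `onnxruntime-gpu`) is installed; if `CUDAExecutionProvider` is missing from the list, inference will silently fall back to CPU:

```python
# Check whether this ONNX Runtime build can use the GPU at all.
try:
    import onnxruntime as ort
    providers = ort.get_available_providers()
except ImportError:
    # onnxruntime not installed in this environment
    providers = []

gpu_ready = "CUDAExecutionProvider" in providers
print("Available providers:", providers)
print("CUDA usable:", gpu_ready)
```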
Somedev's roop is working on this same GPU. How?
I had a similar problem and fixed it by installing the not yet final release of ONNX runtime. It uses current CUDA versions, therefore enabling your GPU: #329 (comment) |
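Following that suggestion, the swap could look roughly like this; treat it as a sketch, since the exact pre-release package and version to install are described in the linked comment in #329:

```shell
# Remove any stable runtime first so the pre-release build takes effect
pip uninstall -y onnxruntime onnxruntime-gpu

# Install a pre-release onnxruntime-gpu build (supports newer CUDA versions)
pip install --pre onnxruntime-gpu
```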
Describe the bug
Roop is detecting the GPU, but not using it.
Details
What OS are you using?
Are you using a GPU?
Which version of roop unleashed are you using?
Jan 20, 2024 git pull
Screenshots