
Roop not using gcolab GPUs #400

Closed
cubargh opened this issue Jan 14, 2024 · 17 comments
cubargh commented Jan 14, 2024

Describe the bug
I have Colab Premium. I copied the notebook and ran it on a V100 runtime: no GPU utilization. Tried an A100, same result.

Help?

I got this error in the console:
[E:onnxruntime:Default, provider_bridge_ort.cc:1480 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1193 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcufft.so.10: cannot open shared object file: No such file or directory
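The failing symbol here is libcufft.so.10, a CUDA 11 runtime library. As a quick diagnostic (a sketch, not part of roop; the helper name is illustrative), you can check whether the dynamic loader can find it:

```python
import ctypes

def has_shared_lib(name):
    """Return True if the dynamic loader can open the given shared library."""
    try:
        ctypes.CDLL(name)
        return True
    except OSError:
        return False

# onnxruntime's CUDA provider dlopens libcufft.so.10 (shipped with CUDA 11);
# if the machine only carries CUDA 12 libraries, this check comes back False.
print("libcufft.so.10 present:", has_shared_lib("libcufft.so.10"))
```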


cubargh commented Jan 15, 2024

@C0untFloyd got any idea why this is happening?


marat2509 commented Jan 17, 2024

Google Colab has updated its CUDA version to 12. ONNXRuntime does not support this version yet, but support is coming soon.
See microsoft/onnxruntime#18850
Also see googlecolab/colabtools#4214
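One way to confirm whether the installed ONNXRuntime build can actually see the GPU is to list its execution providers. A minimal sketch (it degrades gracefully when onnxruntime isn't installed at all):

```python
import importlib.util

def cuda_provider_available():
    """True if onnxruntime is installed and reports the CUDA execution provider."""
    if importlib.util.find_spec("onnxruntime") is None:
        return False
    import onnxruntime as ort
    return "CUDAExecutionProvider" in ort.get_available_providers()

print("CUDA provider available:", cuda_provider_available())
```

If this prints False on a GPU runtime, the installed build cannot load its CUDA provider, which matches the libcufft error above.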

@cubargh cubargh changed the title Roop not using gcollab GPUs Roop not using gcolab GPUs Jan 17, 2024

cubargh commented Jan 17, 2024

For a temporary fix, go to the command palette (ctrl+shift+P) and select "Use fallback runtime version"

C0untFloyd (Owner) commented

I was having problems on my local PC as well: suddenly CUDA wasn't recognized, and "Azure" showed up among the execution providers instead. I went back to installing onnxruntime-gpu v1.16.2, which seems to have solved it. Weird...

This was referenced Jan 19, 2024

thany22 commented Feb 6, 2024

> For a temporary fix, go to the command palette (Ctrl+Shift+P) and select "Use fallback runtime version"

The previous runtime version isn't available anymore. Any solutions?

marat2509 commented

> The previous runtime version isn't available anymore. Any solutions?

Install the latest official onnxruntime-gpu.

Marv761125 commented

> Install the latest official onnxruntime-gpu.

Hi. Could you tell me how to do this, or is there a link you used? Thx.

marat2509 commented

> how to do this

pip install --upgrade onnxruntime-gpu
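After running the upgrade, you can confirm which package and version actually got installed (a sketch; the helper name is illustrative):

```python
import importlib.metadata as md

def installed_version(pkg):
    """Return the installed version string for pkg, or None if it is absent."""
    try:
        return md.version(pkg)
    except md.PackageNotFoundError:
        return None

# Both distributions install the same "onnxruntime" module, so check both
# names; a leftover CPU-only "onnxruntime" install can shadow the GPU build.
print("onnxruntime-gpu:", installed_version("onnxruntime-gpu"))
print("onnxruntime:", installed_version("onnxruntime"))
```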

Marv761125 commented

> pip install --upgrade onnxruntime-gpu

Hi Marat. This may sound pretty stupid, but where do I have to put this code line? I use the Roop-Unleashed Colab version. Thx.

htmlcodepreview commented

> pip install --upgrade onnxruntime-gpu

Tried this and several versions like 1.16.2, 1.17, and 1.17.1, with no success.


marat2509 commented Feb 12, 2024

Setup (after cloning):

import os

# Install dependencies
!pip install pyyaml -r requirements.txt

# If a GPU is attached (nvidia-smi present), replace any existing onnxruntime
# with the CUDA 12 build of onnxruntime-gpu from the onnxruntime-cuda-12 feed
if os.path.isfile("/usr/bin/nvidia-smi") or os.path.isfile("/opt/bin/nvidia-smi"):
    !pip uninstall -y onnxruntime onnxruntime-gpu
    !pip install onnxruntime-gpu==1.17.0 --extra-index-url https://pkgs.dev.azure.com/onnxruntime/onnxruntime/_packaging/onnxruntime-cuda-12/pypi/simple

Run:

import yaml
import os

roop_config = {
    'clear_output': True,
    'live_cam_start_active': False,
    'max_threads': 32,
    'output_image_format': 'png',
    'output_video_codec': 'libx264',
    'output_video_format': 'mp4',
    'video_quality': 14,
    'frame_buffer_size': 6,
    'selected_theme': 'Default',
    'server_name': '',
    'server_port': 0,
    'server_share': True
}

# Pick the execution provider using the same nvidia-smi check as in setup
if os.path.isfile("/usr/bin/nvidia-smi") or os.path.isfile("/opt/bin/nvidia-smi"):
    roop_config["provider"] = "cuda"
    print("I: Using CUDA")
else:
    roop_config["provider"] = "cpu"

# Write the settings to config.yaml before launching run.py
with open("./config.yaml", "w") as config_file:
    config_file.write(yaml.dump(roop_config))

!python run.py
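The GPU check in the cell above can be factored into a small helper if you want to see which provider the notebook will select before launching (a sketch using the same nvidia-smi paths; the function name is illustrative):

```python
import os

def detect_provider():
    """Mirror the notebook's check: 'cuda' if nvidia-smi is on disk, else 'cpu'."""
    has_gpu = (os.path.isfile("/usr/bin/nvidia-smi")
               or os.path.isfile("/opt/bin/nvidia-smi"))
    return "cuda" if has_gpu else "cpu"

print("Selected provider:", detect_provider())
```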

zachysaur commented

What about on Kaggle?


michidaeron commented Feb 17, 2024

> pip install --upgrade onnxruntime-gpu

Is there no way to avoid the message about disallowed code and the runtime being disconnected?
Using the fallback runtime version worked before, but not now :C

marat2509 commented

@michidaeron @htmlcodepreview @Marv761125 I updated and tested the new setup code. Please test.

htmlcodepreview commented

It works for me now.

The github-actions bot commented

This issue is stale because it has been open 30 days with no activity. Remove the Stale label or comment, or this will be closed in 5 days.

@github-actions github-actions bot added the Stale label Apr 10, 2024

This issue was closed because it has been stalled for 5 days with no activity.

8 participants