Hi, can we not use onnxruntime-gpu for converting a model to ONNX?
On CPU the conversion takes about 30 minutes and 12 GB of RAM per model.
I tried switching from CPUExecutionProvider to CUDAExecutionProvider after installing onnxruntime-gpu, but everything still loaded on the CPU.
I am using Colab.
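
Roughly what I tried, as a minimal sketch (assuming a plain onnxruntime InferenceSession; "model.onnx" is just a placeholder path, the actual conversion script may wrap this differently):

```python
import onnxruntime as ort

# Check whether the installed build actually exposes CUDA.
print(ort.get_available_providers())  # expecting 'CUDAExecutionProvider' here

# Request the CUDA provider first, falling back to CPU.
session = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# Shows which provider the session actually selected.
print(session.get_providers())
```

Even with this, everything runs on the CPU on Colab.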