PyPI 1.16.0 release requires specifying the execution provider during InferenceSession creation #17631
Comments
The providers parameter is required for onnxruntime-gpu 1.15.1 on Linux/Windows. Maybe this issue is for the CPU-only package.
Yes, this is for the CPU package, or at least that is what I think. I installed using:
Yes, this looks like a valid issue. The workaround is to explicitly supply the CPUExecutionProvider in the providers list. @RandySheriff - can you take a look?
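The workaround can be sketched as follows; the helper name is hypothetical (not part of onnxruntime), and `model.onnx` is a placeholder path:

```python
# Sketch of the workaround: always pass an explicit provider list so
# onnxruntime 1.16.0 never has to choose between AzureExecutionProvider
# and CPUExecutionProvider. The helper below is hypothetical.
def pinned_session_kwargs(providers=("CPUExecutionProvider",)):
    """Keyword arguments that pin the execution provider explicitly."""
    return {"providers": list(providers)}

# Intended usage (requires onnxruntime and a real model file):
#   import onnxruntime as ort
#   session = ort.InferenceSession("model.onnx", **pinned_session_kwargs())
```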
+1, also having this issue
Temporary fix due to the MS issue in the 1.16.0 ORT library (microsoft/onnxruntime#17631): default 'AzureExecutionProvider' instead of CPU. Awaiting the 1.16.1 patch.
There is currently a bug in `onnxruntime==1.16.0` (microsoft/onnxruntime#17631), so the installation instructions have been revised to reflect that. The usage of the "requirements.txt" file has also been clarified.
Description: Add `get_stt` timing metric for audio input. Issues: NeonGeckoCom/neon-minerva#3. Other notes: includes patch for microsoft/onnxruntime#17631; updates license tests for dependency with undefined MIT license. Co-authored-by: Daniel McKnight <[email protected]>
1.16.1 was released with the below fix commit.
Describe the issue
The latest 1.16.0 CPU release (aka `onnxruntime`) on PyPI appears to have `AzureExecutionProvider` as a default provider (introduced by #17025?). This means that the EP is now ambiguous with respect to the second default, `CPUExecutionProvider`, causing an exception. The text in the exception is misleading. I just checked, and as of 1.15.1 it was not necessary to specify an explicit provider through the Python interface. No warning is given in 1.15.1 about this change either.
This is a breaking change in the Python interface. Furthermore, the choice of making the `AzureExecutionProvider` EP one of the defaults seems strange given that it appears to be undocumented.

To reproduce
Create an inference session via the Python API (using a random model from the `onnx/onnx` repository):

Urgency
This is a breaking change with no prior warnings on possibly the most common way to initialize a session.
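To make the ambiguity concrete, here is a pure-Python illustration of the failure mode; this is an assumption about the shape of the check, not onnxruntime's actual code. With two default providers and no explicit `providers` argument, there is no single provider to pick:

```python
# Illustration (hypothetical, mirroring but not copying onnxruntime's
# logic): in 1.16.0 the default provider list has two entries, so an
# InferenceSession created without `providers` has no unambiguous choice.
DEFAULT_PROVIDERS_1_16_0 = ["AzureExecutionProvider", "CPUExecutionProvider"]

def resolve_providers(requested, available=DEFAULT_PROVIDERS_1_16_0):
    """Return the provider list to use, or raise if the choice is ambiguous."""
    if requested:
        return list(requested)
    if len(available) == 1:
        # Only one candidate (the 1.15.1 situation): safe to pick it.
        return list(available)
    raise ValueError(
        "providers must be given explicitly when multiple execution "
        f"providers are available: {available}"
    )
```

Passing `["CPUExecutionProvider"]` explicitly resolves the ambiguity, which matches the workaround suggested in the comments above.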
Platform
Mac
OS Version
12.3.1
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.16.0
ONNX Runtime API
Python
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response
Edit: Clarified that this is about the CPU-release