Describe the issue
The documentation (https://github.com/microsoft/onnxruntime/blob/main/docs/ContribOperators.md) says ORT supports the MoE contrib operator. I installed onnxruntime-gpu 1.16.3 via pip, but when I call InferenceSession I get: ORT error: com.microsoft:MoE(-1) is not a registered function/op. Doesn't ORT 1.16.3 support MoE yet?
To reproduce
None
Urgency
No response
Platform
Linux
OS Version
CentOS Linux release 7.9.2009
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.16.3
ONNX Runtime API
Python
Architecture
X64
Execution Provider
CUDA
Execution Provider Library Version
cuda 11.4
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.