[Training] Support for RKNPU Execution Provider on RK3562 Platform and On-Device Training Capabilities #21060
Labels
- ep:RockchipNPU: issues related to Rockchip execution provider
- stale: issues that have not been addressed in a while; categorized by a bot
- training: issues related to ONNX Runtime training; typically submitted using template
Describe the issue
Hi,
I have been reading the ONNX Runtime documentation and came across the RKNPU Execution Provider (EP). I would like to clarify my understanding:
The RKNPU EP appears to support only the RK1808 Linux platform at the moment. Does this mean I cannot use the RKNPU EP on my RK3562 platform?
If that is the case, can I still run ONNX model inference on the RK3562 using the CPU execution provider?
To further clarify my requirements:
- I intend to use ONNX Runtime with the RKNPU EP for model inference on the RK3562 platform.
- I also plan to use ONNX Runtime's On-Device Training feature to train models on the RK3562, and then use the RKNPU EP to run inference with the trained models.
Does the current version of ONNX Runtime support these requirements? Thank you for clarifying.
To reproduce
No
Urgency
No response
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.18
PyTorch Version
1.7
Execution Provider
Other / Unknown
Execution Provider Library Version
RKNN EP