Using ONNX runtime with Execution Providers in Java #22996
Labels
api:Java (issues related to the Java API)
ep:OpenVINO (issues related to the OpenVINO execution provider)
ep:Xnnpack (issues related to the XNNPACK EP)
Describe the issue
This issue was reported by Kevin from Earnix. Here are the details from Kevin:
We are successfully using the ONNX Runtime in our system (currently version 1.15.1).
To reference the ONNX Runtime, we are using Maven and the jar file it provides.
I am currently investigating ways to improve the performance of our ONNX models. We are looking at several options.
One of them is the use of Execution Providers and how they might affect performance.
We may use Intel DNNL, Intel OpenVINO and/or XNNPACK.
In Java, I added the following code:
OrtSession.SessionOptions sessionOptions = new OrtSession.SessionOptions();
sessionOptions.addXnnpack(Map.of("intra_op_num_threads","1"));
I received the following exception:
ai.onnxruntime.OrtException: Error code - ORT_INVALID_ARGUMENT - message: XNNPACK execution provider is not supported in this build.
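One way to avoid this exception is to ask the runtime which execution providers were compiled into the loaded native library before trying to enable one. A minimal sketch, assuming the onnxruntime jar (1.15.1 or later) is on the classpath; the class name `ProviderCheck` is illustrative:

```java
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtProvider;
import ai.onnxruntime.OrtSession;
import java.util.EnumSet;
import java.util.Map;

public class ProviderCheck {
    public static void main(String[] args) throws Exception {
        // Lists the execution providers compiled into this native library.
        EnumSet<OrtProvider> available = OrtEnvironment.getAvailableProviders();
        System.out.println("Available providers: " + available);

        try (OrtSession.SessionOptions sessionOptions = new OrtSession.SessionOptions()) {
            // Only enable XNNPACK when this build actually contains it,
            // so addXnnpack does not throw ORT_INVALID_ARGUMENT.
            if (available.contains(OrtProvider.XNNPACK)) {
                sessionOptions.addXnnpack(Map.of("intra_op_num_threads", "1"));
            }
        }
    }
}
```

With the stock Maven artifact, the printed set will typically contain only `CPU_EXECUTION_PROVIDER` (and, for the GPU artifact, CUDA/TensorRT), which is consistent with the exception above.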
I understood from this error message, and from some research online, that I need a build of ONNX Runtime that includes these execution providers.
The idea was to find an existing build that includes these execution providers by default, but I don't see one.
Next I tried to build the library myself, following parts of the guides below:
https://onnxruntime.ai/docs/build/eps.html#execution-provider-shared-libraries
https://onnxruntime.ai/docs/build/inferencing.html
I was able to include DNNL and XNNPACK using this command:
python ..\onnxruntime\tools\ci_build\build.py --config RelWithDebInfo --cmake_generator "Visual Studio 17 2022" --build_dir . --build_java --use_dnnl --use_xnnpack
But the jar only contained the native files for Windows, not for Linux (or macOS). The jar file from Maven contains native files for multiple operating systems.
How can I build a jar that contains all the native files and the three execution providers I require?
I also see failed unit tests, but they appear to be tolerance-related:
[----------] Global test environment tear-down
[==========] 4603 tests from 285 test suites ran. (82448 ms total)
[ PASSED ] 4597 tests.
[ FAILED ] 6 tests, listed below:
[ FAILED ] QAttentionTest.QAttentionDNNLBatch1
[ FAILED ] QAttentionTest.QAttentionDNNLBatch2
[ FAILED ] QAttentionTest.QAttentionDNNLMaskPartialSequence
[ FAILED ] QAttentionTest.QAttentionNoMaskIndex
[ FAILED ] QAttentionTest.QAttentionPrunedModel
[ FAILED ] ActivationOpTest.LeakyRelu_bfloat16
Kind regards,
Kevin Brenkel
To reproduce
See the Java code, build command, and test output in the description above.
Urgency
No response
Platform
Windows
OS Version
Windows 11
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.15.1
ONNX Runtime API
Python
Architecture
X64
Execution Provider
OpenVINO, Other / Unknown
Execution Provider Library Version
No response