
nodejs + tensorrt :: how to set execution provider options? #18380

Open
ogolovanov opened this issue Nov 9, 2023 · 2 comments
Labels
api:Java (issues related to the Java API), api:Javascript (issues related to the Javascript API), ep:TensorRT (issues related to TensorRT execution provider), stale (issues that have not been addressed in a while; categorized by a bot)

Comments


ogolovanov commented Nov 9, 2023

Describe the issue

Hi.

I am unable to find any examples on the web of how to set provider options for TensorRT via Node.js.
At the same time, there are examples for C++/Python/Java:
https://onnxruntime.ai/docs/execution-providers/TensorRT-ExecutionProvider.html#samples

Setting provider options via environment variables also does not work:

export ORT_TENSORRT_ENGINE_CACHE_ENABLE=1
export ORT_TENSORRT_CACHE_PATH=/tmp/trt_cache/
node test.js

test.js creates the session like this:
const model = await ort.InferenceSession.create(modelBuffer, {"executionProviders": ["tensorrt"], "logSeverityLevel": 0})
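For contrast, the JavaScript API's `executionProviders` entries can be either plain strings (as above) or objects with a `name` field. Whether the Node.js binding forwards TensorRT-specific options passed this way is exactly what this issue is asking; the camelCased option fields below are hypothetical, loosely mirroring the documented C/Python `trt_*` option names, not a confirmed API:

```javascript
// Sketch only: the `name` field is part of the documented executionProviders
// shape, but the TensorRT-specific fields below are assumptions modeled on
// the C/Python provider options (trt_engine_cache_enable, trt_engine_cache_path).
const sessionOptions = {
  executionProviders: [
    {
      name: 'tensorrt',
      trtEngineCacheEnable: true,        // hypothetical; mirrors trt_engine_cache_enable
      trtEngineCachePath: '/tmp/trt_cache/', // hypothetical; mirrors trt_engine_cache_path
    },
  ],
  logSeverityLevel: 0,
};

// With onnxruntime-node installed, the session would then be created as:
// const model = await ort.InferenceSession.create(modelBuffer, sessionOptions);
console.log(sessionOptions.executionProviders[0].name);
```

If the binding silently drops the extra fields (as the environment variables appear to be dropped), the session would still run on TensorRT defaults with no cache written, which matches the behavior described above.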

Everything works OK, but nothing appears in /tmp/trt_cache/.
It looks like the Node.js binding simply ignores any TensorRT-related environment variables.

The same happens with onnxruntime_perf_test (again, nothing appears in /tmp/trt_cache/):

export ORT_TENSORRT_ENGINE_CACHE_ENABLE=1
export ORT_TENSORRT_CACHE_PATH=/tmp/trt_cache/
./onnxruntime_perf_test -r 1 -e tensorrt -i "trt_fp16_enable|true" /root/www/model.trt.onnx

Am I missing something, or is it really not possible to set TensorRT provider options via Node.js?
Why are the ORT_TENSORRT_* environment variables ignored?

Thanks.

To reproduce


Urgency

not urgent

Platform

Linux

OS Version

Ubuntu 22.04

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.17.0

ONNX Runtime API

JavaScript

Architecture

X64

Execution Provider

TensorRT

Execution Provider Library Version

TensorRT 8.6.1.6-1+cuda12.0

@github-actions bot added the api:Java, api:Javascript, and ep:TensorRT labels on Nov 9, 2023

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions bot added the stale label on Dec 10, 2023

fr-an-k commented Dec 4, 2024

Is TensorRT supported at all in the Node.js backend? It would be good to add a TensorRT row to the table of supported backends in the documentation, even if only to say it is not supported.

Projects: None yet
Development: No branches or pull requests
2 participants