
[TensorRT ExecutionProvider] Cannot infer the model on a GPU device with an ID other than 0 #5039

Workflow file for this run

name: "Issue Labeler"
on:
issues:
types: [opened, edited]
permissions:
issues: write
jobs:
triage:
runs-on: ubuntu-latest
steps:
- uses: github/[email protected]
with:
repo-token: "${{ secrets.GITHUB_TOKEN }}"
configuration-path: .github/labeler.yml
not-before: 2020-01-15T02:54:32Z
enable-versioned-regex: 0
include-title: 1
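
The configuration-path above points at .github/labeler.yml, which is not shown in this run. A minimal sketch of what such a file could look like, assuming a hypothetical ep:TensorRT label and title regex (with enable-versioned-regex: 0, each top-level key is a label name mapped to a list of regexes, and include-title: 1 makes the issue title part of the matched text):

# .github/labeler.yml -- hypothetical sketch, not the repository's actual file
ep:TensorRT:
  - '\[TensorRT ExecutionProvider\]'

With a rule like this, an issue titled "[TensorRT ExecutionProvider] Cannot infer the model on a GPU device with an ID other than 0" would be labeled ep:TensorRT when it is opened or edited.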