[VitisAI] 1. API compatible 2. dynamic load onnx #18470
Conversation
@jywu-msft Hi, we have made some major changes to Vitis AI. Could you take a look when you are available?
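For readers skimming the thread, "dynamic load onnx" in the title refers to loading a shared library at runtime instead of linking it at build time. The sketch below is only a generic, hypothetical illustration of that pattern using dlopen/dlsym; the library name and entry point are placeholders and this is not the actual code in this PR.

```cpp
// Hypothetical illustration only: generic runtime loading of a shared library
// with dlopen/dlsym. Library and symbol names are placeholders, not this PR's code.
#include <dlfcn.h>

#include <iostream>

int main() {
  // Open the shared object at runtime instead of linking against it at build time.
  void* handle = dlopen("libexample_onnx.so", RTLD_NOW | RTLD_LOCAL);
  if (handle == nullptr) {
    std::cerr << "dlopen failed: " << dlerror() << "\n";
    return 1;
  }

  // Resolve an entry point by name; the signature here is a placeholder.
  using init_fn_t = int (*)();
  auto init = reinterpret_cast<init_fn_t>(dlsym(handle, "example_initialize"));
  if (init == nullptr) {
    std::cerr << "dlsym failed: " << dlerror() << "\n";
    dlclose(handle);
    return 1;
  }

  const int rc = init();  // Call into the dynamically loaded library.
  std::cout << "example_initialize returned " << rc << "\n";

  dlclose(handle);
  return rc;
}
```

On Windows the analogous calls are LoadLibraryA and GetProcAddress. Again, this is only a sketch of the general pattern, not the loader added in this PR.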
/azp run Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,ONNX Runtime Web CI Pipeline
/azp run Windows ARM64 QNN CI Pipeline,Windows CPU CI Pipeline,Windows GPU CI Pipeline,Windows GPU TensorRT CI Pipeline,onnxruntime-binary-size-checks-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed
Azure Pipelines successfully started running 8 pipeline(s).
@jywu-msft Just a question: I found that cpplint failed. Is there a 120-character line limit for onnxruntime now? I checked .clang-format and there isn't any limit set there.
Yes, that's always been there. See line 6 in 3f0ebd6 and https://github.com/microsoft/onnxruntime/blob/main/docs/Coding_Conventions_and_Standards.md
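For anyone who wants to reproduce the check locally before pushing, a rough way to do it with the standalone cpplint tool is shown below. This is only an assumption about a convenient local workflow; the file path is a placeholder, and onnxruntime's own lint wiring in CI may invoke cpplint differently, so treat the repo's coding-conventions doc as the authority.

```bash
# Hypothetical local check (not the exact CI invocation):
# install the standalone cpplint tool and enforce the 120-character limit.
pip install cpplint
cpplint --linelength=120 onnxruntime/core/providers/vitisai/some_file.cc
```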
Hi, I have linted my code. Could you restart the pipeline?
/azp run Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,ONNX Runtime Web CI Pipeline |
/azp run Windows ARM64 QNN CI Pipeline,Windows CPU CI Pipeline,Windows GPU CI Pipeline,Windows GPU TensorRT CI Pipeline,onnxruntime-binary-size-checks-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed |
Azure Pipelines successfully started running 8 pipeline(s).
@jywu-msft Sorry, the last lint wasn't thorough. Could you start the pipeline again?
@jywu-msft Hi, could you restart the pipeline?
/azp run Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,ONNX Runtime Web CI Pipeline |
/azp run Windows ARM64 QNN CI Pipeline,Windows CPU CI Pipeline,Windows GPU CI Pipeline,Windows GPU TensorRT CI Pipeline,onnxruntime-binary-size-checks-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed |
Azure Pipelines successfully started running 8 pipeline(s).
@jywu-msft Should the failed CI concern me? The failures look unrelated to my PR.
/azp run Windows x64 QNN CI Pipeline
Azure Pipelines successfully started running 1 pipeline(s).
Looks unrelated to your PR; there are some transient failures. I am resubmitting the failed jobs.
I found that the remaining errors also exist in other pull requests, so feel free to review the code now.
@jywu-msft Can this PR be merged? |