Could not build solution #19890
Comments
Please build the RelWithDebInfo or Release flavor.
@yf711 can you check if the Debug build has new issues?
Thanks for your help @jywu-msft! I use the NuGet package to load onnxruntime-gpu version 1.16.3, and I can see the folder like: My question is:
@hy846130226 You can build ORT with --build_shared_lib to generate onnxruntime.dll/lib. Actually, I tried your command to build ORT in debug mode and it passed on my side.
Hi @yf711, I do not understand "You can build ORT with --build_shared_lib to generate onnxruntime.dll/lib" — could you please give more details on what I should do? .\build.bat --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" --use_tensorrt --tensorrt_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\TensorRT-8.6.1.6" --cmake_generator "Visual Studio 17 2022" --build_shared_lib Is this what you mean?
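For reference, this is how the command above looks with the flag spelled `--build_shared_lib` (the thread's earlier reply names that flag) and with plain ASCII double quotes around every path. This is a sketch assuming the same CUDA/TensorRT install locations as in the original command; note that the curly quotes that were pasted around the TensorRT path would be passed to the script literally and break path resolution.

```shell
REM Sketch of the build invocation with --build_shared_lib added and
REM straight ASCII quotes around all paths (assumed install locations).
.\build.bat ^
  --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" ^
  --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" ^
  --use_tensorrt ^
  --tensorrt_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\TensorRT-8.6.1.6" ^
  --cmake_generator "Visual Studio 17 2022" ^
  --build_shared_lib
```

With `--build_shared_lib`, the generated solution includes a project that produces onnxruntime.dll and its import library, as the earlier comment describes.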
And my TensorRT version is 8.6.1.6; according to the official ONNX Runtime website, "The TensorRT execution provider for ONNX Runtime is built and tested with TensorRT 8.6." So I don't think I should use the --use_tensorrt_oss_parser option.
Hi @yf711, thanks for your help.
Describe the issue
I pulled the v1.16.3 code
and used the following command: .\build.bat --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" --use_tensorrt --tensorrt_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\TensorRT-8.6.1.6" --cmake_generator "Visual Studio 17 2022"
but in the generated solution I could NOT build the nvonnxparser_static project.
Urgency
No response
Target platform
C++
Build script
.\build.bat --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8" --use_tensorrt --tensorrt_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\TensorRT-8.6.1.6" --cmake_generator "Visual Studio 17 2022"
Error / output
C2664 "bool google::protobuf::TextFormat::Parse(google::protobuf::io::ZeroCopyInputStream *, google::protobuf::Message *)": cannot convert "onnx::ModelProto *" to "google::protobuf::Message *" nvonnxparser_static D:\onnxruntime\onnxruntime\build\Windows\Debug_deps\onnx_tensorrt-src\ModelImporter.cpp 358
Visual Studio Version
No response
GCC / Compiler Version
No response