[Feature Request] Mark as negative tests for minimal CUDA build #21394
Comments
Typically if you're doing a build with reduced operators, the simplest thing to do is use a test filter. I expect the amount of time it would take to track and maintain lists of tests that are expected to pass/fail would far outweigh any benefit. For reference, there are currently over 4,000 tests in onnxruntime_test_all.
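For illustration, gtest supports excluding tests by name pattern via the --gtest_filter command-line flag. A minimal, self-contained sketch (the test name and filter pattern are hypothetical, not taken from onnxruntime):

```cpp
#include <gtest/gtest.h>

// Hypothetical test that exercises a CUDA kernel and would fail in a
// -Donnxruntime_CUDA_MINIMAL build (the name is illustrative only).
TEST(CudaKernelTests, ReluFp32) {
  SUCCEED();
}

int main(int argc, char** argv) {
  // Tests expected to fail can simply be excluded when the binary is run,
  // e.g.:  ./onnxruntime_test_all --gtest_filter='-CudaKernelTests.*'
  // (a leading '-' in the pattern means "run everything except matches").
  ::testing::InitGoogleTest(&argc, argv);
  return RUN_ALL_TESTS();
}
```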
For context, we're trying to set up a pipeline for building/testing onnxruntime, and it could be confusing if we have to check the logs to see whether all the failed tests are expected ones.
Yes it does. The majority of tests in onnxruntime_test_all loop through all the execution providers that are enabled in the build as they test the individual ONNX operators with different input values and different opsets. There are also other things like tests for the optimizers or regression tests that use models from the testdata directory.
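As a rough, self-contained illustration of that pattern (this is not onnxruntime's actual OpTester code, just a sketch of how a single operator test can fan out across whichever execution providers the build enables):

```cpp
#include <gtest/gtest.h>
#include <string>
#include <vector>

// Hypothetical stand-in for running one operator test on one execution
// provider; in onnxruntime this is roughly what the per-EP test run does.
static bool RunOpOnProvider(const std::string& provider) {
  return !provider.empty();  // placeholder "kernel ran and matched" check
}

TEST(OperatorTests, AddBasic) {
  std::vector<std::string> providers = {"CPUExecutionProvider"};
#ifdef USE_CUDA
  providers.push_back("CudaExecutionProvider");
#endif
  // The same inputs/expected outputs are checked on every enabled EP,
  // which is why a CUDA-minimal build (no CUDA kernels) makes the CUDA
  // leg of these tests fail.
  for (const auto& provider : providers) {
    EXPECT_TRUE(RunOpOnProvider(provider)) << "failed on " << provider;
  }
}
```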
Is it going to be non-trivial to do something like that?
You could certainly try adding an ifdef around this line:
Given we do a similar thing for the NHWC CUDA ops, it doesn't seem unreasonable to use the same approach for a CUDA minimal build.
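A sketch of what such an ifdef could look like (the helper function, variable names, and non-CUDA macros here are illustrative, not the exact line referenced above):

```cpp
#include <string>
#include <vector>

// Hypothetical helper that builds the list of EPs the op tests run against.
std::vector<std::string> GetEnabledProvidersForTests() {
  std::vector<std::string> providers = {"CPUExecutionProvider"};
#if defined(USE_CUDA) && !defined(USE_CUDA_MINIMAL)
  // Only test CUDA kernels when the full CUDA EP is built; a
  // -Donnxruntime_CUDA_MINIMAL build has no kernels to run.
  providers.push_back("CudaExecutionProvider");
#endif
  return providers;
}
```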
Describe the feature request
Mark tests that are expected to fail in the minimal CUDA build.
Describe scenario use case
Some tests are expected to fail when using the minimal CUDA EP, i.e. compiling with -Donnxruntime_CUDA_MINIMAL. Since the resulting CUDA "EP" is more of a utility library for memory allocations etc., ops are not expected to run directly with the minimal CUDA EP. Should we mark those tests as negative tests if USE_CUDA_MINIMAL is defined? Happy to contribute. Thanks!
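As a sketch of one way such marking could look, a test could skip (or invert its expectation) when USE_CUDA_MINIMAL is defined, using gtest's GTEST_SKIP; the test name and body here are hypothetical:

```cpp
#include <gtest/gtest.h>

TEST(CudaOpTests, GemmFp16) {
#ifdef USE_CUDA_MINIMAL
  // The minimal CUDA EP only provides utilities such as allocators, not
  // kernels, so running an op through it is expected to fail. Skipping
  // keeps the overall test run green instead of reporting a confusing
  // failure; alternatively the test could assert that the run fails.
  GTEST_SKIP() << "CUDA kernels are excluded in a USE_CUDA_MINIMAL build";
#endif
  // ... full test body that actually runs the Gemm kernel on CUDA ...
  SUCCEED();
}
```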