
[torchlib] Fix implementation for clamp_max / clamp_min #1765

Merged
5 commits merged into main from justinchu/registration-clamp
Jul 30, 2024

Conversation

justinchuby
Collaborator

@justinchuby justinchuby commented Jul 30, 2024

Update clamp_max and clamp_min: remove support for size-0 inputs to simplify the implementations, and fix the registrations so the operators are discoverable.
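For context, `aten::clamp_min` and `aten::clamp_max` are element-wise bounds: clamping to a minimum is an element-wise maximum against the bound, and clamping to a maximum is an element-wise minimum. A minimal pure-Python sketch of those semantics (an illustration of the operator contract, not the actual onnxscript implementation, which maps onto ONNX ops):

```python
def clamp_min(x, bound):
    """Clamp every element of x to be at least `bound`.

    Semantically an element-wise max(x, bound), which is why the ONNX
    mapping is a Max-style op rather than anything more elaborate.
    """
    return [max(v, bound) for v in x]


def clamp_max(x, bound):
    """Clamp every element of x to be at most `bound`.

    Semantically an element-wise min(x, bound).
    """
    return [min(v, bound) for v in x]


print(clamp_min([-2, 0, 3], 0))  # [0, 0, 3]
print(clamp_max([-2, 0, 3], 0))  # [-2, 0, 0]
```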

@justinchuby justinchuby added the topic: torch_lib Related to the torch/aten function lib in development label Jul 30, 2024

codecov bot commented Jul 30, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 75.06%. Comparing base (19f1126) to head (dd574ae).

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1765      +/-   ##
==========================================
+ Coverage   75.01%   75.06%   +0.05%     
==========================================
  Files         245      245              
  Lines       26451    26443       -8     
  Branches     4826     4824       -2     
==========================================
+ Hits        19841    19850       +9     
+ Misses       5677     5662      -15     
+ Partials      933      931       -2     


@justinchuby justinchuby changed the title [torchlib] Fix more registrations [torchlib] Fix implementation for clamp_max / clamp_min Jul 30, 2024
Contributor

@titaiwangms titaiwangms left a comment


Do we not need to support size 0 input? I feel like it's in the test for a reason?

@justinchuby
Collaborator Author

Do we not need to support size 0 input? I feel like it's in the test for a reason?

There are tests because the PyTorch operator supports that, but in practice I don't see it being meaningful, because it's just doing an expand. We can come back to it when it becomes an issue.
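The "just doing an expand" remark refers to broadcasting: when `self` is a 0-d (scalar) tensor and the bound is a larger tensor, the result is equivalent to expanding the scalar to the bound's shape and then clamping element-wise. A pure-Python sketch of that behavior under this reading (an assumption about the semantics being discussed, not code from the PR):

```python
def clamp_min_broadcast(self_scalar, bound):
    """clamp_min where `self` is a 0-d scalar and `bound` is a 1-D tensor.

    PyTorch broadcasts the scalar against the bound tensor, which is
    equivalent to expanding the scalar to the bound's shape (the "expand"
    step) and then taking an element-wise max.
    """
    expanded = [self_scalar] * len(bound)  # the "expand" step
    return [max(s, b) for s, b in zip(expanded, bound)]


print(clamp_min_broadcast(1.0, [0.0, 2.0, 3.0]))  # [1.0, 2.0, 3.0]
```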


Test Results

24 files ±0 · 24 suites ±0 · 3h 21m 28s ⏱️ +2m 9s
 13 384 tests −2:   11 809 passed ✅ −9,     1 545 skipped 💤 +7,   30 failed ❌ ±0
477 544 runs +336: 100 130 passed ✅ +168, 377 146 skipped 💤 +168, 268 failed ❌ ±0


Results for commit dd574ae. ± Comparison against base commit 19f1126.

This pull request removes 590 tests and adds 588. Note that renamed tests count towards both.
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_062_aten_clamp
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_063_aten_clamp_max
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_064_aten_clamp_min
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_065_aten_clone
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_066_aten_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_068_aten_cat
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_069_aten_conj
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_070_aten_conj_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_071_aten_constant_pad_nd
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_072_aten_cos
…
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_062_aten_clamp_max
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_063_aten_clamp_min
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_064_aten_clone
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_065_aten_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_066_aten_cat
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_068_aten_conj
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_069_aten_conj_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_070_aten_constant_pad_nd
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_071_aten_cos
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_072_aten_cosh
…
This pull request removes 134 skipped tests and adds 133 skipped tests. Note that renamed tests count towards both.
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_062_aten_clamp
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_065_aten_clone
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_066_aten_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_068_aten_cat
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_069_aten_conj
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_070_aten_conj_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_076_aten_diagonal
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_077_aten_diagonal_bool
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_081_aten_div_mode
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_082_aten_div_mode_int
…
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_064_aten_clone
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_065_aten_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_066_aten_cat
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_068_aten_conj
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_069_aten_conj_complex
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_075_aten_diagonal
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_076_aten_diagonal_bool
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_080_aten_div_mode
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_081_aten_div_mode_int
tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_script_function_passes_checker_083_aten_empty
…
This pull request skips 8 tests.
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU ‑ test_output_match_opinfo__clamp_min_cpu_float16
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU ‑ test_output_match_opinfo__clamp_min_cpu_float32
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU ‑ test_output_match_opinfo__clamp_min_cpu_int32
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU ‑ test_output_match_opinfo__clamp_min_cpu_int64
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU ‑ test_output_match_opinfo__clamp_min_cpu_float16
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU ‑ test_output_match_opinfo__clamp_min_cpu_float32
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU ‑ test_output_match_opinfo__clamp_min_cpu_int32
tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU ‑ test_output_match_opinfo__clamp_min_cpu_int64

@justinchuby justinchuby merged commit a72f048 into main Jul 30, 2024
30 of 43 checks passed
@justinchuby justinchuby deleted the justinchu/registration-clamp branch July 30, 2024 18:27