
Implement _log_softmax | feat(torchlib) #1079

Merged: 9 commits merged into main on Oct 3, 2023
Conversation

justinchuby
Collaborator

Fixes #1077

@justinchuby justinchuby added the topic: torch_lib Related to the torch/aten function lib in development label Oct 2, 2023
Contributor

@BowenBao BowenBao left a comment


Could you also add test coverage?

@justinchuby
Copy link
Collaborator Author

> Could you also add test coverage?

On it

@codecov

codecov bot commented Oct 2, 2023

Codecov Report

Merging #1079 (c33a9c0) into main (b958002) will decrease coverage by 0.02%.
The diff coverage is 69.56%.

@@            Coverage Diff             @@
##             main    #1079      +/-   ##
==========================================
- Coverage   77.92%   77.91%   -0.02%     
==========================================
  Files         115      115              
  Lines       14684    14706      +22     
  Branches     1558     1562       +4     
==========================================
+ Hits        11443    11458      +15     
- Misses       2872     2877       +5     
- Partials      369      371       +2     
Files Coverage Δ
onnxscript/function_libs/torch_lib/ops/special.py 61.94% <100.00%> (ø)
...ript/tests/function_libs/torch_lib/extra_opinfo.py 98.33% <100.00%> (+0.03%) ⬆️
...ipt/tests/function_libs/torch_lib/ops_test_data.py 96.18% <100.00%> (+0.01%) ⬆️
onnxscript/function_libs/torch_lib/ops/core.py 79.55% <50.00%> (-0.15%) ⬇️

@github-actions

github-actions bot commented Oct 2, 2023

Test Results

18 files ±0 · 18 suites ±0 · 1h 13m 42s ⏱️ −9m 42s
10 974 tests +12: 8 226 ✔️ +7, 2 729 💤 +6, 19 ±0
154 935 runs −15 560: 34 934 ✔️ −3 767, 118 546 💤 −11 792, 1 455 ±0

For more details on these failures, see this check.

Results for commit c33a9c0. ± Comparison against base commit b958002.

This pull request removes 519 and adds 531 tests. Note that renamed tests count towards both.
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_001_aten__softmax
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_002_aten__softmax_half
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_003_aten_all_dim
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_004_aten_allclose
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_005_aten_all
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_006_aten_abs
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_007_aten_abs_complex
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_008_aten_acos
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_009_aten_acosh
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_010_aten_add
…
onnxscript.function_libs.tools.torch_lib.deduce_type_constraints_test.TestDeduceTypeConstraints ‑ test_deduce_type_constraints_does_not_crash_for_onnx_function_aten__log_softmax
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_001_aten__log_softmax
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_002_aten__log_softmax_half
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_003_aten__softmax
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_004_aten__softmax_half
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_005_aten_all_dim
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_006_aten_allclose
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_007_aten_all
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_008_aten_abs
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_009_aten_abs_complex
…


@justinchuby
Collaborator Author

Added tests

@justinchuby justinchuby requested a review from BowenBao October 2, 2023 22:56

@torch_op("aten::_log_softmax")
def aten__log_softmax(
    self: TFloatHighPrecision, dim: int, half_to_float: bool  # pylint: disable=unused-argument
Contributor


nit: very minor, but should we do the below to be consistent (as seen in aten__softmax_half below)?

    del half_to_float  # Unused

Collaborator Author


This function is not trace-only, so we couldn't use del (it's not supported by onnxscript yet)
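For context on what the op being implemented computes: _log_softmax is log(softmax(x)) along a dimension. A minimal NumPy reference (not the PR's onnxscript code, just a hypothetical sketch) using the standard max-subtraction trick for numerical stability:

```python
import numpy as np

def log_softmax_ref(x: np.ndarray, dim: int) -> np.ndarray:
    # Subtract the max along `dim` so exp() never overflows,
    # then use log_softmax(x) = shifted - logsumexp(shifted).
    x_max = np.max(x, axis=dim, keepdims=True)
    shifted = x - x_max
    logsumexp = np.log(np.sum(np.exp(shifted), axis=dim, keepdims=True))
    return shifted - logsumexp

x = np.array([[1.0, 2.0, 3.0]])
out = log_softmax_ref(x, dim=1)
# exp(out) sums to 1 along dim, i.e. exp(out) is the softmax of x
```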

]

for (shape, dim), half_to_float in itertools.product(cases, (False,)):
# NOTE: softmax with half to float conversion is not supported on CPU
Contributor


Could we test bfloat16?

Collaborator Author


ORT doesn't run bfloat16 on CPU, so we can't test it here
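The case-generation loop quoted above pairs each (shape, dim) test case with the half_to_float values under test. A self-contained sketch of the pattern, with a hypothetical cases list standing in for the test file's real one:

```python
import itertools

# Hypothetical (shape, dim) pairs; the real `cases` list lives in extra_opinfo.py.
cases = [((3,), 0), ((2, 3), 1), ((2, 3, 4), 2)]

# Pair every case with each half_to_float flag. Only (False,) is used here,
# because half-to-float conversion is not supported on CPU.
combos = [
    (shape, dim, half_to_float)
    for (shape, dim), half_to_float in itertools.product(cases, (False,))
]
# Each case appears exactly once, always with half_to_float=False.
```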

@justinchuby justinchuby merged commit c74bc6a into main Oct 3, 2023
@justinchuby justinchuby deleted the justinchu/_log_softmax branch October 3, 2023 18:36
Labels
topic: torch_lib Related to the torch/aten function lib in development

Successfully merging this pull request may close these issues.

Add support to aten._log_softmax.default