
Add op (batch_norm) | feat(torchlib) #1761

Closed

Conversation

titaiwangms (Contributor)

No description provided.

@titaiwangms titaiwangms added the topic: torch_lib Related to the torch/aten function lib in development label Jul 26, 2024
@titaiwangms titaiwangms requested a review from justinchuby July 26, 2024 22:47

codecov bot commented Jul 26, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 75.03%. Comparing base (19f1126) to head (86e57ef).

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1761      +/-   ##
==========================================
+ Coverage   75.01%   75.03%   +0.02%     
==========================================
  Files         245      245              
  Lines       26451    26456       +5     
  Branches     4826     4828       +2     
==========================================
+ Hits        19841    19851      +10     
+ Misses       5677     5673       -4     
+ Partials      933      932       -1     


@titaiwangms (Contributor, Author)

torch.export.export does not use aten::batch_norm; it uses aten::native_batch_norm_legit instead (pytorch/pytorch#88697). We can revisit this when we actually run into a missing batch_norm op.
