fix: Deprecation Warnings in AutoCast API #113
Conversation
Signed-off-by: Abhishek <[email protected]>
@Abhishek-TAMU it looks good in general. I think it needs a lint pass. It would also be good to run a local bench test to see if there is any regression in performance (which I highly doubt).
If you want to make it faster you can just comment out the other models and leave this line.
Thank you @fabianlim for the guidance. I fixed the fmt and lint issues. I also ran the local bench test.
It should not be the case because it should come in with
Also, I commented out this line and the next line and just left the model.
@Abhishek-TAMU you need to inspect the benchmark results.
Thanks for the input, Fabian. Sharing the benchmark results.
@Abhishek-TAMU looking at your benches, the train loss seems to be a bit higher; can you take a quick look?
@fabianlim Running the benchmark from the code on the main branch (without my changes) gives this train loss plot (the other plots are the same). Do you think this is a significant change in train loss compared to the train loss with my code changes?
@Abhishek-TAMU I see, OK then it's due to run-to-run variation. I approve. Also, did you verify that the warning messages went away?
LGTM and there are no regressions
Yes, I checked it by running the training again, and the deprecation warnings no longer appear.
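For anyone wanting to reproduce the check, a minimal sketch is below. It escalates `FutureWarning` to an error while importing the module that defines the decorated autograd function; the old `torch.cuda.amp.custom_fwd`/`custom_bwd` decorators warn at decoration (import) time in recent PyTorch, so a leftover usage would raise here. The module path `my_plugin.autocast_ops` is a placeholder, not the repo's actual module.

```python
import warnings

# Escalate FutureWarning to an error so any remaining use of the
# deprecated torch.cuda.amp.custom_fwd/custom_bwd decorators fails
# loudly at import time.
with warnings.catch_warnings():
    warnings.simplefilter("error", FutureWarning)
    # Placeholder module path; substitute the module that defines
    # the decorated autograd function.
    import my_plugin.autocast_ops  # noqa: F401

print("no autocast deprecation warnings at import")
```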
This PR fixes issue: #107

Changes:
- Replaced the deprecated `torch.cuda.amp.custom_fwd`/`torch.cuda.amp.custom_bwd` autocast decorators on the autograd function with `@torch.amp.custom_fwd` and `@torch.amp.custom_bwd`. A sketch of the migration pattern follows.
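For reference, a minimal sketch of the migration pattern on a custom autograd function; the class name and body are illustrative, not the repo's actual code. The `torch.amp` variants take an explicit `device_type` argument (available in PyTorch 2.4+):

```python
import torch

class ScaledMatmul(torch.autograd.Function):
    @staticmethod
    @torch.amp.custom_fwd(device_type="cuda")  # was: @torch.cuda.amp.custom_fwd
    def forward(ctx, a, b):
        ctx.save_for_backward(a, b)
        return a @ b

    @staticmethod
    @torch.amp.custom_bwd(device_type="cuda")  # was: @torch.cuda.amp.custom_bwd
    def backward(ctx, grad_out):
        a, b = ctx.saved_tensors
        return grad_out @ b.transpose(-1, -2), a.transpose(-1, -2) @ grad_out
```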