feat(ml): optim weight decay parameter control #555

Merged: 1 commit merged into master on Oct 4, 2023

Conversation

@beniz (Contributor) commented on Oct 2, 2023

No description provided.

@beniz self-assigned this on Oct 2, 2023
@beniz force-pushed the feat_optim_weight_decay branch 3 times, most recently from d0de2f9 to 35ef0fa on October 4, 2023 at 10:05
train.py Outdated
print("Using ", opt.train_optim, " as optimizer")
if opt.train_optim == "adam":
return torch.optim.Adam(params, lr, betas)
return torch.optim.Adam(params, lr, betas, weight_decay)
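
For context, torch.optim.Adam accepts weight_decay as a standard keyword argument (defaulting to 0). Below is a minimal sketch of how such an option could be wired from command-line options into the optimizer; the option names (train_lr, train_betas, train_optim_weight_decay) are assumptions for illustration, not the exact flags added by this PR.

```python
import torch

def get_optimizer(params, opt):
    # Option names on `opt` below are hypothetical placeholders,
    # not the PR's actual flags.
    print("Using", opt.train_optim, "as optimizer")
    if opt.train_optim == "adam":
        # weight_decay defaults to 0 in torch.optim.Adam, so forwarding a
        # user-supplied value gives explicit control over the decay strength.
        return torch.optim.Adam(
            params,
            lr=opt.train_lr,
            betas=opt.train_betas,
            weight_decay=opt.train_optim_weight_decay,
        )
    raise ValueError("unsupported optimizer: " + opt.train_optim)
```

Note that Adam's weight_decay applies an L2 penalty coupled to the adaptive update; torch.optim.AdamW implements decoupled weight decay if that behavior is preferred.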
@beniz merged commit 34fb2dd into master on Oct 4, 2023
1 check passed
2 participants