Adds Eve Optimizer #475

Open · wants to merge 35 commits into main
Conversation

wglao commented Jan 24, 2023

The Eve optimizer was proposed by Koushik and Hayashi in 2016 (https://arxiv.org/abs/1611.01505). It is a simple framework applicable to adaptive gradient optimizers in general, though the paper applies it specifically to Adam, and that Adam-based algorithm is what is implemented in this fork. There is room for future improvement, however: Eve could instead be implemented as a wrapper that adds the global learning-rate scaling as a chainable scale_by method to any arbitrary optimizer.
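For context, the "global learning rate scaling" mentioned above can be sketched as follows. This is an illustrative, dependency-free reading of Eve's d_t recurrence from the paper (arXiv:1611.01505), not the code in this PR; the hyperparameter names `beta3` and `c` follow the paper.

```python
def eve_d(loss, prev_loss, d_prev, beta3=0.999, c=10.0):
    """Sketch of Eve's smoothed relative-change factor d_t.

    The effective step size becomes alpha / d_t, so the learning rate
    shrinks when the objective fluctuates strongly and recovers when
    progress is smooth. Illustrative only, not the PR's implementation.
    """
    if prev_loss is None:  # first step: no change signal yet
        return 1.0
    # Relative change of the objective, clipped to [1/c, c]
    r = abs(loss - prev_loss) / min(loss, prev_loss)
    r = min(max(r, 1.0 / c), c)
    # Exponential moving average, analogous to Adam's moment estimates
    return beta3 * d_prev + (1.0 - beta3) * r
```

A large jump in the loss is clipped at `c`, so a single outlier step can shrink the learning rate by at most a bounded amount.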

mtthss (Collaborator) commented Oct 10, 2023

Apologies for the long delay. Would you mind moving this to contrib/?
Also, would you consider implementing the wrapper version instead?
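The "wrapper version" being requested could look something like the following. This is a hypothetical sketch in the optax init/update transform style, written dependency-free for brevity (a plain dict stands in for a pytree of updates); the names `scale_by_eve` and `ScaleByEveState` are illustrative assumptions, not optax's actual API.

```python
from typing import NamedTuple, Optional

class ScaleByEveState(NamedTuple):
    d: float                    # smoothed relative-change factor d_t
    prev_loss: Optional[float]  # objective value from the previous step

def scale_by_eve(beta3=0.999, c=10.0):
    """Hypothetical chainable transform that divides any base
    optimizer's updates by Eve's d_t (arXiv:1611.01505)."""
    def init_fn(params):
        del params  # the state does not depend on the parameters
        return ScaleByEveState(d=1.0, prev_loss=None)

    def update_fn(updates, state, loss):
        if state.prev_loss is None:
            d = 1.0
        else:
            # Clipped relative change of the loss, smoothed over steps
            r = abs(loss - state.prev_loss) / min(loss, state.prev_loss)
            r = min(max(r, 1.0 / c), c)
            d = beta3 * state.d + (1.0 - beta3) * r
        # Scale every leaf of the update "tree" (a dict here for brevity)
        scaled = {k: v / d for k, v in updates.items()}
        return scaled, ScaleByEveState(d=d, prev_loss=loss)

    return init_fn, update_fn
```

Because the scaling only consumes the update tree plus the current loss, it could be chained after any base transform (e.g. Adam's), which is what would make Eve applicable to arbitrary optimizers rather than Adam alone.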

@fabianp added the "needs work" label (Needs more work by the author to be merged) on Jan 22, 2024
amosyou (Contributor) commented Feb 3, 2024

Happy to help out with the wrapper version if there's still interest!

wglao (Author) commented Feb 3, 2024 via email

fabianp (Member) commented Feb 3, 2024

@amosyou: feel free to open a new PR if you want to take over this!

4 participants