Python library for descent-based optimization algorithms
In machine learning, and in statistical machine learning in particular, optimization algorithms sit at the heart of nearly every problem; to a large extent they determine the robustness and performance of the whole learning system. The sgdlib library provides a set of straightforward APIs that make these optimizers easy to use, and it also lets users plug in custom objective functions by following a uniform interface naming convention. This keeps the system flexible without exposing the internals of the core algorithms.
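As a rough illustration of what such a plug-in objective could look like, here is a minimal sketch assuming a convention in which every objective exposes `evaluate` and `gradient` methods. The class and method names are illustrative assumptions, not the actual sgdlib API:

```python
import numpy as np

class SquaredLoss:
    """Hypothetical custom objective following a uniform naming convention:
    every objective exposes evaluate(w, X, y) and gradient(w, X, y)."""

    def evaluate(self, w, X, y):
        # Mean squared error of the linear model X @ w against targets y.
        residual = X @ w - y
        return 0.5 * np.mean(residual ** 2)

    def gradient(self, w, X, y):
        # Gradient of the mean squared error with respect to w.
        residual = X @ w - y
        return X.T @ residual / X.shape[0]
```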
The overall framework includes the following optimization algorithms (a minimal SGD sketch appears after the list):
- Stochastic Gradient Descent (SGD)
- Stochastic Average Gradient (SAG)
- Stochastic Coordinate Descent (SCD)
- Truncated Gradient (TG)
- Limited-memory BFGS (LBFGS)
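To make the simplest of these methods concrete, the following self-contained sketch shows a plain mini-batch SGD loop. The function name, parameters, and the inline squared-error gradient are assumptions for illustration only, not the sgdlib interface:

```python
import numpy as np

def sgd(gradient, w0, X, y, lr=0.01, n_epochs=50, batch_size=32, seed=0):
    """Plain mini-batch stochastic gradient descent (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    w = w0.copy()
    n = X.shape[0]
    for _ in range(n_epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            # Step against the gradient estimated on the current mini-batch.
            w -= lr * gradient(w, X[idx], y[idx])
    return w

# Usage: fit a linear least-squares model to synthetic data.
squared_loss_grad = lambda w, X, y: X.T @ (X @ w - y) / X.shape[0]
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)
w_hat = sgd(squared_loss_grad, np.zeros(3), X, y)
```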