# liboptpy

A library of optimization method implementations in Python 3.
## Installation

```bash
git clone https://github.com/amkatrutsa/liboptpy.git
cd liboptpy
python setup.py install
```

or

```bash
pip install git+https://github.com/amkatrutsa/liboptpy
```
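
A quick way to check that the installation succeeded (assuming the top-level package is named `liboptpy`, as the `pip` line above suggests):

```bash
python -c "import liboptpy"
```

If this command exits without an error, the package is importable.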

## Examples

- Unconstrained smooth and non-smooth optimization
- Comparison of the projected gradient descent and Frank-Wolfe methods

## Available methods

### Unconstrained optimization problem

#### Smooth objective function

- Gradient descent (see the sketch after this list)
- Nesterov accelerated gradient descent
- Newton method and inexact (truncated) Newton method with CG as the linear solver
- Conjugate gradient method
  - for a convex quadratic function
  - for non-quadratic functions (Fletcher-Reeves method)
- Barzilai-Borwein method
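
For reference, here is a minimal, library-independent sketch of gradient descent with a constant step size in plain NumPy. The function name and defaults are illustrative, not the library's API.

```python
import numpy as np

def gradient_descent(grad, x0, alpha=1e-2, max_iter=1000, tol=1e-6):
    """Plain gradient descent: repeatedly step against the gradient."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is nearly zero
            break
        x = x - alpha * g
    return x

# Example: minimize the convex quadratic f(x) = 0.5 * x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2))
print(x, np.linalg.solve(A, b))  # the two vectors should nearly coincide
```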

#### Non-smooth objective function

- Subgradient method (see the sketch after this list)
- Dual averaging method
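
A minimal sketch of the subgradient method, using the inverse-square-root step scaled by the subgradient norm from the step-size list below; again, the names are illustrative, not the library's API.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, max_iter=5000):
    """Subgradient method with step 1 / (||g_k|| * sqrt(k)).

    Not a descent method, so the best iterate seen so far is returned.
    """
    x, x_best, f_best = x0.copy(), x0.copy(), f(x0)
    for k in range(1, max_iter + 1):
        g = subgrad(x)
        g_norm = np.linalg.norm(g)
        if g_norm == 0.0:  # a zero subgradient certifies optimality
            break
        x = x - g / (g_norm * np.sqrt(k))
        if f(x) < f_best:
            x_best, f_best = x.copy(), f(x)
    return x_best

# Example: minimize the non-smooth f(x) = ||Ax - b||_1,
# with subgradient A^T sign(Ax - b).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.ones(5)  # x = (1, ..., 1) attains the minimum value 0
f = lambda x: np.abs(A @ x - b).sum()
x = subgradient_method(f, lambda x: A.T @ np.sign(A @ x - b), np.zeros(5))
print(f(x))  # should be much smaller than f(np.zeros(5))
```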

### Constrained optimization problem

- Projected gradient method (see the sketch after this list)
- Frank-Wolfe method
- Primal barrier method
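
A minimal sketch of the projected gradient method: take a gradient step, then project back onto the feasible set. The box constraint and all names here are illustrative, not the library's API.

```python
import numpy as np

def projected_gradient(grad, project, x0, alpha=0.5, max_iter=1000):
    """Projected gradient: a gradient step followed by a projection."""
    x = project(x0)
    for _ in range(max_iter):
        x = project(x - alpha * grad(x))
    return x

# Example: minimize 0.5 * ||x - c||^2 over the box [0, 1]^3.
# The solution is simply c clipped to the box.
c = np.array([1.5, -0.3, 0.4])
project_box = lambda x: np.clip(x, 0.0, 1.0)
x = projected_gradient(lambda x: x - c, project_box, np.zeros(3))
print(x)  # expected: [1.0, 0.0, 0.4]
```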

## Available step sizes

- Constant
- Inverse of the iteration number, with a version scaled by the gradient norm
- Inverse square root of the iteration number, with a version scaled by the gradient norm
- Backtracking
  - Armijo rule (see the sketch after this list)
  - Wolfe rule
  - Strong Wolfe rule
  - Goldstein rule
- Exact line search for quadratic functions
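
For concreteness, a minimal sketch of backtracking with the Armijo (sufficient decrease) rule; the parameter names `rho` and `c1` follow common textbook convention and are not necessarily the library's.

```python
import numpy as np

def armijo_step(f, grad, x, d, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until f(x + alpha*d) <= f(x) + c1*alpha*<grad f(x), d>."""
    alpha = alpha0
    fx = f(x)
    slope = grad(x) @ d  # directional derivative; negative for a descent direction
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho  # step failed the sufficient decrease test, so shrink it
    return alpha

# Example: one step of steepest descent on f(x) = x^T x from x = (1, 1).
f = lambda x: x @ x
grad = lambda x: 2.0 * x
x = np.array([1.0, 1.0])
d = -grad(x)
print(armijo_step(f, grad, x, d))  # alpha = 0.5 satisfies the rule here
```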

## Contribution

If you find a bug, please fix it and send a pull request. If you want to add an enhancement or something new, please open an issue for discussion.

To send a pull request, follow these steps (a consolidated command sketch follows this list):

- Fork this repository
- Clone the forked repository
- Add the original repository as a remote
- Create a branch in your local repository with a specific name for your changes, e.g. `bugfix`
- Switch to this branch
- Change something that you believe makes this repository better
- Commit your changes in the `bugfix` branch with a meaningful message, e.g. "Fix typo"
- Switch to the `master` branch
- Pull new commits into `master` from this repository, not the forked one
- Switch to the `bugfix` branch
- Run `git rebase master` to bring all new commits from the original repository into the `bugfix` branch
- Push the `bugfix` branch to your forked repository as a new remote branch
- Send a pull request from your remote `bugfix` branch to the `master` branch of the original repository
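
Put together, the steps above look roughly like the following; `YOUR_USERNAME` is a placeholder for your GitHub account, and `origin`/`upstream` are just conventional remote names.

```bash
# One-time setup: fork on GitHub, then
git clone https://github.com/YOUR_USERNAME/liboptpy.git
cd liboptpy
git remote add upstream https://github.com/amkatrutsa/liboptpy.git

# Create and switch to a working branch
git checkout -b bugfix
# ... edit files ...
git commit -am "Fix typo"

# Update master from the original repository, then rebase
git checkout master
git pull upstream master
git checkout bugfix
git rebase master

# Publish the branch to your fork and open the pull request on GitHub
git push origin bugfix
```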