Rework presentation of keywords (#393)
* Unify all doc strings to one format.
kellertuer authored Aug 11, 2024
1 parent 489da96 commit c8564b8
Showing 84 changed files with 3,243 additions and 2,645 deletions.
12 changes: 12 additions & 0 deletions Changelog.md
@@ -5,6 +5,17 @@ All notable Changes to the Julia package `Manopt.jl` will be documented in this
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.4.70] WIP

### Added

* Unify doc strings and presentation of keyword arguments
  * general indexing, for example in a vector, uses `i`
  * the index for inequality constraints is unified to `i`, running from `1,...,m`
  * the index for equality constraints is unified to `j`, running from `1,...,n`
  * iterations now use `k` (see the sketch after this list)
* Doc strings are unified, and similar docstring snippets are now reused.
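
For illustration, a hedged sketch of the unified convention (not a quote from the documentation): a constrained problem now reads

```math
\operatorname*{arg\,min}_{p ∈ \mathcal M} f(p)
\quad\text{such that}\quad
g_i(p) \le 0,\ i = 1,\ldots,m,
\qquad
h_j(p) = 0,\ j = 1,\ldots,n,
```

with iterates denoted ``p^{(k)}``, ``k = 0, 1, 2, \ldots``.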

## [0.4.69] – August 3, 2024

### Changed
@@ -40,6 +51,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
* a few typos in the documentation
* `WolfePowellLinesearch` no longer uses `max_stepsize` with invalid point by default.


## [0.4.66] June 27, 2024

### Changed
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "Manopt"
uuid = "0fc0a36d-df90-57f3-8f93-d78a9fc72bb5"
authors = ["Ronny Bergmann <[email protected]>"]
version = "0.4.69"
version = "0.4.70"

[deps]
ColorSchemes = "35d6a980-a343-548e-a6ea-1d62b119f2f4"
18 changes: 14 additions & 4 deletions docs/src/about.md
@@ -3,7 +3,10 @@
Manopt.jl inherited its name from [Manopt](https://manopt.org), a Matlab toolbox for optimization on manifolds.
This Julia package was started and is currently maintained by [Ronny Bergmann](https://ronnybergmann.net/).

The following people contributed
## Contributors

Thanks to the following contributors to `Manopt.jl`:

* [Constantin Ahlmann-Eltze](https://const-ae.name) implemented the [gradient and differential `check` functions](helpers/checks.md)
* [Renée Dornig](https://github.com/r-dornig) implemented the [particle swarm](solvers/particle_swarm.md), the [Riemannian Augmented Lagrangian Method](solvers/augmented_Lagrangian_method.md), the [Exact Penalty Method](solvers/exact_penalty_method.md), as well as the [`NonmonotoneLinesearch`](@ref)
* [Willem Diepeveen](https://www.maths.cam.ac.uk/person/wd292) implemented the [primal-dual Riemannian semismooth Newton](solvers/primal_dual_semismooth_Newton.md) solver.
@@ -14,21 +17,28 @@ The following people contributed
* [Markus A. Stokkenes](https://www.linkedin.com/in/markus-a-stokkenes-b41bba17b/) contributed most of the implementation of the [Interior Point Newton Method](solvers/interior_point_Newton.md)
* [Manuel Weiss](https://scoop.iwr.uni-heidelberg.de/author/manuel-weiß/) implemented most of the [conjugate gradient update rules](@ref cg-coeffs)

as well as various [contributors](https://github.com/JuliaManifolds/Manopt.jl/graphs/contributors) providing small extensions, finding small bugs and mistakes and fixing them by opening [PR](https://github.com/JuliaManifolds/Manopt.jl/pulls)s.
as well as various [contributors](https://github.com/JuliaManifolds/Manopt.jl/graphs/contributors) providing small extensions, finding small bugs and mistakes and fixing them by opening [PR](https://github.com/JuliaManifolds/Manopt.jl/pulls)s. Thanks to all of you.

If you want to contribute a manifold or algorithm or have any questions, visit
the [GitHub repository](https://github.com/JuliaManifolds/Manopt.jl/)
to clone/fork the repository or open an issue.

## Work using Manopt.jl

* [ExponentialFamilyProjection.jl](https://github.com/ReactiveBayes/ExponentialFamilyProjection.jl) projects distributions
* [Caesar.jl](https://github.com/JuliaRobotics/Caesar.jl) within non-Gaussian factor graph inference algorithms

Is a package missing? [Open an issue](https://github.com/JuliaManifolds/Manopt.jl/issues/new)!
It would be great to collect every package and project that uses Manopt.jl here.

# Further packages
## Further packages

`Manopt.jl` belongs to the Manopt family:

* [manopt.org](https://www.manopt.org) The Matlab version of Manopt, see also their :octocat: [GitHub repository](https://github.com/NicolasBoumal/manopt)
* [pymanopt.org](https://www.pymanopt.org/) The Python version of Manopt providing also several AD backends, see also their :octocat: [GitHub repository](https://github.com/pymanopt/pymanopt)

but there are also more packages providing tools on manifolds:
but there are also more packages providing tools on manifolds in other languages:

* [Jax Geometry](https://github.com/ComputationalEvolutionaryMorphometry/jaxgeometry) (Python/Jax) for differential geometry and stochastic dynamics with deep learning
* [Geomstats](https://geomstats.github.io) (Python with several backends) focusing on statistics and machine learning :octocat: [GitHub repository](https://github.com/geomstats/geomstats)
2 changes: 2 additions & 0 deletions docs/src/notation.md
@@ -5,4 +5,6 @@ with the following additional parts.

| Symbol | Description | Also used | Comment |
|:--:|:--------------- |:--:|:-- |
| ``\operatorname{arg\,min}`` | argument of a function ``f`` where a local or global minimum is attained | | |
| ``k`` | the current iteration | ``i`` | the goal is to unify this to `k` |
| ``∇`` | The [Levi-Civita connection](https://en.wikipedia.org/wiki/Levi-Civita_connection) | | |
4 changes: 2 additions & 2 deletions docs/src/plans/debug.md
@@ -24,6 +24,6 @@ automatically available, as explained in the [`gradient_descent`](@ref) solver.

```@docs
initialize_solver!(amp::AbstractManoptProblem, dss::DebugSolverState)
step_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, i)
stop_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, i::Int)
step_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, k)
stop_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, k::Int)
```
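
A hedged usage sketch (not part of this diff): the `debug=` keyword of the high-level solver interface accepts such debug actions; `M`, `f`, `grad_f`, and `p` are assumed to be defined.

```julia
# Print the iteration number and cost every 10 iterations, plus the stopping reason at the end.
gradient_descent(M, f, grad_f, p;
    debug=[:Iteration, " | ", :Cost, "\n", 10, :Stop],
)
```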
2 changes: 1 addition & 1 deletion docs/src/plans/index.md
@@ -26,7 +26,7 @@ The following symbols are used.
| Symbol | Used in | Description |
| :----------- | :------ | :--------------------------------------------------------- |
| `:Activity` | [`DebugWhenActive`](@ref) | activity of the debug action stored within |
| `:Basepoint` | [`TangentSpace`]() | the point the tangent space is at |
| `:Basepoint` | [`TangentSpace`](@extref ManifoldsBase `ManifoldsBase.TangentSpace`) | the point the tangent space is at |
| `:Cost` | generic |the cost function (within an objective, as pass down) |
| `:Debug` | [`DebugSolverState`](@ref) | the stored `debugDictionary` |
| `:Gradient` | generic | the gradient function (within an objective, as pass down) |
4 changes: 2 additions & 2 deletions docs/src/plans/record.md
@@ -41,6 +41,6 @@ Further specific [`RecordAction`](@ref)s can be found when specific types of [`A

```@docs
initialize_solver!(amp::AbstractManoptProblem, rss::RecordSolverState)
step_solver!(p::AbstractManoptProblem, s::RecordSolverState, i)
stop_solver!(p::AbstractManoptProblem, s::RecordSolverState, i)
step_solver!(p::AbstractManoptProblem, s::RecordSolverState, k)
stop_solver!(p::AbstractManoptProblem, s::RecordSolverState, k)
```
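
A hedged usage sketch (not part of this diff), assuming `M`, `f`, `grad_f`, and `p` are defined: record values per iteration and read them back from the returned state.

```julia
s = gradient_descent(M, f, grad_f, p;
    record=[:Iteration, :Cost],  # record the iteration number and the cost value
    return_state=true,           # return the (record-wrapping) solver state, not only the iterate
)
get_record(s)                    # the recorded values, one entry per iteration
```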
2 changes: 1 addition & 1 deletion docs/src/plans/stepsize.md
@@ -33,7 +33,7 @@ Tangent bundle with the Sasaki metric has 0 injectivity radius, so the maximum s
`Hyperrectangle` also has 0 injectivity radius and an estimate based on maximum of dimensions along each index is used instead.
For manifolds with corners, however, a line search capable of handling break points along the projected search direction should be used, and such algorithms do not call `max_stepsize`.

Some solvers have a different iterate from the one used for linesearch. Then the following state can be used to wrap
Some solvers have a different iterate from the one used for the line search. Then the following state can be used to wrap
these locally

```@docs
20 changes: 10 additions & 10 deletions docs/src/solvers/DouglasRachford.md
@@ -6,44 +6,44 @@ manifolds in [BergmannPerschSteidl:2016](@cite).
The aim is to minimize the sum

```math
F(p) = f(p) + g(p)
f(p) = g(p) + h(p)
```

on a manifold, where the two summands have proximal maps
``\operatorname{prox}_{λ f}, \operatorname{prox}_{λ g}`` that are easy
``\operatorname{prox}_{λ g}, \operatorname{prox}_{λ h}`` that are easy
to evaluate (maybe in closed form, or not too costly to approximate).
Further, define the reflection operator at the proximal map as

```math
\operatorname{refl}_{λ f}(p) = \operatorname{retr}_{\operatorname{prox}_{λ f}(p)} \bigl( -\operatorname{retr}^{-1}_{\operatorname{prox}_{λ f}(p)} p \bigr).
\operatorname{refl}_{λ g}(p) = \operatorname{retr}_{\operatorname{prox}_{λ g}(p)} \bigl( -\operatorname{retr}^{-1}_{\operatorname{prox}_{λ g}(p)} p \bigr).
```

Let ``\alpha_k ∈ [0,1]`` with ``\sum_{k ∈ ℕ} \alpha_k(1-\alpha_k) = \infty``
and ``λ > 0`` (which might depend on iteration ``k`` as well) be given.

Then the (P)DRA algorithm for initial data ``x_0 ∈ \mathcal H`` as
Then the (P)DRA algorithm for initial data ``p^{(0)} ∈ \mathcal M`` reads as follows.

## Initialization

Initialize ``t_0 = x_0`` and ``k=0``
Initialize ``q^{(0)} = p^{(0)}`` and ``k=0``

## Iteration

Repeat until a convergence criterion is reached

1. Compute ``s_k = \operatorname{refl}_{λ f}\operatorname{refl}_{λ g}(t_k)``
2. Within that operation, store ``p_{k+1} = \operatorname{prox}_{λ g}(t_k)`` which is the prox the inner reflection reflects at.
3. Compute ``t_{k+1} = g(\alpha_k; t_k, s_k)``, where ``g`` is a curve approximating the shortest geodesic, provided by a retraction and its inverse
1. Compute ``r^{(k)} = \operatorname{refl}_{λ g}\operatorname{refl}_{λ h}(q^{(k)})``
2. Within that operation, store ``p^{(k+1)} = \operatorname{prox}_{λ h}(q^{(k)})`` which is the prox the inner reflection reflects at.
3. Compute ``q^{(k+1)} = g(\alpha_k; q^{(k)}, r^{(k)})``, where ``g`` is a curve approximating the shortest geodesic, provided by a retraction and its inverse
4. Set ``k = k+1``

## Result

The result is given by the last computed ``p_K``.
The result is given by the last computed ``p^{(K)}``, where ``K`` denotes the last iteration (see also the sketch below).

For the parallel version, the first proximal map is a vectorial version where
in each component one prox is applied to the corresponding copy of ``t_k`` and
the second proximal map corresponds to the indicator function of the set,
where all copies are equal (in ``\mathcal H^n``, where ``n`` is the number of copies),
where all copies are equal (in ``\mathcal M^n``, where ``n`` is the number of copies),
leading to the second prox being the Riemannian mean.
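
The following is a rough, hedged sketch of the serial iteration above in Julia, not the Manopt.jl implementation; `prox_g` and `prox_h` are hypothetical user-provided proximal maps with signature `(M, λ, p) -> point`.

```julia
using Manopt, Manifolds

function douglas_rachford_sketch(M, prox_g, prox_h, p0; λ=1.0, α=0.5, iterations=100)
    q = copy(M, p0)   # q^{(k)}, the auxiliary iterate
    p = copy(M, p0)   # p^{(k)}, the iterate that is returned
    for k in 1:iterations
        p = prox_h(M, λ, q)                   # prox_{λh}(q^{(k)}), which is also p^{(k+1)}
        b = reflect(M, p, q)                  # refl_{λh}(q^{(k)}), reflection at the inner prox
        r = reflect(M, prox_g(M, λ, b), b)    # r^{(k)} = refl_{λg}(refl_{λh}(q^{(k)}))
        X = inverse_retract(M, q, r)          # direction from q^{(k)} towards r^{(k)}
        q = retract(M, q, α * X)              # q^{(k+1)} ≈ g(α; q^{(k)}, r^{(k)})
    end
    return p
end
```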

## Interface
2 changes: 1 addition & 1 deletion docs/src/solvers/adaptive-regularization-with-cubics.md
@@ -64,7 +64,7 @@ of a manifold to be available
* By default the tangent vector storing the gradient is initialized calling [`zero_vector`](@extref `ManifoldsBase.zero_vector-Tuple{AbstractManifold, Any}`)`(M,p)`.
* [`inner`](@extref `ManifoldsBase.inner-Tuple{AbstractManifold, Any, Any, Any}`)`(M, p, X, Y)` is used within the algorithm step

Furthermore, within the Lanczos subsolver, generating a random vector (at `p`) using [`rand!`](@extref Base.rand-Tuple{AbstractManifold})(M, X; vector_at=p)` in place of `X` is required
Furthermore, within the Lanczos subsolver, generating a random vector (at `p`) using [`rand!`](@extref Base.rand-Tuple{AbstractManifold})`(M, X; vector_at=p)` in place of `X` is required
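
Spelled out, these requirements amount to the following calls (a sketch; `M` and `p` are assumed to be given):

```julia
X = zero_vector(M, p)       # tangent vector that stores the gradient
v = inner(M, p, X, X)       # inner product used within the algorithm step
rand!(M, X; vector_at=p)    # random tangent vector at p for the Lanczos subsolver
```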

## Literature

10 changes: 8 additions & 2 deletions docs/src/solvers/conjugate_residual.md
@@ -1,4 +1,4 @@
# Conjugate Residual Solver in a Tangent space
# Conjugate residual solver in a Tangent space

```@meta
CurrentModule = Manopt
@@ -14,7 +14,7 @@ conjugate_residual
ConjugateResidualState
```

## Objetive
## Objective

```@docs
SymmetricLinearSystemObjective
@@ -26,6 +26,12 @@ SymmetricLinearSystemObjective
StopWhenRelativeResidualLess
```

## Internal functions

```@docs
Manopt.get_b
```

## Literature

```@bibliography
2 changes: 1 addition & 1 deletion docs/src/solvers/interior_point_Newton.md
@@ -1,4 +1,4 @@
# Interior Point Newton method
# Interior point Newton method

```@meta
CurrentModule = Manopt
10 changes: 10 additions & 0 deletions docs/styles/config/vocabularies/Manopt/accept.txt
@@ -1,3 +1,10 @@
_field_.*\b
_arg_.*\b
_kw_.*\b
_l_.*\b
_math_.*\b
_problem_.*\b
_doc_.*\b
Absil
Adagrad
[A|a]djoint
@@ -62,6 +69,7 @@ Lui
Manifolds.jl
ManifoldsBase.jl
[Mm]anopt(:?.org|.jl)?
Markus
Marquardt
Moakher
Munkvold
@@ -90,6 +98,7 @@ Riemer
Riemopt
Riesz
Rosenbrock
Sasaki
semicontinuous
Steihaug
Stiefel
@@ -98,6 +107,7 @@ Souza
Steidl
Stephansen
[Ss]tepsize
Stokkenes
[Ss]ubdifferential
[Ss]ubgradient
subsampled
14 changes: 9 additions & 5 deletions ext/ManoptLRUCacheExt.jl
@@ -27,11 +27,15 @@ Given a vector of symbols `caches`, this function sets up the
# Keyword arguments
* `p`: (`rand(M)`) a point on a manifold, to both infer its type for keys and initialize caches
* `value`: (`0.0`) a value both typing and initialising number-caches, the default is for (Float) values like the cost.
* `X`: (`zero_vector(M, p)` a tangent vector at `p` to both type and initialize tangent vector caches
* `cache_size`: (`10`) a default cache size to use
* `cache_sizes`: (`Dict{Symbol,Int}()`) a dictionary of sizes for the `caches` to specify different (non-default) sizes
* `p=`$(Manopt._link_rand()): a point on a manifold, to both infer its type for keys and initialize caches
* `value=0.0`:
  a value used both for typing and for initialising number caches; the default is for (Float) values like the cost.
* `X=zero_vector(M, p)`:
a tangent vector at `p` to both type and initialize tangent vector caches
* `cache_size=10`:
a default cache size to use
* `cache_sizes=Dict{Symbol,Int}()`:
a dictionary of sizes for the `caches` to specify different (non-default) sizes
"""
function Manopt.init_caches(
M::AbstractManifold,
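
A hedged usage sketch of the keywords documented in the `init_caches` docstring above; the positional arguments (the symbols to cache and the `LRU` type from LRUCache.jl) are assumed from context rather than quoted from this diff.

```julia
using Manopt, Manifolds, LRUCache

M = Sphere(2)
p = rand(M)
# assumed call form: init_caches(M, caches, ::Type{LRU}; keywords as documented above)
caches = Manopt.init_caches(M, [:Cost, :Gradient], LRU;
    p=p, cache_size=10, cache_sizes=Dict(:Gradient => 25),
)
```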
2 changes: 1 addition & 1 deletion ext/ManoptLineSearchesExt.jl
@@ -15,7 +15,7 @@ end
function (cs::Manopt.LineSearchesStepsize)(
mp::AbstractManoptProblem,
s::AbstractManoptSolverState,
i::Int,
k::Int,
η=-get_gradient(s);
fp=get_cost(mp, get_iterate(s)),
kwargs...,
6 changes: 6 additions & 0 deletions ext/ManoptManifoldsExt/ManoptManifoldsExt.jl
@@ -2,6 +2,12 @@ module ManoptManifoldsExt

using ManifoldsBase: exp, log, ParallelTransport, vector_transport_to
using Manopt
using Manopt:
_l_refl,
_l_retr,
_kw_retraction_method_default,
_kw_inverse_retraction_method_default,
_kw_X_default
import Manopt:
max_stepsize,
alternating_gradient_descent,
10 changes: 5 additions & 5 deletions ext/ManoptManifoldsExt/alternating_gradient.jl
@@ -16,18 +16,18 @@ function get_gradient(
end

@doc raw"""
X = get_gradient(M::AbstractManifold, p::ManifoldAlternatingGradientObjective, p, k)
get_gradient!(M::AbstractManifold, p::ManifoldAlternatingGradientObjective, X, p, k)
X = get_gradient(M::AbstractManifold, p::ManifoldAlternatingGradientObjective, p, i)
get_gradient!(M::AbstractManifold, p::ManifoldAlternatingGradientObjective, X, p, i)
Evaluate one of the component gradients ``\operatorname{grad}f_k``, ``k\{1,…,n\}``, at `x` (in place of `Y`).
Evaluate one of the component gradients ``\operatorname{grad}f_i``, ``i ∈ \{1,…,n\}``, at `p` (in place of `X`).
"""
function get_gradient(
M::ProductManifold,
mago::ManifoldAlternatingGradientObjective{AllocatingEvaluation,TC,<:Function},
p,
k,
i,
) where {TC}
return get_gradient(M, mago, p)[M, k]
return get_gradient(M, mago, p)[M, i]
end
function get_gradient!(
M::AbstractManifold,
21 changes: 12 additions & 9 deletions ext/ManoptManifoldsExt/manifold_functions.jl
@@ -108,28 +108,31 @@ function reflect!(M::AbstractManifold, q, pr::Function, x; kwargs...)
return reflect!(M, q, pr(x), x; kwargs...)
end

@doc raw"""
@doc """
reflect(M, p, x, kwargs...)
reflect!(M, q, p, x, kwargs...)
Reflect the point `x` from the manifold `M` at point `p`, given by
````math
\operatorname{refl}_p(x) = \operatorname{retr}_p(-\operatorname{retr}^{-1}_p x).
````
```math
$_l_refl
```
where ``\operatorname{retr}`` and ``\operatorname{retr}^{-1}`` denote a retraction and an inverse
where ``$_l_retr`` and ``$_l_retr^{-1}`` denote a retraction and an inverse
retraction, respectively.
This can also be done in place of `q`.
## Keyword arguments
* `retraction_method`: (`default_retraction_metiod(M, typeof(p))`) the retraction to use in the reflection
* `inverse_retraction_method`: (`default_inverse_retraction_method(M, typeof(p))`) the inverse retraction to use within the reflection
* $_kw_retraction_method_default
the retraction to use in the reflection
* $_kw_inverse_retraction_method_default
the inverse retraction to use within the reflection
and for the `reflect!` additionally
* `X`: (`zero_vector(M,p)`) a temporary memory to compute the inverse retraction in place.
* $_kw_X_default
a temporary memory to compute the inverse retraction in place.
otherwise this is the memory that would be allocated anyways.
"""
function reflect(
@@ -149,7 +152,7 @@ function reflect!(
q,
p,
x;
retraction_method=default_retraction_method(M),
retraction_method=default_retraction_method(M, typeof(p)),
inverse_retraction_method=default_inverse_retraction_method(M),
X=zero_vector(M, p),
)
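
As a usage note for the `reflect` docstring above, a minimal sketch on the sphere (the manifold and points are illustrative; the keyword defaults are the documented ones):

```julia
using Manopt, Manifolds

M = Sphere(2)
p = [1.0, 0.0, 0.0]
x = [0.0, 1.0, 0.0]
q = reflect(M, p, x)                       # reflect x at p with the default (inverse) retraction
reflect!(M, q, p, x; X=zero_vector(M, p))  # in-place variant reusing preallocated memory
```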