This repository contains the code to reproduce the experiments of the paper Mirror and Preconditioned Gradient Descent in Wasserstein Space. In this paper, we propose to minimize functionals over the space of probability measures through Mirror Descent and Preconditioned Gradient Descent schemes. As the problem of minimizing functionals on the Wasserstein space encompasses many applications in machine learning, different optimization algorithms on this space have received much attention.
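As a rough illustration of the particle-level schemes, here is a minimal, self-contained sketch of a mirror descent step x <- grad(phi*)(grad(phi)(x) - tau * grad F(x)) in jax. The toy interaction functional, the entropic mirror map phi(x) = sum_i x_i log x_i, and the step size are illustrative assumptions only, not the functionals or Bregman potentials used in the experiments.

```python
import jax
import jax.numpy as jnp

def interaction_energy(x):
    # Toy discretized interaction functional: 0.5 * mean_{i,j} ||x_i - x_j||^2.
    diffs = x[:, None, :] - x[None, :, :]
    return 0.5 * jnp.mean(jnp.sum(diffs ** 2, axis=-1))

def mirror_descent_step(x, tau):
    # With the entropic mirror map, grad(phi)(x) = log(x) + 1 and
    # grad(phi*)(y) = exp(y - 1), so the mirror descent update
    # x <- grad(phi*)(grad(phi)(x) - tau * g) reduces to x * exp(-tau * g).
    g = jax.grad(interaction_energy)(x)
    return x * jnp.exp(-tau * g)

key = jax.random.PRNGKey(0)
x = jax.random.uniform(key, (50, 2)) + 0.1  # positive particles, as phi requires x > 0
for _ in range(100):
    x = mirror_descent_step(x, tau=0.5)
```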
If you find this work useful, please cite it as:

```
@inproceedings{bonet2024mirror,
    title={Mirror and Preconditioned Gradient Descent in Wasserstein Space},
    author={Clément Bonet and Théo Uscidda and Adam David and Pierre-Cyril Aubin-Frankowski and Anna Korba},
    year={2024},
    booktitle={Thirty-eighth Conference on Neural Information Processing Systems}
}
```
The experiments can be reproduced as follows:
- Figure 1 can be reproduced by running the notebook "MD_mirror_interaction.ipynb" in the folder xps_interaction.
- Figure 2 can be reproduced by first running "xps.sh" in the folder xps_Gaussians, and then running the notebook "Results_Gaussian.ipynb".
- The experiment on the simplex of Appendix G is available in the notebook "MD - Dirichlet Posterior.ipynb".
The code requires the following packages:
- jax, jaxopt
- ott
- python-ternary (for the Dirichlet experiment)
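As a quick sanity check, the dependencies listed above can be imported as in the sketch below (module layouts may vary across package versions):

```python
# Minimal import check for the dependencies listed above.
import jax
import jaxopt
import ott
import ternary  # the python-ternary package is imported as `ternary`
```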
Parts of the code were adapted from other sources:
- Some code from https://implicit-layers-tutorial.org/implicit_functions/ was used for the Newton solver in jax (a schematic Newton iteration is sketched below).
- For the gradient of the MMD with the Riesz kernel, part of the code was adapted from the sliced_MMD_flows repository (see the autodiff sketch below).
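For reference, a schematic Newton root-finder in jax might look as follows. This is a minimal sketch written from scratch, not the code adapted from the implicit-layers tutorial; `newton_solve`, the iteration count, and the quadratic test function are illustrative choices.

```python
import jax
import jax.numpy as jnp

def newton_solve(f, x0, num_iters=20):
    # Newton iterations x <- x - J(x)^{-1} f(x) to find a root of f.
    def step(x, _):
        J = jax.jacobian(f)(x)
        return x - jnp.linalg.solve(J, f(x)), None
    x, _ = jax.lax.scan(step, x0, None, length=num_iters)
    return x

# Example: coordinate-wise square roots as roots of f(x) = x**2 - c.
c = jnp.array([2.0, 3.0])
print(newton_solve(lambda x: x ** 2 - c, jnp.ones(2)))  # ~[1.414, 1.732]
```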
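Similarly, the MMD with the Riesz (energy) kernel k(a, b) = -||a - b|| and its gradient with respect to the particles can be obtained by autodiff, as in the minimal sketch below. This is independent of the sliced_MMD_flows code; the small `eps` smoothing is an illustrative choice to keep the norm differentiable at zero.

```python
import jax
import jax.numpy as jnp

def riesz_mmd2(x, y, eps=1e-12):
    # Squared MMD between empirical measures with the Riesz kernel k(a, b) = -||a - b||.
    def kernel_mean(a, b):
        d = a[:, None, :] - b[None, :, :]
        return jnp.mean(-jnp.sqrt(jnp.sum(d ** 2, axis=-1) + eps))
    return kernel_mean(x, x) - 2.0 * kernel_mean(x, y) + kernel_mean(y, y)

# Gradient of the squared MMD with respect to the particles x.
grad_mmd = jax.grad(riesz_mmd2, argnums=0)
```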