Fast Bayesian optimization, quadrature, and inference over arbitrary domains (discrete and mixed spaces) with GPU-parallel acceleration, built on GPyTorch and BoTorch. The paper is available on arXiv.
While the existing method (batch Thompson sampling; TS) gets stuck in local minima, SOBER robustly finds the global optimum.
SOBER provides a faster, more sample-efficient, more diversified, and more scalable optimization scheme than existing methods.
In the paper, SOBER outperformed 11 competitive baselines on 12 synthetic and diverse real-world tasks.
- Red star: ground truth
- Black crosses: next batch queries recommended by SOBER
- White dots: historical observations
- Branin function: black-box function to maximise
- $\pi$: the probability of global optimum locations estimated by SOBER
- Fast batch Bayesian optimization
- Fast batch Bayesian quadrature
- Fast Bayesian inference
- Fast fully Bayesian Gaussian process modelling and related acquisition functions
- Sample-efficient simulation-based inference
- Massively parallel active learning
- GPU acceleration
- Arbitrary domain space (continuous, discrete, mixed, or the domain given as a dataset; see the sketch after this list)
- Arbitrary kernel for surrogate modelling
- Arbitrary acquisition function
- Arbitrary prior distribution for Bayesian inference
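To make the "domain given as a dataset" idea concrete, here is a minimal pool-based sketch in plain BoTorch/GPyTorch (this is illustrative, not SOBER's own API; the pool construction and toy objective are our assumptions):

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood

# Hypothetical mixed domain given as a dataset (a finite candidate pool):
# column 0 is continuous in [0, 1], column 1 is a discrete level in {0, 1, 2}.
pool = torch.rand(1000, 2, dtype=torch.double)
pool[:, 1] = torch.randint(0, 3, (1000,)).to(torch.double)

# A few initial observations drawn from the pool (toy objective for illustration).
idx = torch.randperm(1000)[:10]
train_x = pool[idx]
train_y = -((train_x - 0.5) ** 2).sum(dim=-1, keepdim=True)

# Fit a GP surrogate; any GPyTorch kernel could be swapped in here.
gp = SingleTaskGP(train_x, train_y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

# Posterior over the whole pool; a pool-based batch method such as SOBER
# selects its diversified batch from exactly this kind of candidate set.
with torch.no_grad():
    posterior = gp.posterior(pool)
    mean, variance = posterior.mean, posterior.variance
```

Because the domain is just a tensor of candidates, discrete and mixed spaces need no special relaxation tricks.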
We have prepared detailed explanations of how to customise SOBER for your tasks. See the tutorials:
- 00 Quick start
- 01 How does SOBER work?
- 02 Customise prior for various domain types
- 03 Customise acquisition function
- 04 Fast fully Bayesian Gaussian process modelling
- 05 Fast Bayesian inference for simulation-based inference
- 06 Tips for drug discovery
- 07 Compare with Thompson sampling (see the baseline sketch after this list)
- 08 Benchmarking against batch BO methods
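As background for tutorial 07, the sketch below shows the batch Thompson sampling baseline using BoTorch's MaxPosteriorSampling (the pool, model, and batch size are our illustrative assumptions, not SOBER code):

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.generation import MaxPosteriorSampling
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy setup: a discrete candidate pool and a handful of observations.
pool = torch.rand(2000, 2, dtype=torch.double)
train_x = pool[:10]
train_y = torch.sin(train_x.sum(dim=-1, keepdim=True))

gp = SingleTaskGP(train_x, train_y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

# Batch TS: each batch member is the argmax of one posterior sample over the pool.
ts = MaxPosteriorSampling(model=gp, replacement=False)
with torch.no_grad():
    X_next = ts(pool, num_samples=16)  # next batch of 16 queries
```

Since each batch member is an independent posterior-sample argmax, the whole batch can pile onto a single mode, which is the local-minima failure mode mentioned above; SOBER's quadrature-based selection spreads the batch by construction.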
See the examples for reproducing the results in the paper.
Please download the newest .whl file from Releases.

If you wish to build from source, git clone the repository and run the following commands in the top folder:

```bash
pip install build
python -m build
```

You will find the packaged library in the dist folder.
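You can then install the built wheel directly; the exact filename depends on the version, so a glob is shown here as an assumption:

```bash
# Install whichever wheel the build step produced (version suffix varies).
pip install dist/*.whl
```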
We solve batch global optimization as Bayesian quadrature: we select the batch query locations to minimize the integration error of the true function.
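A minimal sketch of this quadrature view, in standard kernel-quadrature notation (the kernel $k$, nodes $x_i$, weights $w_i$, and RKHS $\mathcal{H}_k$ are our illustrative symbols): with $\pi$ the probability measure over global-optimum locations, the worst-case integration error over the unit ball of $\mathcal{H}_k$ equals the maximum mean discrepancy (MMD) between $\pi$ and the weighted batch:

$$
\sup_{\|f\|_{\mathcal{H}_k} \le 1} \left| \int f(x)\,\mathrm{d}\pi(x) \;-\; \sum_{i=1}^{n} w_i f(x_i) \right| \;=\; \mathrm{MMD}_k\!\left(\pi,\; \sum_{i=1}^{n} w_i \delta_{x_i}\right),
$$

so choosing the batch $\{x_i\}_{i=1}^{n}$ and weights $\{w_i\}$ to minimize this MMD yields queries that are simultaneously informative and diversified.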
- PyTorch
- GPyTorch
- BoTorch
This repository builds on materials from the following public and kindly provided code. We thank the respective repository maintainers:
- BASQ: Adachi, M., Hayakawa, S., Jørgensen, M., Oberhauser, H., Osborne, M. A., Fast Bayesian Inference with Batch Bayesian Quadrature via Kernel Recombination. Advances in Neural Information Processing Systems, 35 (NeurIPS 2022) code, paper
- RCHQ: Hayakawa, S., Oberhauser, H., Lyons, T., Positively Weighted Kernel Quadrature via Subsampling. Advances in Neural Information Processing Systems, 35 (NeurIPS 2022) code, paper
- Thompson sampling: Kandasamy, K., Krishnamurthy, A., Schneider, J. and Póczos, B., Parallelised Bayesian optimisation via Thompson sampling. International Conference on Artificial Intelligence and Statistics (AISTATS 2018) code from BoTorch, paper
- Decoupled Thompson sampling: Wilson, J., Borovitskiy, V., Terenin, A., Mostowsky, P. and Deisenroth, M., Efficiently sampling functions from Gaussian process posteriors. International Conference on Machine Learning (ICML 2020) code from @saitcakmak, paper
- Determinantal Point Process Thompson sampling (DPP-TS): Nava, E., Mutny, M. and Krause, A., Diversified sampling for batched Bayesian optimization with determinantal point processes. International Conference on Artificial Intelligence and Statistics (AISTATS 2022). We appreciate the paper authors providing the code and allowing us to open-source it here @elvisnava, paper
- GIBBON: Moss, H. B., Leslie, D. S., Gonzalez, J. and Rayson, P., GIBBON: General-purpose information-based Bayesian optimisation. Journal of Machine Learning Research, 22(1) (JMLR 2021) code from BoTorch, paper
- TuRBO: Eriksson, D., Pearce, M., Gardner, J. R., Turner, R. and Poloczek, M., Scalable global optimization via local Bayesian optimization. Advances in Neural Information Processing Systems, 32 (NeurIPS 2019) code from BoTorch, paper
Please cite this work as:
```bibtex
@article{adachi2024a,
  title={A Quadrature Approach for General-Purpose Batch Bayesian Optimization via Probabilistic Lifting},
  author={Adachi, Masaki and Hayakawa, Satoshi and J{\o}rgensen, Martin and Hamid, Saad and Oberhauser, Harald and Osborne, Michael A},
  journal={arXiv preprint arXiv:2404.12219},
  year={2024}
}
```