Important
The TPE optimizer used for the experiments is now available at OptunaHub. This repository is no longer maintained, so you may have a hard time installing it. If you just want to use my TPE optimizer for your experiments, please use the version registered at OptunaHub. I added an example for HPOLib in examples. You can run the example with the following:
# dataset_id can take from 0 to 3.
$ python examples/example_hpolib.py --dataset_id 0
This package is an example implementation of the tree-structured Parzen estimator (TPE).
TPE is a hyperparameter optimization (HPO) method introduced in Algorithms for Hyper-Parameter Optimization.
NOTE: The sampling strategy is based on the BOHB implementation.
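To give an intuition for what TPE does, here is a minimal, illustrative NumPy-only sketch (not this package's actual implementation): observations are split into "good" and "bad" groups by a quantile of their losses, a kernel density is fit to each group, and the next configuration is the candidate maximizing the density ratio l(x)/g(x). The function names, the fixed bandwidth, and the 1D restriction are all simplifications for illustration.

```python
import numpy as np


def gaussian_kde_pdf(x: np.ndarray, data: np.ndarray, bandwidth: float) -> np.ndarray:
    # 1D kernel density estimate: average of Gaussians centered at each observation.
    diffs = (x[:, None] - data[None, :]) / bandwidth
    return np.exp(-0.5 * diffs**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))


def tpe_suggest_1d(xs, losses, n_candidates: int = 24, gamma: float = 0.25, rng=None):
    """Suggest the next x by maximizing l(x) / g(x) (illustrative sketch only)."""
    rng = np.random.default_rng() if rng is None else rng
    xs, losses = np.asarray(xs, dtype=float), np.asarray(losses, dtype=float)

    # Split observations into the best gamma-fraction ("good") and the rest ("bad").
    order = np.argsort(losses)
    n_good = max(1, int(np.ceil(gamma * len(xs))))
    good, bad = xs[order[:n_good]], xs[order[n_good:]]

    bw = max(xs.std(), 1e-3)  # crude common bandwidth; real TPE tunes this per dimension

    # Draw candidates from the "good" density, then score them by the density ratio.
    cands = rng.choice(good, size=n_candidates) + rng.normal(0.0, bw, n_candidates)
    l = gaussian_kde_pdf(cands, good, bw)
    g = gaussian_kde_pdf(cands, bad if len(bad) else good, bw)
    return cands[np.argmax(l / (g + 1e-12))]
```

The actual implementation in this package handles multivariate spaces, integer and categorical hyperparameters, and BOHB-style bandwidth selection; this sketch only conveys the good/bad split and density-ratio acquisition.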
This package requires Python 3.8 or later, and can be installed with:
pip install tpe
The optimization of the 10D sphere function can be executed as follows:
from __future__ import annotations
import time
import ConfigSpace as CS
import ConfigSpace.hyperparameters as CSH
import numpy as np
from tpe.optimizer import TPEOptimizer
def sphere(eval_config: dict[str, float]) -> tuple[dict[str, float], float]:
    start = time.time()
    vals = np.array(list(eval_config.values()))
    vals *= vals
    return {"loss": np.sum(vals)}, time.time() - start


if __name__ == "__main__":
    dim = 10
    cs = CS.ConfigurationSpace()
    for d in range(dim):
        cs.add_hyperparameter(CSH.UniformFloatHyperparameter(f"x{d}", lower=-5, upper=5))

    opt = TPEOptimizer(obj_func=sphere, config_space=cs, resultfile='sphere')
    # If you do not want to do logging, remove the `logger_name` argument
    print(opt.optimize(logger_name="sphere"))
The documentation of ConfigSpace is available here.
Please cite the following paper when using my implementation:
@article{watanabe2023tpe,
    title   = {Tree-structured {P}arzen estimator: Understanding its algorithm components and their roles for better empirical performance},
    author  = {S. Watanabe},
    journal = {arXiv preprint arXiv:2304.11127},
    year    = {2023}
}