Go forth!
- Support early stopping of the optimization loop.
- Benchmarking scripts to evaluate the performance of different surrogate models.
- Support for parallel evaluations of the objective function via several constant liar strategies.
- `BayesSearchCV` as a drop-in replacement for scikit-learn's `GridSearchCV`.
- New acquisition functions `"EIps"` and `"PIps"` that take into account function compute time.
- Fixed inference of dimensions of type Real.
- Changed the interface of `GradientBoostingQuantileRegressor`'s `predict` method to match the return type of other regressors.
- Dimensions of type Real are now inclusive of the upper bound.
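A constant liar strategy proposes a whole batch of points before any of them is evaluated: each pending point is assigned a fake ("lie") objective value, typically the best value seen so far, and the next proposal is made as if that evaluation had already returned. The sketch below illustrates the mechanism in plain Python; the 1-nearest-neighbour surrogate and the distance-based exploration bonus are illustrative assumptions, not skopt's implementation.

```python
import random

def propose_batch(observed_x, observed_y, n_points, seed=0):
    """Propose n_points in [0, 1] with a cl_min-style constant liar:
    each proposed point is appended with a fake objective value equal
    to the best (minimum) value seen so far, so later proposals behave
    as if the pending evaluations had already completed."""
    rng = random.Random(seed)
    xs, ys = list(observed_x), list(observed_y)
    batch = []
    for _ in range(n_points):
        def score(c):
            # Toy LCB-style acquisition: predicted value is the y of the
            # nearest known (or lied-about) point, minus an exploration
            # bonus that grows with the distance to it.
            d, y = min((abs(c - x), y) for x, y in zip(xs, ys))
            return y - 2.0 * d
        candidates = [rng.uniform(0.0, 1.0) for _ in range(200)]
        best = min(candidates, key=score)
        batch.append(best)
        xs.append(best)
        ys.append(min(ys))  # the lie: pretend it matched the best value
    return batch

batch = propose_batch([0.2, 0.8], [1.0, 0.5], n_points=3)
```

Because each lie makes the neighbourhood of a pending point look already explored, the batch spreads out instead of collapsing onto a single promising spot.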
Third time's a charm.
- Accuracy improvements of the optimization of the acquisition function by pre-selecting good candidates as starting points when using `acq_optimizer='lbfgs'`.
- Support an ask-and-tell interface. Check out the `Optimizer` class if you need fine-grained control over the iterations.
- Parallelize L-BFGS minimization runs over the acquisition function.
- Implement a weighted Hamming distance kernel for problems with only categorical dimensions.
- New acquisition function `gp_hedge` that probabilistically chooses one of `EI`, `PI` or `LCB` at every iteration depending upon the cumulative gain.
- Warnings are now raised if a point is chosen as the candidate optimum multiple times.
- Fixed infinite gradients that were raised in the kernel gradient computation.
- Integer dimensions are now normalized to [0, 1] internally in `gp_minimize`.
- The default `acq_optimizer` function has changed from `"auto"` to `"lbfgs"` in `gp_minimize`.
- Speed improvements when using `gp_minimize` with `acq_optimizer='lbfgs'` and `acq_optimizer='auto'` when all the search-space dimensions are Real.
- Persistence of minimization results using `skopt.dump` and `skopt.load`.
- Support for using arbitrary estimators that implement a `return_std` argument in their `predict` method by means of `base_minimize` from `skopt.optimizer`.
- Support for tuning noise in `gp_minimize` using the `noise` argument.
- `TimerCallback` in `skopt.callbacks` to log the time between iterations of the minimization loop.
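The ask-and-tell pattern separates proposing a point (`ask`) from reporting its objective value (`tell`), so the caller controls exactly when and where evaluations happen. The toy optimizer below, a random search whose sampling radius shrinks around the incumbent, is only a sketch of that control flow, not of skopt's `Optimizer` itself.

```python
import random

class ToyOptimizer:
    """Minimal ask/tell loop: ask() proposes a point, tell() feeds the
    observation back so the internal state can be updated. The search
    strategy here is a shrinking random search, chosen for brevity."""

    def __init__(self, low, high, seed=0):
        self.low, self.high = low, high
        self.rng = random.Random(seed)
        self.best_x, self.best_y = None, float("inf")
        self.radius = (high - low) / 2.0

    def ask(self):
        if self.best_x is None:
            return self.rng.uniform(self.low, self.high)
        x = self.best_x + self.rng.uniform(-self.radius, self.radius)
        return min(max(x, self.low), self.high)

    def tell(self, x, y):
        if y < self.best_y:
            self.best_x, self.best_y = x, y
        self.radius *= 0.95  # contract the search around the incumbent

opt = ToyOptimizer(-2.0, 2.0)
for _ in range(60):
    x = opt.ask()                 # the caller decides when to evaluate
    opt.tell(x, (x - 1.0) ** 2)   # and reports the result back
```

Because the loop is driven from the outside, evaluations can be scheduled, parallelized, or interleaved with other work between `ask` and `tell`.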
First light!
- Bayesian optimization via `gp_minimize`.
- Tree-based sequential model-based optimization via `forest_minimize` and `gbrt_minimize`, with support for multi-threading.
- Support of LCB, EI and PI as acquisition functions.
- Plotting functions for inspecting convergence, evaluations and the objective function.
- API for specifying and sampling from a parameter space.
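For a surrogate that predicts a mean `mu` and standard deviation `sigma` at a candidate point, the three acquisition functions listed above can be written directly from their standard definitions (for minimization). The `kappa` and `xi` trade-off defaults below are conventional values, not necessarily skopt's.

```python
import math

def _norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def _norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def lcb(mu, sigma, kappa=1.96):
    # Lower confidence bound: optimistic estimate of the minimum.
    return mu - kappa * sigma

def pi(mu, sigma, best_y, xi=0.01):
    # Probability of improving on the best observed value.
    z = (best_y - xi - mu) / sigma
    return _norm_cdf(z)

def ei(mu, sigma, best_y, xi=0.01):
    # Expected improvement over the best observed value.
    z = (best_y - xi - mu) / sigma
    return (best_y - xi - mu) * _norm_cdf(z) + sigma * _norm_pdf(z)
```

A point with a lower predicted mean or a higher predicted uncertainty scores better under all three, which is the exploration/exploitation trade-off these functions encode.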
See AUTHORS.md.