Releases
v0.9.0
relf released this 01 Mar 21:36
- Mixture of Experts improvements (#282 thanks @jbussemaker, #283):
  - add a variance prediction API (i.e. `predict_variances()`), enabled when the `variances_support` option is set
  - add the `MOESurrogateModel` class, which adapts `MOE` to the `SurrogateModel` interface
  - allow selection of the experts that are part of the mixture (see the `allow`/`deny` options):
    - `MOE.AVAILABLE_EXPERTS` lists all possible experts
    - the `enabled_experts` property of an `MOE` instance lists the possible experts with respect to the `derivatives`/`variances_support` and `allow`/`deny` options
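As a conceptual illustration of variance prediction over a mixture (this is a hedged numpy sketch of the idea, not SMT's `MOE` implementation; the expert functions and weights below are made up for the example):

```python
import numpy as np

# Two hypothetical "experts", each returning a mean and a variance.
def expert_linear(x):
    return 2.0 * x, np.full_like(x, 0.01)

def expert_quadratic(x):
    return x ** 2, np.full_like(x, 0.04)

def moe_predict(x, weights):
    """Combine expert means/variances with soft weights (law of total variance)."""
    outputs = [expert_linear(x), expert_quadratic(x)]
    means = np.stack([m for m, _ in outputs])
    varis = np.stack([v for _, v in outputs])
    w = np.asarray(weights).reshape(-1, 1)
    mean = (w * means).sum(axis=0)
    # total variance = E[variance] + variance of the expert means
    var = (w * (varis + means ** 2)).sum(axis=0) - mean ** 2
    return mean, var

x = np.linspace(0.0, 1.0, 5)
mean, var = moe_predict(x, [0.5, 0.5])
```

A variance-aware `predict_variances()` call on a mixture would return something analogous to `var` here: the experts' own variances plus the spread between their predictions.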
- Sampling method interface refactoring (#284 thanks @LDAP):
  - create an intermediate `ScaledSamplingMethod` class to serve as the base class for sampling methods that generate samples in the [0, 1] hypercube
  - allow future implementation of sampling methods that generate samples directly in the input space (i.e. within `xlimits`)
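The scaling step such a base class factors out can be sketched as follows (a minimal numpy illustration; the function name is made up, not SMT's API):

```python
import numpy as np

def scale_to_xlimits(u, xlimits):
    """Map samples u drawn in the [0, 1]^d hypercube into the input space
    defined by per-dimension [lower, upper] bounds (the xlimits)."""
    xlimits = np.asarray(xlimits)
    lower, upper = xlimits[:, 0], xlimits[:, 1]
    return lower + u * (upper - lower)

rng = np.random.default_rng(0)
u = rng.random((4, 2))                                  # samples in [0, 1]^2
x = scale_to_xlimits(u, [[-5.0, 5.0], [0.0, 10.0]])     # samples within xlimits
```

Subclasses only need to produce `u`; the base class owns the mapping into `xlimits`, which is what makes room for future methods that sample the input space directly.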
- Use of the Gower distance in kriging-based mixed-integer surrogates (#289 thanks @raul-rufato):
  - add the `use_gower_distance` option to `MixedIntegerSurrogate`
  - add a `gower` correlation model to kriging-based surrogates
  - see the MixedInteger notebook for usage
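The idea behind the Gower distance is easy to state: continuous components contribute a range-normalized absolute difference, categorical components a 0/1 mismatch, and the distance averages the components. A small numpy sketch (illustrative only, not SMT's implementation):

```python
import numpy as np

def gower_distance(a, b, is_cat, ranges):
    """Gower distance between two mixed samples.
    a, b: 1-D arrays (categoricals encoded as numbers);
    is_cat: boolean mask of categorical dimensions;
    ranges: value span of each continuous dimension, in order."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    cat = np.asarray(is_cat)
    d = np.empty(a.shape[0])
    d[cat] = (a[cat] != b[cat]).astype(float)                  # 0/1 mismatch
    d[~cat] = np.abs(a[~cat] - b[~cat]) / np.asarray(ranges)   # normalized diff
    return d.mean()

# two points: x0 continuous in [0, 10], x1 categorical (integer-encoded)
dist = gower_distance([2.0, 0], [7.0, 1], is_cat=[False, True], ranges=[10.0])
```

Using this as the distance inside a kriging correlation model gives a kernel that treats integer/categorical dimensions sensibly instead of interpolating between category codes.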
- Improve kriging-based surrogates with a multistart method (#293 thanks @Paul-Saves):
  - run several hyperparameter optimizations and keep the best result
  - the number of optimizations is controlled by the new `n_start` option (default 10)
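The multistart pattern behind `n_start` can be sketched like this (a toy numpy example with a made-up multimodal objective and a crude local optimizer, not SMT's actual likelihood optimization):

```python
import numpy as np

def local_opt(f, x0, lr=0.02, steps=400):
    """Crude gradient descent with numerical gradients (illustrative only)."""
    x = float(x0)
    for _ in range(steps):
        g = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6
        x -= lr * g
    return x, f(x)

def multistart(f, bounds, n_start=10, seed=0):
    """Run the same local optimization from n_start random starting points
    and keep the best result -- the role the n_start option plays."""
    rng = np.random.default_rng(seed)
    starts = rng.uniform(bounds[0], bounds[1], n_start)
    return min((local_opt(f, x0) for x0 in starts), key=lambda r: r[1])

f = lambda x: np.sin(3 * x) + 0.1 * x ** 2    # several local minima
x_best, f_best = multistart(f, bounds=(-3.0, 3.0), n_start=10)
```

A single start can get trapped in a poor local optimum of the hyperparameter likelihood; restarting from several points and keeping the best trades extra compute for robustness, which is why the count is exposed as an option.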
- Update documentation for MOE and SamplingMethod (#285)
- Fixes (#279, #281)