Releases: SMTorg/smt
1.0.0
It is a good time to release SMT 1.0 (just after 0.9!).
The SMT architecture has proven useful and resilient since the 0.2 version presented in the article (more additions than actual breaking changes since then). Special thanks to @bouhlelma and @hwangjt, and thanks to all contributors.
This is a smooth transition from SMT 0.9, with small additions and bug fixes:
- Add `random_state` option to `NestedLHS` for result reproducibility (#296, thanks @anfelopera)
- Add `use_gower_distance` option to EGO to use the Gower distance kernel instead of continuous relaxation in the presence of mixed integer variables (#299, thanks @Paul-Saves)
- Fix kriging-based bug to allow `n_start=1` (#301)
- Work around PLS changes in scikit-learn 0.24 which impact the KPLS surrogate model family (#306)
- Add documentation about saving and loading surrogate models (#308)
0.9.0
- Mixture of Experts improvements (#282 thanks @jbussemaker, #283)
  - add variance prediction API (i.e. `predict_variances()`), enabled when the `variances_support` option is set
  - add `MOESurrogateModel` class which adapts `MOE` to the `SurrogateModel` interface
  - allow selection of the experts to be part of the mixture (see `allow`/`deny` options): `MOE.AVAILABLE_EXPERTS` lists all possible experts; the `enabled_experts` property of an MOE instance lists the enabled experts wrt `derivatives`/`variances_support` and `allow`/`deny` options
- Sampling method interface refactoring (#284 thanks @LDAP)
  - create an intermediate `ScaledSamplingMethod` class as the base class for sampling methods which generate samples in the [0, 1] hypercube
  - allow future implementation of sampling methods generating samples directly in the input space (i.e. within xlimits)
- Use of Gower distance in kriging-based mixed integer surrogates (#289, thanks @raul-rufato)
  - add `use_gower_distance` option to `MixedIntegerSurrogate`
  - add `gower` correlation model to kriging-based surrogates; see the MixedInteger notebook for usage
- Improve kriging-based surrogates with a multistart method (#293, thanks @Paul-Saves)
  - run several hyperparameter optimizations and keep the best result
  - the number of optimizations is controlled by the new `n_start` option (default 10)
- Update documentation for MOE and SamplingMethod (#285)
- Fixes (#279, #281)
0.8.0
- Noise API changes for kriging-based surrogates (#276, #257 thanks @anfelopera):
  - add a new tutorial notebook on how to deal with noise in SMT
  - rename the `noise` option as `noise0`, which is now a list of values
  - add `use_het_noise` option to manage heteroscedastic noise
  - improve noise management for MFK (different noise by level)
  - add `nugget` option to enable the handling of numerical instability
  - add Matern kernel documentation
- Add `predict_variance_derivatives` API (#256, #259 thanks @Paul-Saves)
  - add spatial derivatives for kriging-based surrogates
  - fix respect of parameter bounds in kriging-based surrogates
- Notebook updates (#262, #275 thanks @NatOnera, #277 thanks @Paul-Saves)
- Kriging-based surrogates refactoring (#261 thanks @anfelopera)
  - inheritance changes: MFKPLS -> MFK; KPLSK, GEKPLS -> KPLS
  - improve noise options consistency
  - improve options validity checking
- Code quality (#264, #267, #268 thanks @LDAP):
- use of abc metaclass to enforce developer API
- type hinting
- add 'build system' specification and requirements.txt for tests, setup cleanup
0.7.1
- allow noise evaluation for kriging-based surrogates (#251)
- fix optimizer bounds in kriging-based surrogates (#252)
- fix MFK parameterization by level (#252)
- add `random_state` option to LHS sampling method for test repeatability (#253)
- add `random_state` option to EGO application for test repeatability (#255)
- cleanup tests (#255)
Marginal Gaussian Process surrogate model
- add Marginal Gaussian Process surrogate model (#236, thanks @repriem)
- add Matern kernels for kriging-based surrogates (#236, thanks @repriem)
- add gradient-based hyperparameter optimization for kriging-based surrogates: the new `hyper_opt` option selects the TNC SciPy gradient-based optimizer; the gradient-free Cobyla optimizer remains the default (#236, thanks @repriem)
- add `MixedIntegerContext` documentation (#234)
- fix bug in `mixed_integer::unfold_with_enum_mask` (#233)
Mixed Integer Sampling Method and Surrogate
- Application: Mixed integer sampling methods and surrogates (#229)
- handling of categorical and integer variables in Kriging (#219, thanks @Paul-Saves)
- handling of categorical and integer variables in EGO optimizer (#220, thanks @Paul-Saves)
- remove initial doe returned value from EGO optimize method (#224)
- drop Python 2.7 (#215, #227)
- fix MFK variance computation (#211)
- fix MOE experts selection (#223)
- fix MOE RMTS usage (#225)
- fix QP as used in run_examples (#226)
MFKPLSK bug fix
- fix bug when `eval_noise` is `True`
Fix packaging bug
- add `packaging` dependency in setup
MFKPLS bug fix
- fix bug in MFKPLS when `eval_noise` and `optim_var` are set to `True`
Applications: MFKPLS, MFKPLSK and parallel EGO
- add MFKPLS application (#193, thanks @m-meliani)
- add MFKPLSK application (#198, thanks @m-meliani)
- add parallel EGO with qEI criterion (#202, #199, #190, thanks @EmileRouxSMB and @rruusu)
- add notebook tutorial for the EGO method
- fix kriging-based surrogate bug (#200)
- fix full factorial sampling weights and clip options (#197)
- fix sklearn 0.22 cross_decomposition warning (#196)
- next releases > 0.5.x will likely drop Python 2.7 support