Tools for diagnostics and assessment of (machine learning) models
Highlights:
- All common point predictions covered: mean, median, quantiles, expectiles.
- Assess model calibration with identification functions (generalized residuals) via compute_bias and compute_marginal; see the sketch after this list.
- Assess calibration and bias graphically
  - reliability diagrams for auto-calibration
  - bias plots for conditional calibration
  - marginal plots for the average of y_obs and y_pred as well as partial dependence, for a single feature
- Assess the predictive performance of models
  - strictly consistent, homogeneous scoring functions
  - score decomposition into miscalibration, discrimination and uncertainty
- Choose your plot backend, either matplotlib or plotly, e.g., via set_config.
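
As a small taste of the calibration tools above, here is a minimal sketch. It assumes the compute_bias and plot_reliability_diagram entry points of model_diagnostics.calibration accept y_obs/y_pred keyword arguments as documented; the toy data and model are just placeholders.

```python
# Minimal sketch, not an official quickstart: assumes compute_bias and
# plot_reliability_diagram from model_diagnostics.calibration take
# y_obs/y_pred (and an optional feature) as keyword arguments.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression

from model_diagnostics.calibration import compute_bias, plot_reliability_diagram

X, y = make_regression(n_samples=500, n_features=3, noise=10.0, random_state=0)
model = LinearRegression().fit(X, y)
y_pred = model.predict(X)

# Tabular bias statistics: the identification function (generalized residual)
# averaged within bins of one feature.
print(compute_bias(y_obs=y, y_pred=y_pred, feature=X[:, 0]))

# Graphical check of auto-calibration via a reliability diagram.
plot_reliability_diagram(y_obs=y, y_pred=y_pred)
```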
🚀 To our knowledge, this is the first Python package to offer reliability diagrams for quantiles and expectiles as well as a score decomposition, both made available by an internal implementation of isotonic quantile/expectile regression. 🚀
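
For instance, a score decomposition could look like the following sketch. The SquaredError and decompose names from model_diagnostics.scoring and the plot_backend argument of set_config are assumptions based on the documentation, so check them against your installed version.

```python
# Minimal sketch: assumes model_diagnostics.scoring exposes SquaredError and
# decompose, and that set_config accepts a plot_backend argument.
import numpy as np

from model_diagnostics import set_config
from model_diagnostics.scoring import SquaredError, decompose

set_config(plot_backend="plotly")  # switch future plots from matplotlib to plotly

rng = np.random.default_rng(0)
y_obs = rng.normal(loc=10.0, scale=2.0, size=1000)
y_pred = y_obs + rng.normal(scale=1.0, size=1000)  # noisy stand-in for model output

# Mean squared error of the predictions ...
print(SquaredError()(y_obs=y_obs, y_pred=y_pred))
# ... and its decomposition into miscalibration, discrimination and uncertainty.
print(decompose(y_obs=y_obs, y_pred=y_pred, scoring_function=SquaredError()))
```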
Read more in the documentation.
This package relies on the giant shoulders of, among others, polars, matplotlib, scipy and scikit-learn.
Installation
```
pip install model-diagnostics
```
Contributions
Contributions are warmly welcome! When contributing, you agree that your contributions will be subject to the MIT License.