How to execute the metrics on regression problems instead of classification? #159
Comments
I have the same question. The faithfulness and monotonicity metrics are based on a classification model (they use the classification model's predict_proba). Is there a way to use them on a regression model instead?
In regression problems, the goal is to predict a continuous numeric output rather than a categorical label as in classification. The evaluation metrics for regression therefore differ from those used in classification. Some common metrics for evaluating regression models are:

- Mean Absolute Error (MAE): ( MAE = \frac{1}{n} \sum_{i=1}^{n} |y_i - \hat{y}_i| )
- Mean Squared Error (MSE): ( MSE = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 )
- Root Mean Squared Error (RMSE): ( RMSE = \sqrt{MSE} )
- Coefficient of Determination (R²): ( R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2} )
- Mean Absolute Percentage Error (MAPE): ( MAPE = \frac{1}{n} \sum_{i=1}^{n} \left|\frac{y_i - \hat{y}_i}{y_i}\right| \times 100 )
- Adjusted R²: ( \bar{R}^2 = 1 - \frac{(1-R^2)(n-1)}{n-p-1} )
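For reference, a minimal sketch of computing these metrics with scikit-learn and NumPy; the `y_true`/`y_pred` values and the number of predictors `p` are placeholders, not values from this project:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Placeholder targets and predictions from some fitted regression model
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

mae = mean_absolute_error(y_true, y_pred)
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
r2 = r2_score(y_true, y_pred)
mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100  # undefined if any y_true is 0

n, p = len(y_true), 2  # p = number of predictors (assumed here for illustration)
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(mae, mse, rmse, r2, mape, adj_r2)
```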
The examples given for the metrics are for classification, not regression. Is there any example of implementing faithfulness and monotonicity on a regression model?
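One possible adaptation, sketched below rather than taken from the library, is to replace the classifier's predict_proba with the regressor's predict and work with changes in the predicted value directly. The function names and the `base` (baseline feature values) argument are hypothetical, not part of the package's API:

```python
import numpy as np

def faithfulness_regression(model, x, coefs, base):
    """Regression analogue of a faithfulness metric (sketch).

    Replace each feature with its baseline value, record how much the
    predicted value changes, and correlate those changes with the
    per-feature attributions (coefs)."""
    pred_full = model.predict(x.reshape(1, -1))[0]
    deltas = np.zeros(len(x))
    for i in range(len(x)):
        x_masked = x.copy()
        x_masked[i] = base[i]  # "remove" feature i by replacing it with its baseline
        deltas[i] = pred_full - model.predict(x_masked.reshape(1, -1))[0]
    return np.corrcoef(coefs, deltas)[0, 1]

def monotonicity_regression(model, x, coefs, base):
    """Regression analogue of a monotonicity metric (sketch).

    Starting from the baseline, restore features in order of increasing
    |attribution| and check that the prediction error relative to the
    full-input prediction never increases."""
    pred_full = model.predict(x.reshape(1, -1))[0]
    x_masked = base.copy().astype(float)
    errors = []
    for i in np.argsort(np.abs(coefs)):  # least to most important feature first
        x_masked[i] = x[i]
        errors.append(abs(pred_full - model.predict(x_masked.reshape(1, -1))[0]))
    return bool(np.all(np.diff(errors) <= 0))
```

Here `model` would be a fitted regressor with a scikit-learn-style predict, `x` a single 1-D instance, `coefs` the per-feature attributions from an explainer such as LIME, and `base` a vector of baseline feature values (for example, training-set means or zeros). How closely this mirrors the classification versions depends on how "removing" a feature is defined, so treat it as a starting point rather than a drop-in replacement.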