
Fix confusion matrices labeling in skills/classification/guide.ipynb #62

Open · wants to merge 1 commit into base: main

Conversation

docmparker

This fixes the confusion matrices in the classification guide notebook. While going through the notebook, I noticed that none of the confusion matrices match the tabular results from the corresponding classification reports (precision, recall, etc.). This is easy to see in the current online version by comparing the rows of each classification report with the rows of its confusion matrix. The values in the classification report are correct; in each case the confusion matrix values are correct too, but the rows/columns are mislabeled because of inconsistent label ordering. The fix was simple: in the evaluate function, I pass the labels explicitly to both classification_report and confusion_matrix so the two use the same label order. I then re-ran all cells that produce confusion matrices so their output displays correctly.
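A minimal sketch of the fix described above, assuming an `evaluate` helper like the one in the notebook (the function signature and the example labels here are hypothetical): passing the same `labels` list to both `classification_report` and `confusion_matrix` guarantees the two use identical row ordering.

```python
from sklearn.metrics import classification_report, confusion_matrix

def evaluate(y_true, y_pred, labels):
    # Passing `labels` to both calls keeps them consistent:
    # row i of the report and row i of the matrix refer to labels[i].
    report = classification_report(y_true, y_pred, labels=labels, zero_division=0)
    cm = confusion_matrix(y_true, y_pred, labels=labels)
    return report, cm

# Hypothetical example data to illustrate the label-order guarantee.
labels = ["negative", "neutral", "positive"]
y_true = ["positive", "neutral", "negative", "positive"]
y_pred = ["positive", "negative", "negative", "neutral"]
report, cm = evaluate(y_true, y_pred, labels)
# cm[i][j] counts samples whose true label is labels[i]
# and predicted label is labels[j], regardless of which
# labels happen to appear in y_true or y_pred.
```

Without the explicit `labels` argument, each function infers the label set and order from the data it sees, which is how row/column labels can silently fall out of sync.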
