Fix Classification Interpretation #3563
Conversation
Many thanks - just one request
nbs/20_interpret.ipynb
Outdated
@@ -8,7 +8,7 @@
 "source": [
 "#hide\n",
 "#skip\n",
-"! [ -e /content ] && pip install -Uqq fastai # upgrade fastai on colab"
+"# ! [ -e /content ] && pip install -Uqq fastai # upgrade fastai on colab"
Could you please uncomment this? It's needed to work on colab
Done
Great.
@warner-benjamin With the changes made here, one would be re-running inference on each call that plots the confusion matrix or fetches the
@rsomani95 It's on my todo list to add back a cached option.
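A cached option along these lines could be sketched with `functools.cached_property`, so inference runs once and subsequent plotting or metric calls reuse the stored result. This is only an illustration of the caching pattern, not fastai's actual implementation; `run_inference` is a hypothetical stand-in for something like `Learner.get_preds`.

```python
from functools import cached_property

class CachedInterpretation:
    """Minimal sketch of caching decoded predictions so repeated calls
    (plotting a confusion matrix, then fetching losses) don't re-run
    inference. `run_inference` is a hypothetical stand-in for the real
    prediction routine."""

    def __init__(self, run_inference):
        self._run_inference = run_inference

    @cached_property
    def preds(self):
        # Inference happens once; the result is memoized on the instance.
        return self._run_inference()

# Count how many times inference actually runs.
calls = []
interp = CachedInterpretation(lambda: calls.append(1) or [0.9, 0.1])
_ = interp.preds
_ = interp.preds
print(len(calls))  # inference ran only once
```

A design note: `cached_property` stores the result in the instance `__dict__`, so deleting the attribute (`del interp.preds`) would force a fresh inference pass, which is one simple way to expose an opt-in refresh.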
Resolves #3562 and adds a simple test to make sure ClassificationInterpretation doesn't fall through the cracks again. The implementation does generate the entire dataset's worth of predictions, targets, and decoded predictions, which is required for the confusion matrix. It might be possible to make this more memory efficient, but the labels and predictions for classification problems are not usually that large.