[WIP] Explain PyTorch neural nets with Grad-CAM #327

teabolt commented Jul 29, 2019

This PR adds support for explaining image and text classifiers built in PyTorch using the Grad-CAM method, building on #315 and #325.

Image example:
Using the pretrained mobilenet_v2 network from torchvision and calling eli5.show_prediction(model, doc, image=img), we get the classic explanation for 'dog':

[screenshot omitted: Grad-CAM heatmap for 'dog' overlaid on the input image]
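
For reference, a minimal sketch of what the image call above might look like end to end. The eli5.show_prediction signature is the one proposed in this PR; the input file 'dog.jpg' and the preprocessing recipe (the standard torchvision ImageNet transforms) are assumptions, not part of the PR itself.

```python
import torch
import torchvision
from torchvision import transforms
from PIL import Image

import eli5

# Pretrained classifier from torchvision, in inference mode.
model = torchvision.models.mobilenet_v2(pretrained=True)
model.eval()

img = Image.open('dog.jpg')  # hypothetical input image

# Standard ImageNet preprocessing for torchvision models.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
doc = preprocess(img).unsqueeze(0)  # add the batch dimension

# Grad-CAM explanation over the original image, per the API proposed here.
eli5.show_prediction(model, doc, image=img)
```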

Text example:
Using an example model from https://www.kaggle.com/ziliwang/pytorch-text-cnn for an insincere-question classification task (https://www.kaggle.com/c/quora-insincere-questions-classification/overview), we can call eli5.show_prediction(model, doc, tokens=tokens, layer=layer, relu=False) to get an explanation like this (green = 'insincere', red = 'neutral'):

[screenshot omitted: token-level Grad-CAM highlighting over the question text]
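
A self-contained sketch of the text call, using a toy stand-in for the Kaggle text CNN so the snippet runs on its own. The TextCNN class, token ids, and choice of model.conv as the target layer are all illustrative assumptions; only the show_prediction signature comes from this PR.

```python
import torch
import torch.nn as nn

import eli5

# Toy stand-in for the Kaggle text CNN: embedding -> 1D conv -> pooled logits.
class TextCNN(nn.Module):
    def __init__(self, vocab_size=100, embed_dim=16, n_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, 32, kernel_size=3, padding=1)
        self.fc = nn.Linear(32, n_classes)

    def forward(self, x):
        x = self.embedding(x).transpose(1, 2)  # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))           # (batch, 32, seq_len)
        x = x.max(dim=2).values                # global max pooling over tokens
        return self.fc(x)

model = TextCNN()
model.eval()

tokens = ['why', 'are', 'these', 'questions', 'so', 'insincere']
doc = torch.tensor([[3, 7, 12, 25, 9, 41]])  # hypothetical token ids, batch of one

# Target the convolutional layer; relu=False keeps negative contributions
# so the red ('neutral') evidence is shown alongside the green ('insincere').
eli5.show_prediction(model, doc, tokens=tokens, layer=model.conv, relu=False)
```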

This PR only provides basic PyTorch support.

TODO items:

  • Consistency with Keras support and features.
  • Image tutorial.
  • Text tutorial (need to scale model down).
  • Image integration tests.
  • Text integration tests.
  • Unit tests.
  • Docstrings and docs.
  • CI and coverage.
  • Reviews.