
[BUG] Colab demo for saliency maps: Gradient images are blank #38

Open · drscotthawley opened this issue Sep 29, 2020 · 2 comments
Labels: bug (Something isn't working)

drscotthawley commented Sep 29, 2020

Thanks so much for sharing your work and making it so easy... in theory... to use. However, I haven't been able to get it to work by running your Colab example(s).

Describe the bug
The two middle images involving gradients for each example, "Gradients across RGB channels" and "Max Gradients", appear as a uniform color: the RGB one is all grey and the Max one is all purple.

To Reproduce
Steps to reproduce the behavior:

  1. Go to the README page for this repo, scroll down to "Want to try?"
  2. Next to "Saliency maps:", click on the "Open in Colab" badge.
  3. Save a copy of the notebook to Drive (or not; this step is not essential to produce the error)
  4. Edit > Notebook Settings, select GPU
  5. Runtime > Restart and run all
  6. Scroll down and observe that only the leftmost and rightmost images for the owl, peacock, and toucan are visible; the middle two images are "blank".
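
For reference, here is a minimal sketch of the kind of code the saliency notebook runs, assuming the Backprop API shown in the FlashTorch README (the exact notebook cells may differ slightly):

```python
import torchvision.models as models
from flashtorch.utils import apply_transforms, load_image
from flashtorch.saliency import Backprop

# Load a pretrained AlexNet and wrap it with FlashTorch's Backprop helper.
model = models.alexnet(pretrained=True)
backprop = Backprop(model)

# 'owl.jpg' stands in for whichever example image the notebook loads.
image = load_image('owl.jpg')
input_ = apply_transforms(image)

# 24 is the ImageNet class index used in the great grey owl example.
target_class = 24

# Renders the original image, the RGB/max gradient maps, and the overlay;
# on a Colab GPU runtime the two gradient panels come out blank.
backprop.visualize(input_, target_class, guided=True)
```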

Expected behavior
Gradient images should show content, such as colored pixels around the owl's eyes as in your Medium post. That is not what running the Colab demo gives me. See my sample screenshot below.

Screenshots
[Image attached: Screenshot from 2020-09-29 00-20-16]

Environment (please complete the following information):
Colab (whatever OS that's running).
Brand-new installation of FlashTorch via the !pip install flashtorch in the notebook; looks like 0.1.3.
Torch looks like 1.6.

Additional context
From pip (after I re-ran it a second time):

Collecting flashtorch
  Downloading https://files.pythonhosted.org/packages/de/cb/482274e95812c9a17bd156956bef80a8e2683a2b198a505fb922f1c01a71/flashtorch-0.1.3.tar.gz
Requirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from flashtorch) (3.2.2)
Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from flashtorch) (1.18.5)
Requirement already satisfied: Pillow in /usr/local/lib/python3.6/dist-packages (from flashtorch) (7.0.0)
Requirement already satisfied: torch in /usr/local/lib/python3.6/dist-packages (from flashtorch) (1.6.0+cu101)
Requirement already satisfied: torchvision in /usr/local/lib/python3.6/dist-packages (from flashtorch) (0.7.0+cu101)
Collecting importlib_resources
  Downloading https://files.pythonhosted.org/packages/ba/03/0f9595c0c2ef12590877f3c47e5f579759ce5caf817f8256d5dcbd8a1177/importlib_resources-3.0.0-py2.py3-none-any.whl
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->flashtorch) (0.10.0)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->flashtorch) (2.4.7)
Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->flashtorch) (2.8.1)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->flashtorch) (1.2.0)
Requirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch->flashtorch) (0.16.0)
Requirement already satisfied: zipp>=0.4; python_version < "3.8" in /usr/local/lib/python3.6/dist-packages (from importlib_resources->flashtorch) (3.1.0)
Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from cycler>=0.10->matplotlib->flashtorch) (1.15.0)
Building wheels for collected packages: flashtorch
  Building wheel for flashtorch (setup.py) ... done
  Created wheel for flashtorch: filename=flashtorch-0.1.3-cp36-none-any.whl size=26248 sha256=95cdabc7c3dbc87e25d0795eb47d01971668600bdc58cd558b670c5e7ce0b725
  Stored in directory: /root/.cache/pip/wheels/03/6d/b1/2d3c5987b69e900fcceceeef39d3ed92dfe46ba1359b9c79f8
Successfully built flashtorch
Installing collected packages: importlib-resources, flashtorch
Successfully installed flashtorch-0.1.3 importlib-resources-3.0.0
drscotthawley added the bug (Something isn't working) label on Sep 29, 2020

drscotthawley commented Sep 29, 2020

Perhaps something important has changed on Colab since these examples were created: the other Colab demo, on activation maximization, also fails, although in that case it raises a RuntimeError (RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!).

These problems only seem to affect the Colab notebooks. When I cloned the repo and ran it locally using the non-Colab notebooks, I was able to see the saliency maps OK.
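
For what it's worth, that RuntimeError is the generic PyTorch symptom of a model that has been moved to cuda:0 receiving a tensor that is still on the CPU. A minimal illustration of how it arises and the usual fix (this is a generic sketch, not FlashTorch's actual code):

```python
import torch
import torchvision.models as models

model = models.alexnet(pretrained=True)
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model.to(device)                 # parameters now live on cuda:0 (if available)

x = torch.randn(1, 3, 224, 224)  # created on the CPU by default
# model(x)                       # on a GPU runtime this raises the "same device" RuntimeError

x = x.to(device)                 # usual fix: move the input to the model's device
out = model(x)                   # both tensors are now on the same device
```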


mpierrau commented Oct 9, 2020

I tried debugging a bit, but only managed to figure out that the gradient returned from AlexNet during backpropagation does not have the expected dimensions, so self.gradients is never updated because it fails the if statement in _record_gradients:

grad_in[0].shape: torch.Size([1, 64, 55, 55])
self.gradients.shape: torch.Size([1, 3, 224, 224])

When running with the CPU or TPU runtime on Colab, it works fine!
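
For anyone else digging into this, here is a simplified sketch of the guarded backward hook being described; it is an illustration of the failure mode, not FlashTorch's exact implementation (the hook target and variable names are assumptions):

```python
import torch
import torchvision.models as models

model = models.alexnet(pretrained=True)
input_ = torch.randn(1, 3, 224, 224, requires_grad=True)
gradients = torch.zeros(input_.shape)   # placeholder with the expected shape [1, 3, 224, 224]

def record_gradients(module, grad_in, grad_out):
    # Simplified version of the guard described above: the gradient is only
    # stored if its shape matches the expected input shape.
    global gradients
    if grad_in[0] is not None and grad_in[0].shape == gradients.shape:
        gradients = grad_in[0]

# The hook sits on the first conv layer. On a Colab GPU runtime grad_in[0]
# apparently arrives with the conv output's shape ([1, 64, 55, 55]), so the
# guard silently skips it and the saliency map stays all zeros (a blank image).
model.features[0].register_backward_hook(record_gradients)

output = model(input_)
output[0, output.argmax()].backward()
print(gradients.abs().max())            # 0.0 whenever the shape check never passes
```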
