
Utils to show modulename with its repr(); Add Linear weighted activations as objective; Add pretrained GAN as parametrization #5

Merged
4 commits merged into greentfrapp:master on Jun 22, 2020

Conversation

Animadversio
Contributor

Dear author,

Thanks so much for implementing Lucid in PyTorch! I have really enjoyed using it in my projects, where I leverage deep neural networks to understand real neurons in visual cortices. In my usage, I want to activate multiple channels together to match the selectivity of a biological neuron or of units in another network. This can be achieved by summing the existing channel or neuron objectives, but that becomes very inefficient during backpropagation.

So here are my two cents. In this commit I:

  • Add linearly weighted activations of a channel, a neuron, or a neuron group as objectives, implemented with tensor operations (a sketch of the idea follows this list).
  • Add a function in util.py to output the module names, to ease usage with custom networks.
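For illustration, here is a minimal sketch of what such a weighted objective could look like as a plain closure. The name weighted_channel and its signature are placeholders, not the exact functions in this commit; in Lucent it would presumably be registered through the objectives module's wrap_objective decorator.

import torch

def weighted_channel(layer, weights):
    """Sum of per-channel mean activations, weighted by a 1-D `weights` tensor."""
    weights = torch.as_tensor(weights, dtype=torch.float32)

    def inner(model):
        acts = model(layer)  # NCHW activations captured by Lucent's forward hooks
        # a single tensor operation replaces summing many separate channel
        # objectives, so one backward pass covers all weighted channels
        return -(acts.mean(dim=(0, 2, 3)) * weights.to(acts.device)).sum()

    return inner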

@greentfrapp
Owner

@Animadversio thanks for the contribution! And I'm glad Lucent helped with your work.

The linearly weighted activations part looks good. Can you add some tests in tests/optvis/test_objectives.py for the new objective functions? You can run the tests with:

$ coverage run --source . --omit setup.py -m pytest
$ coverage report -m
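For example, a self-contained test along these lines could compare the weighted objective against a manual sum. It uses the hypothetical weighted_channel sketch above, not the exact functions added in this PR, and a fake hook lookup in place of a real model.

import torch

def test_weighted_channel_matches_manual_sum():
    acts = torch.randn(1, 4, 8, 8)                # fake NCHW activations
    weights = torch.tensor([0.5, -1.0, 2.0, 0.0])
    fake_model = lambda layer: acts               # stands in for Lucent's hook lookup

    loss = weighted_channel("mixed4a", weights)(fake_model)
    expected = -(acts.mean(dim=(0, 2, 3)) * weights).sum()
    assert torch.allclose(loss, expected)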

Regarding the new lucent_layernames function in util.py, it already exists as get_model_layers in lucent/modelzoo/util.py:

def get_model_layers(model):

Although I just realized that I didn't add the following snippet!

if layer is None:
    # e.g. GoogLeNet's aux1 and aux2 layers
    continue

Can you make this change in the modelzoo/util.py file instead? Thanks!!!
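For context, with that guard in place the function roughly looks like the sketch below. This is an approximation of the idea, not the exact code in lucent/modelzoo/util.py, and the get_layer_repr flag is illustrative.

from collections import OrderedDict

def get_model_layers(model, get_layer_repr=False):
    """List layer names by walking model._modules; optionally map names to repr()."""
    layers = OrderedDict() if get_layer_repr else []

    def walk(net, prefix=[]):
        if hasattr(net, "_modules"):
            for name, layer in net._modules.items():
                if layer is None:
                    # e.g. GoogLeNet's aux1 and aux2 layers
                    continue
                key = "_".join(prefix + [name])
                if get_layer_repr:
                    layers[key] = repr(layer)
                else:
                    layers.append(key)
                walk(layer, prefix + [name])

    walk(model)
    return layers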

@Animadversio
Contributor Author

Sure! Thanks for telling me how to run the tests! I haven't tried that before.

@Animadversio
Contributor Author

Animadversio commented Jun 17, 2020

@greentfrapp Hi! I have added tests for the new objectives.
Besides, I have added a new kind of parametrization: using a pre-trained GAN as a prior to visualize features. I have added tests for these new functionalities as well!

The method is inspired by:
Nguyen, A., Dosovitskiy, A., Yosinski, J., Brox, T., & Clune, J. (2016). Synthesizing the preferred inputs for neurons in neural networks via deep generator networks. NIPS.
I have translated it into PyTorch and host the weights of the pre-trained GANs in my personal storage space.

The GANs were originally shared at https://lmb.informatik.uni-freiburg.de/people/dosovits/code.html
The results are quite impressive for higher-level concept neurons. Below are samples using the fc8, fc7, and fc6 GANs as parametrizations to visualize the Lipstick neuron in VGG.

[Sample visualizations of VGG16 fc channel 629 (Lipstick) with the fc8, fc7, and fc6 GAN parametrizations]
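As a rough illustration of the idea (the name gan_image, the latent size, and the assumed output range are placeholders, not the exact parametrization in this commit), the GAN prior amounts to a param_f that optimizes only the latent code of a frozen generator:

import torch

def gan_image(generator, latent_dim=4096, batch=1, device="cuda"):
    """Parametrize the optimized image as the output of a frozen, pre-trained generator."""
    z = torch.randn(batch, latent_dim, device=device, requires_grad=True)
    generator = generator.to(device).eval()
    for p in generator.parameters():
        p.requires_grad_(False)  # only the latent code is optimized

    def image_f():
        return generator(z)  # assumed to return an NCHW image in [0, 1]

    return [z], image_f

Assuming Lucent's convention that param_f() returns (params, image_f), this could be passed to render.render_vis as param_f=lambda: gan_image(G), with objectives and transforms unchanged.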

@Animadversio changed the title from "Utils to show modulenames, and add Linear weighted activations as objective." to "Utils to show modulenames, add Linear weighted activations as objective; Add pretrained GAN as parametrization" on Jun 17, 2020
@Animadversio changed the title to "Utils to show modulename with its repr(); Add Linear weighted activations as objective; Add pretrained GAN as parametrization" on Jun 17, 2020
@greentfrapp
Owner

Thanks @Animadversio! The GAN-as-prior work looks interesting, and intuitively it should work better for higher-level layers than for lower-level ones, given how the GAN is trained. I will merge this for now, but I think it would be helpful if you could also add a notebook demonstrating the use of the GAN prior!

@greentfrapp greentfrapp merged commit a8f0703 into greentfrapp:master Jun 22, 2020