How to visualize the attention map #21

Open
its-dron opened this issue Apr 17, 2017 · 1 comment

@its-dron

I am attempting to visualize results, which is mostly handled by main.visualize(). However, the code that retrieves the attention map has been commented out and replaced with an np.zeros placeholder.

My general question is: what is the intuition behind the commented-out code? Some specifics:

  • What is i_datum?
  • What is mod_layout_choice?
  • Why is att_blob_name created the way it is?

Understanding this would be helpful, as we are also attempting to connect an additional model to the final attention map, before the softmax activation.
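
For context, here is roughly what I expect the retrieval-plus-rendering step to look like once those pieces are understood (a sketch only; att_blob_name and i_datum are the names from the commented-out code, and the shape handling is my guess, not the repo's actual code):

    import numpy as np
    import matplotlib.pyplot as plt

    # att_blob_name / i_datum come from the commented-out code; how they
    # are constructed is exactly what this issue is asking about.
    att_data = model.apollo_net.blobs[att_blob_name].data[i_datum, ...]

    # Assumption: the blob holds a spatial attention map that squeezes
    # down to (H, W); adjust if the actual layout differs.
    att_map = np.asarray(att_data).squeeze()
    plt.imshow(att_map, cmap="viridis")
    plt.colorbar()
    plt.savefig("attention_map.png")
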
Thanks.

@ShangxuanWu

ShangxuanWu commented Apr 18, 2017

Same question here. When I uncomment the visualization lines, it fails with this error:

    Traceback (most recent call last):
      File "main.py", line 260, in <module>
        main()
      File "main.py", line 36, in main
        do_iter(task.val, model, config, vis=True)
      File "main.py", line 108, in do_iter
        visualize(batch_data, model)
      File "main.py", line 227, in visualize
        att_data = model.apollo_net.blobs[att_blob_name].data[i_datum,...]
    KeyError: 'Find_101_softmax'

I have no idea where numbers such as "101" come from. It seems that "Find_101_softmax" is not in the network's blob list.
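
As a first debugging step, I am printing the blob names that actually exist, to see what the Find module's softmax blob is really called (a rough sketch; assuming model.apollo_net.blobs is dict-like, as the KeyError above implies):

    # Print every softmax blob actually registered in the network, since
    # the hard-coded 'Find_101_softmax' key is evidently stale.
    softmax_blobs = [name for name in model.apollo_net.blobs
                     if "softmax" in name]
    print(softmax_blobs)
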
