
Question about grid attention #18

Open
XA23i opened this issue May 10, 2023 · 1 comment

XA23i commented May 10, 2023

Hi, how can I obtain a global attention map? Your demo only produces the attention map for a single grid cell.

AvadaKarrot commented

Hello, have you solved this? I found that the visualizer shows the attention map of one attention head in one transformer layer: attn_map[i] has shape (1, 12, 129, 129), grid_index is an int from 0 to 128, and inside the function in visualizer.py attention_map is a vector of shape (129,). If grid_index is 0, does that select the CLS token's attention, and can that be read as a global attention map?
I also want a global attention map but haven't found a reasonable way to get one. Any advice would be appreciated.
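Here is a minimal sketch (not this repo's official API) of two common ways to derive a "global" map from an attention tensor shaped like the one described above, (1, num_heads, 129, 129), i.e. 1 CLS token plus 128 grid tokens. The function names, and the grid_h/grid_w values (8 × 16 = 128 is just one layout that matches 128 tokens), are assumptions; adjust them to your model's actual patch grid.

```python
import numpy as np

def cls_attention_map(attn, grid_h=8, grid_w=16, head=None):
    """Row 0 of the attention matrix: how much the CLS token attends
    to every grid token. Often used as a global attention map."""
    attn = attn[0]                       # (num_heads, 129, 129)
    if head is None:
        attn = attn.mean(axis=0)         # average over heads -> (129, 129)
    else:
        attn = attn[head]                # pick one head -> (129, 129)
    cls_row = attn[0, 1:]                # drop CLS->CLS, keep CLS->grid (128,)
    return cls_row.reshape(grid_h, grid_w)

def mean_attention_map(attn, grid_h=8, grid_w=16):
    """Average attention *received* by each grid token over all queries
    and heads -- an alternative notion of a global attention map."""
    attn = attn[0].mean(axis=0)          # head-averaged -> (129, 129)
    received = attn[:, 1:].mean(axis=0)  # mean over query tokens -> (128,)
    return received.reshape(grid_h, grid_w)

# Hypothetical usage, assuming `attn_maps` holds the per-layer tensors
# collected by the visualizer:
#   heat = cls_attention_map(attn_maps[layer_idx])
#   plt.imshow(heat); plt.show()
```

So yes: grid_index 0 selects the CLS row, which is a reasonable single-layer global attention map. If you want a notion of global attention that spans all layers, attention rollout (Abnar & Zuidema, 2020) is a commonly used technique, though it is not shown here.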
