
Why multi-headed self-attention? #18

Closed
hugh920 opened this issue Apr 20, 2022 · 1 comment

hugh920 commented Apr 20, 2022

Dear author, thanks for your paper and code. However, I've had a question for a long time: why is multi-headed attention used in the RCB? Could we drop the multi-head design and just use a normal (single-head) self-attention mechanism in the RCB instead? Looking forward to your reply. Thank you so much.

akshitac8 (Owner) commented

Hello @hugh920, thank you for your interest in our work. We wanted to use an attention module that helps us capture effective region-based features. We have also added results with other attention blocks in the paper to show that our proposed module works best for our setting.
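
For readers wondering what the swap suggested in the question would look like, here is a minimal PyTorch sketch. It is not the repository's RCB code; the module name, feature dimensions, and region count are illustrative assumptions. It only shows that "normal" self-attention is the single-head special case of multi-head attention, so switching between the two is a one-parameter change:

```python
import torch
import torch.nn as nn

# Illustrative sketch only -- not the actual RCB implementation from this repo.
# Multi-head attention splits the embedding into several subspaces and attends
# in each one independently; num_heads=1 recovers plain self-attention.

class ToySelfAttention(nn.Module):
    def __init__(self, embed_dim=256, num_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, x):
        # x: (batch, num_regions, embed_dim), e.g. region-level features
        out, _ = self.attn(x, x, x)  # self-attention: query = key = value = x
        return out

regions = torch.randn(2, 49, 256)            # hypothetical region features
multi_head = ToySelfAttention(num_heads=8)   # attends in 8 subspaces of size 32
single_head = ToySelfAttention(num_heads=1)  # one attention map over all 256 dims
print(multi_head(regions).shape, single_head(regions).shape)
```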
