Dear author, thanks for your paper and code. However, I've had a question for a long time: why is multi-head attention used in the RCB? Could we drop the multi-head design and just use a normal (single-head) self-attention mechanism in the RCB instead? Looking forward to your reply. Thank you so much.
Hello @hugh920, thank you for your interest in our work. We wanted an attention module that helps us capture effective region-based features. We have also added results with other attention blocks in the paper to show that our proposed module works best for our setting.
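For anyone who wants to try the single-head variant themselves: here is a minimal sketch, assuming a PyTorch implementation (the RCB's actual internals aren't shown in this thread, so this is only an illustration, not the repo's code). With `nn.MultiheadAttention`, setting `num_heads=1` reduces multi-head attention to ordinary single-head self-attention, so the two variants differ mainly in this hyperparameter.

```python
import torch
import torch.nn as nn

embed_dim = 64                     # hypothetical feature dimension
x = torch.randn(10, 2, embed_dim)  # (sequence_len, batch, features)

# Multi-head: 8 heads, each attending to a 64/8-dim subspace
multi_head = nn.MultiheadAttention(embed_dim, num_heads=8)

# Single head: equivalent to a plain self-attention mechanism
single_head = nn.MultiheadAttention(embed_dim, num_heads=1)

out_multi, _ = multi_head(x, x, x)    # self-attention: query = key = value
out_single, _ = single_head(x, x, x)

print(out_multi.shape, out_single.shape)  # both torch.Size([10, 2, 64])
```

Swapping `num_heads` this way keeps the output shape identical, so it is an easy ablation to run; whether single-head attention captures region-based features as well is exactly what the paper's comparison with other attention blocks speaks to.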