Hi,
I have a question related to RLA module.
```
```
It seems that RLA_lang_att does not contribute much: I tried removing those lines of code, and the result stayed the same.
Moreover, with self.rla_weight = 0.1 and the attention applied only in the first layer, lang_feat_att may barely affect the output. However, the paper reports a ~1% performance improvement from it. Is there a mistake in the code, or have I misunderstood something?
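For context, a small residual weight like self.rla_weight = 0.1 typically means the language-attended features are blended back into the main features as out = feat + rla_weight * lang_feat_att, so their influence is deliberately small. The sketch below only illustrates that blending pattern; the function name `fuse` and its signature are assumptions for illustration, not the repository's actual code:

```python
# Hypothetical sketch of a weighted residual language-attention fusion.
# All names here are illustrative assumptions, not the repository's code.

def fuse(feat, lang_feat_att, rla_weight=0.1):
    """Blend language-attended features into the main features with a
    small residual weight: out = feat + rla_weight * lang_feat_att."""
    return [f + rla_weight * a for f, a in zip(feat, lang_feat_att)]

feat = [1.0, 2.0, 3.0]
lang_feat_att = [0.5, -0.5, 1.0]

# With rla_weight=0.1, each output stays close to the original feature,
# which is why removing the term can leave metrics nearly unchanged.
print(fuse(feat, lang_feat_att))
```

With a weight of 0.1 applied in only one layer, the fused output differs from the plain features by at most 10% of the attention magnitude, which is consistent with the observation that removing it changes results very little.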