Conversation
* Fix tests
* Fix typing
thanks for this, I have several comments
* Address PR feedback
This is good for another look. @joelgrus
Hi! It's been 4 months, and self-attention is a very useful model. Could someone please review the model code?
@rangwani-harsh You didn't implement the bi-directional LSTM before the self-attention layer?
@jianwolf You can use the attention module with any Seq2Seq encoder. Since a bidirectional LSTM is a Seq2Seq encoder, you can simply specify it in your model and pass its output to the Structured Self-Attentive Encoder. For an example, see https://github.com/rangwani-harsh/sentence-encoder-irony-detection/blob/71b5adf9d755e53918797fb0bfabe6acd6ffcaec/irony_model/model/model.py#L82.
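For readers who land here later, a minimal sketch of the pattern described above, written in plain PyTorch rather than the PR's actual classes: a bidirectional LSTM produces per-token outputs, which a structured self-attention layer (Lin et al., 2017) reduces to a fixed-size sentence embedding. The class and parameter names here are illustrative, not the PR's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StructuredSelfAttention(nn.Module):
    """Structured self-attention (Lin et al., 2017): A = softmax(W2 tanh(W1 H^T))."""

    def __init__(self, input_dim: int, attention_dim: int, num_heads: int):
        super().__init__()
        self.w1 = nn.Linear(input_dim, attention_dim, bias=False)
        self.w2 = nn.Linear(attention_dim, num_heads, bias=False)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, input_dim), e.g. BiLSTM outputs.
        scores = self.w2(torch.tanh(self.w1(hidden)))   # (batch, seq_len, num_heads)
        attention = F.softmax(scores, dim=1)            # normalize over the sequence
        # Weighted sum per head -> (batch, num_heads, input_dim), then flatten.
        sentence_matrix = torch.bmm(attention.transpose(1, 2), hidden)
        return sentence_matrix.flatten(start_dim=1)


# Usage: feed the BiLSTM's per-token outputs into the attention encoder.
lstm = nn.LSTM(input_size=100, hidden_size=64, batch_first=True, bidirectional=True)
encoder = StructuredSelfAttention(input_dim=128, attention_dim=32, num_heads=4)

tokens = torch.randn(8, 20, 100)        # (batch, seq_len, embedding_dim)
outputs, _ = lstm(tokens)               # (8, 20, 128) -- 2 * hidden_size
sentence_embedding = encoder(outputs)   # (8, 512) -- num_heads * 128
```

Note that the softmax runs over the sequence dimension, so each attention head yields one weighted average of the token representations; concatenating the heads gives the sentence embedding.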
@joelgrus, you want to follow up on this one?
We closed the issue associated with this as we're not sure it's necessary for it to be integrated into the library itself - feel free to push it to its own GitHub repo.
This PR contains the implementation of the Self-Attentive Sentence Encoder described in #2188.
TODO