Variational dropout in Augmented LSTM (#2344)
This pull request fixes #2320. I have completed point 2 on the list within the issue.

Beyond the points raised in the issue, I have also added more detail to the docstrings in the [Stacked Bi-Directional LSTM](https://github.com/allenai/allennlp/blob/master/allennlp/modules/stacked_bidirectional_lstm.py): the corrected shape of the `final_states` returned from the forward pass, the correct shape required for the `initial_state` argument to the forward pass, and the return type of the forward pass.

Screenshot of the changed docstring for the augmented LSTM is below. The only change is the wording of the `recurrent_dropout_probability` argument in the constructor:

![augmented doc string](https://user-images.githubusercontent.com/13574854/51051958-34cf7180-15cd-11e9-9a48-7bbb039ae504.png)

Screenshots of the changed docstrings for the Stacked Bi-Directional LSTM are below. The changes are those stated above regarding the forward pass, plus the wording of the `recurrent_dropout_probability` and `layer_dropout_probability` arguments and the descriptive text of the constructor:

![stacked constructor doc string](https://user-images.githubusercontent.com/13574854/51051945-2719ec00-15cd-11e9-9809-4a603a8e1a66.png)

![stacked forward doc string](https://user-images.githubusercontent.com/13574854/51051952-2d0fcd00-15cd-11e9-906b-7a1cf3ab7b25.png)
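For context on what `recurrent_dropout_probability` controls: variational (a.k.a. locked) dropout samples a single dropout mask per sequence and reuses it at every timestep, instead of resampling a fresh mask each step. The sketch below is a minimal, dependency-free illustration of that idea; `get_dropout_mask` and `run_with_variational_dropout` are hypothetical names for this example, not AllenNLP's exact API (AllenNLP's real implementation operates on PyTorch tensors inside the LSTM cell).

```python
import random

def get_dropout_mask(dropout_probability: float, size: int) -> list:
    """Sample one inverted-dropout mask: each unit is zeroed with
    probability p, otherwise scaled by 1 / (1 - p) so the expected
    value is unchanged."""
    keep = 1.0 - dropout_probability
    return [0.0 if random.random() < dropout_probability else 1.0 / keep
            for _ in range(size)]

def run_with_variational_dropout(hidden, timesteps, dropout_probability):
    """Toy recurrence: the SAME mask multiplies the hidden state at
    every timestep (variational dropout), so a unit dropped at step 1
    stays dropped for the whole sequence."""
    mask = get_dropout_mask(dropout_probability, len(hidden))  # sampled once per sequence
    for _ in range(timesteps):
        hidden = [h * m for h, m in zip(hidden, mask)]
    return hidden, mask
```

The key property is that the set of dropped units is fixed for the whole sequence, which regularizes the recurrent connections without injecting fresh noise at every step.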