Fix dropout=recurrent_dropout
justinxzhao committed Jun 2, 2022
1 parent 5e3f9c8 commit fc7a9c0
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion ludwig/encoders/sequence_encoders.py
@@ -1509,6 +1509,8 @@ def __init__(
:param dropout: determines if there should be a dropout layer before
returning the encoder output.
:type dropout: Boolean
+ :param recurrent_dropout: Dropout rate for the recurrent stack.
+ :type recurrent_dropout: float
:param initializer: the initializer to use. If `None` it uses
`xavier_uniform`. Options are: `constant`, `identity`,
`zeros`, `ones`, `orthogonal`, `normal`, `uniform`,
@@ -1606,7 +1608,7 @@ def __init__(
weights_initializer=weights_initializer,
recurrent_initializer=recurrent_initializer,
bias_initializer=bias_initializer,
- dropout=dropout,
+ dropout=recurrent_dropout,
)

self.reduce_output = reduce_output
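Before this fix, the recurrent stack received the encoder-level `dropout` value for its `dropout` argument, so the `recurrent_dropout` parameter was accepted but never applied. The two rates are distinct: one masks the layer's input/output activations, the other masks the hidden-to-hidden recurrence. A toy sketch of that distinction (a hypothetical minimal cell, not Ludwig's actual `RecurrentStack` API):

```python
import math
import random

def dropout_mask(size, rate, rng):
    # Inverted dropout: keep each unit with prob (1 - rate),
    # scaling kept units by 1 / (1 - rate).
    keep = 1.0 - rate
    return [(1.0 / keep) if rng.random() < keep else 0.0 for _ in range(size)]

class ToyRecurrentCell:
    """Illustrative cell: `dropout` masks the input x,
    `recurrent_dropout` masks the recurrent state h."""

    def __init__(self, dropout=0.0, recurrent_dropout=0.0, seed=0):
        self.dropout = dropout
        self.recurrent_dropout = recurrent_dropout
        self.rng = random.Random(seed)

    def step(self, x, h):
        if self.dropout > 0.0:
            mask = dropout_mask(len(x), self.dropout, self.rng)
            x = [xi * m for xi, m in zip(x, mask)]
        if self.recurrent_dropout > 0.0:
            rmask = dropout_mask(len(h), self.recurrent_dropout, self.rng)
            h = [hi * m for hi, m in zip(h, rmask)]
        # Trivial elementwise update so the masking is easy to inspect.
        return [math.tanh(xi + hi) for xi, hi in zip(x, h)]

# With both rates at 0.0 the step is deterministic: tanh(x + h).
cell = ToyRecurrentCell(dropout=0.0, recurrent_dropout=0.0)
print(cell.step([0.0, 1.0], [0.5, 0.5]))
```

Wiring `dropout` into the stack's `dropout` slot (the pre-fix behavior) would make both knobs control the same mask, which is exactly what the one-line change corrects.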
