default dropout to correct value for big transformer
alexeib committed May 17, 2018
1 parent c903798 commit e560a12
Showing 1 changed file with 1 addition and 0 deletions.
fairseq/models/transformer.py: 1 addition, 0 deletions
```diff
@@ -435,6 +435,7 @@ def transformer_vaswani_wmt_en_de_big(args):
     args.decoder_ffn_embed_dim = 4096
     args.decoder_layers = 6
     args.decoder_attention_heads = 16
+    args.dropout = 0.3


 @register_model_architecture('transformer', 'transformer_wmt_en_de_big')
```
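For context, the change works because fairseq architecture functions mutate an `args` namespace, and any field they do not set falls back to a base default. The following is a minimal self-contained sketch of that pattern, not fairseq's actual code; the registry, the base default of 0.1, and the function bodies are simplified assumptions for illustration (0.3 matches the dropout used for the big en-de model in Vaswani et al.).

```python
# Sketch of the architecture-override pattern (NOT fairseq's actual code):
# a named architecture function sets hyperparameters on an args namespace,
# then a base function fills in anything still unset.
from argparse import Namespace

ARCH_REGISTRY = {}  # hypothetical stand-in for fairseq's registry

def register_model_architecture(model_name, arch_name):
    def wrapper(fn):
        ARCH_REGISTRY[arch_name] = fn
        return fn
    return wrapper

def base_architecture(args):
    # Assumed base default; before this commit the big config would
    # silently inherit this value instead of 0.3.
    args.dropout = getattr(args, 'dropout', 0.1)

@register_model_architecture('transformer', 'transformer_vaswani_wmt_en_de_big')
def transformer_vaswani_wmt_en_de_big(args):
    args.decoder_ffn_embed_dim = 4096
    args.decoder_layers = 6
    args.decoder_attention_heads = 16
    args.dropout = 0.3  # the line this commit adds
    base_architecture(args)

args = Namespace()
ARCH_REGISTRY['transformer_vaswani_wmt_en_de_big'](args)
print(args.dropout)  # 0.3 (without the added line, the base 0.1 would apply)
```

The key design point: because `base_architecture` uses `getattr` fallbacks, an explicit assignment in the big config takes precedence, so one added line is enough to fix the default.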
