Commit 0e1794f
change the name of learning rate variable to avoid confusion
whwang299 committed May 9, 2019
1 parent 69ded82 commit 0e1794f
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion train_shallow_layer.py
@@ -153,7 +153,7 @@ def get_opt(model, model_bert, model_type):
         opt_bert = torch.optim.Adam(list(filter(lambda p: p.requires_grad, model.parameters())) \
                                     # + list(model_bert.parameters()),
                                     + list(filter(lambda p: p.requires_grad, model_bert.parameters())),
-                                    lr=args.lr, weight_decay=0)
+                                    lr=args.lr_bert, weight_decay=0)
         opt = opt_bert  # for consistency in interface
     else:
         raise NotImplementedError
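The diff above does two things: it filters both parameter lists down to those with requires_grad set (so frozen weights never reach the optimizer), and it switches the shared optimizer's learning rate from args.lr to a dedicated args.lr_bert. A closely related and common variant is to give the head and the BERT encoder separate learning rates via optimizer parameter groups. The sketch below illustrates both idioms without depending on torch; Param, trainable, and make_opt_config are illustrative stand-ins (only the requires_grad flag mirrors torch.nn.Parameter), and the two-group layout is an assumption, not the exact structure of this repository's get_opt.

```python
class Param:
    """Stand-in for torch.nn.Parameter: just a name and a requires_grad flag."""
    def __init__(self, name, requires_grad=True):
        self.name = name
        self.requires_grad = requires_grad

def trainable(params):
    # Same idiom as the diff: keep only parameters that will receive
    # gradients, so frozen (e.g. fixed BERT) weights are excluded.
    return list(filter(lambda p: p.requires_grad, params))

def make_opt_config(model_params, bert_params, lr, lr_bert):
    # Two parameter groups with distinct learning rates, analogous to
    # training the task head with args.lr and the encoder with args.lr_bert.
    return [
        {"params": trainable(model_params), "lr": lr},
        {"params": trainable(bert_params), "lr": lr_bert},
    ]

model_params = [Param("head.weight"), Param("head.bias")]
bert_params = [Param("bert.layer0", requires_grad=False), Param("bert.layer11")]

groups = make_opt_config(model_params, bert_params, lr=1e-3, lr_bert=1e-5)
print(len(groups[1]["params"]))  # frozen BERT layer is filtered out -> 1
```

With real torch, the same groups list can be passed directly to torch.optim.Adam(groups, weight_decay=0), since Adam accepts an iterable of parameter-group dicts with per-group lr overrides.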
