The final loss is defined as the sum of cross_entropy_loss and an L2 loss that penalizes large variables, with a hyperparameter l2_lambda used to balance the two terms. Since l2_lambda is hard-coded to 0.0001, could it be too small (or too large) under some circumstances, such that one of the two terms loses its contribution to the overall loss?
In general, are there any practical methods to guide setting hyperparameter values such as l2_lambda? @brightmart
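One quick way to answer the first question for a given model is to compare the magnitudes of the two terms directly. Below is a minimal, self-contained Python sketch (not the repo's actual code; the toy data and the helper name `check_balance` are made up for illustration) that assumes a loss of the form `total_loss = cross_entropy + l2_lambda * l2_loss` and sweeps `l2_lambda` on a log scale, which is the usual practical way to search such hyperparameters:

```python
import math
import random

# Toy illustration: how l2_lambda balances cross-entropy against the L2
# penalty in a loss of the form
#   total_loss = cross_entropy + l2_lambda * l2_loss
# The data below is synthetic; only the arithmetic mirrors the question.

random.seed(0)

# Pretend model state: softmax probability assigned to the true class of
# each example, and a flat list of trainable weights.
true_class_probs = [random.uniform(0.05, 0.95) for _ in range(64)]
weights = [random.gauss(0.0, 0.1) for _ in range(10_000)]

# Mean cross-entropy over the batch.
cross_entropy = -sum(math.log(p) for p in true_class_probs) / len(true_class_probs)
# 0.5 * sum of squares, matching the tf.nn.l2_loss convention.
l2_loss = 0.5 * sum(w * w for w in weights)

def check_balance(l2_lambda):
    """Return the L2 term's share of the total loss for a given l2_lambda."""
    penalty = l2_lambda * l2_loss
    return penalty / (cross_entropy + penalty)

# Sweep l2_lambda on a log-scale grid and watch when the penalty
# dominates (share near 1) or effectively vanishes (share near 0).
for l2_lambda in [10 ** e for e in range(-6, 1)]:
    share = check_balance(l2_lambda)
    print(f"l2_lambda={l2_lambda:g}: L2 term is {share:.1%} of total loss")
```

If the L2 share is already near 0% or near 100% at 0.0001 for your model size, the fixed value is indeed mis-scaled for that setting; the sweep above generalizes to a validation-loss grid search when a labeled dev set is available.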
longbowking changed the title from "on the hyperparameter in p8_text" to "hyperparameter in textRNN" on Jan 27, 2018.
https://github.com/brightmart/text_classification/blob/68e2fcf57a8dcec7e7d12f78953ed570451f0076/a03_TextRNN/p8_TextRNN_model.py#L74-L83