Hi,
When I read your code, the line for var in self.layers[0].vars.values() shows that the L2 regularization only includes the variables of the first layer. Why not all layers? I am confused about this.
def _loss(self):
    # Weight decay loss
    for var in self.layers[0].vars.values():
        self.loss += FLAGS.weight_decay * tf.nn.l2_loss(var)
    # Cross entropy error
    self.loss += masked_softmax_cross_entropy(self.outputs, self.placeholders['labels'],
                                              self.placeholders['labels_mask'])
Thanks!
We found that an L2 loss on only the first layer sufficed for decent results (as described in our ICLR 2017 paper). Of course you can place a regularizer on all layers.
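If you want to try that, below is a minimal sketch of a _loss method that applies weight decay to every layer rather than just the first one. It assumes the same self.layers, FLAGS.weight_decay, and masked_softmax_cross_entropy conventions as the snippet above; treat it as an illustration, not part of the released code.

def _loss(self):
    # Weight decay loss: sum the L2 penalty over the variables of every layer
    # instead of only self.layers[0] (assumes each layer exposes a .vars dict,
    # as in the snippet above)
    for layer in self.layers:
        for var in layer.vars.values():
            self.loss += FLAGS.weight_decay * tf.nn.l2_loss(var)

    # Cross entropy error (unchanged from the original snippet)
    self.loss += masked_softmax_cross_entropy(self.outputs, self.placeholders['labels'],
                                              self.placeholders['labels_mask'])

Note that regularizing all layers introduces a stronger penalty overall, so you may need to re-tune FLAGS.weight_decay.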