

hyperparameter in textRNN #32

Closed
longbowking opened this issue Jan 27, 2018 · 1 comment

Comments

@longbowking

def loss(self, l2_lambda=0.0001):
    with tf.name_scope("loss"):
        # sparse_softmax_cross_entropy_with_logits:
        #   input:  `logits` of shape [batch_size, num_classes];
        #           `labels` of shape [batch_size] holding integer class ids
        #   output: a 1-D tensor of length batch_size with the per-example
        #           softmax cross-entropy loss
        # (the non-sparse variants softmax_cross_entropy_with_logits /
        #  sigmoid_cross_entropy_with_logits expect one-hot / multi-label
        #  targets of shape [batch_size, num_classes] instead)
        losses = tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=self.input_y, logits=self.logits)  # shape=(batch_size,)
        loss = tf.reduce_mean(losses)  # scalar mean over the batch
        # L2 penalty over all trainable weights except biases
        l2_losses = tf.add_n(
            [tf.nn.l2_loss(v) for v in tf.trainable_variables()
             if 'bias' not in v.name]) * l2_lambda
        loss = loss + l2_losses
    return loss
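To make the balance question concrete, here is a small NumPy sketch (not from the repo; shapes and values are illustrative) that computes both terms the way the TF code does and lets you inspect their relative magnitudes for a given l2_lambda:

```python
import numpy as np

rng = np.random.default_rng(0)
batch_size, num_classes = 8, 5
logits = rng.normal(size=(batch_size, num_classes))
labels = rng.integers(0, num_classes, size=batch_size)  # integer class ids

# sparse softmax cross-entropy: -log softmax(logits)[label], averaged
shifted = logits - logits.max(axis=1, keepdims=True)  # for numerical stability
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
ce_loss = -log_probs[np.arange(batch_size), labels].mean()

# L2 penalty over a (dummy) weight matrix; tf.nn.l2_loss(w) = sum(w**2) / 2
weights = rng.normal(size=(num_classes, num_classes))
l2_lambda = 0.0001
l2_loss = l2_lambda * 0.5 * np.sum(weights ** 2)

total = ce_loss + l2_loss
print(ce_loss, l2_loss, total)
```

With weights of this size, the L2 term at l2_lambda=0.0001 is orders of magnitude smaller than the cross-entropy term; whether that is "too small" depends on the scale and number of weights in the real model, which is exactly the concern raised below.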

  • The final loss is defined as the sum of the cross-entropy loss and an L2 penalty on large weights, with the hyperparameter l2_lambda balancing the two terms. Since l2_lambda is hard-coded to 0.0001, could it be too small (or too large) under some circumstances, such that one of the two terms loses its contribution to the whole loss?
  • More generally, are there any practical methods to guide setting hyperparameter values such as l2_lambda?
    @brightmart
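One common practical recipe for the second question (my suggestion, not from this thread): sweep l2_lambda on a log scale and keep the value with the best validation loss. A minimal sketch, where validation_loss is a hypothetical stand-in for "train the model with this l2_lambda and return its validation loss":

```python
import numpy as np

def validation_loss(l2_lambda):
    # Stand-in for a real train-and-evaluate run; this toy convex
    # curve has its minimum near l2_lambda = 1e-3.
    return (np.log10(l2_lambda) + 3) ** 2 + 0.5

candidates = [10.0 ** k for k in range(-6, 0)]  # 1e-6 ... 1e-1
best = min(candidates, key=validation_loss)
print(best)
```

The log-scale grid matters because regularization strength acts multiplicatively; stepping 1e-6, 1e-5, ..., 1e-1 covers far more useful ground than a linear grid around 0.0001.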
@longbowking longbowking changed the title on the hyperparameter in p8_text hyperparameter in textRNN Jan 27, 2018
@brightmart
Owner

brightmart commented Jan 28, 2018 via email
