Fix *_cross_entropy_with_logits calls #864

Merged
6 commits merged on Jan 8, 2017
+1 −1
Fix xent call in mnist tutorial code

Fixes #857.
martinwicke committed Jan 8, 2017
commit e93ec37201f5f2116933ae96e505f409ddbf344d
@@ -228,7 +228,7 @@ def model(data, train=False):
   # Training computation: logits + cross-entropy loss.
   logits = model(train_data_node, True)
   loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
-      logits, train_labels_node))
+      labels=train_labels_node, logits=logits))
   # L2 regularization for the fully connected parameters.
   regularizers = (tf.nn.l2_loss(fc1_weights) + tf.nn.l2_loss(fc1_biases) +
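
For context: TensorFlow 1.0 changed tf.nn.sparse_softmax_cross_entropy_with_logits (and the other *_cross_entropy_with_logits functions) to take labels and logits as keyword arguments, so the old positional call (logits, labels) no longer works. Below is a minimal sketch of the corrected call, assuming TensorFlow 1.x graph mode; the tensor names and shapes here are illustrative placeholders, not the tutorial's actual graph.

    # Minimal sketch (assumes TensorFlow 1.x); names and shapes are
    # illustrative placeholders, not the tutorial's own variables.
    import numpy as np
    import tensorflow as tf

    NUM_CLASSES = 10
    logits = tf.constant(np.random.randn(4, NUM_CLASSES), dtype=tf.float32)
    labels = tf.constant([1, 3, 0, 7], dtype=tf.int64)  # sparse class indices

    # After the TF 1.0 API change, labels/logits must be passed as
    # keyword arguments; a positional call raises an error.
    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=logits))

    with tf.Session() as sess:
        print(sess.run(loss))

Note that the sparse variant takes integer class indices for labels, while tf.nn.softmax_cross_entropy_with_logits expects one-hot (or soft) label distributions; both require the keyword-argument form after this change.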