
Commit

update tf.nn.sparse_softmax_cross_entropy_with_logits call
amygdala committed Jun 19, 2017
1 parent 92d9a27 commit 2138e76
Showing 1 changed file with 1 addition and 1 deletion.

workshop_sections/xor/xor/xor_summaries_softmax.py (1 addition, 1 deletion)
@@ -38,7 +38,7 @@ def make_graph(features, labels, num_hidden=8):
     # Shape [4, 2]
     logits = tf.matmul(hidden_activations, output_weights)

-    cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(logits, labels)
+    cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels)
     loss = tf.reduce_mean(cross_entropy)
     tf.summary.scalar('loss', loss)
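Context for the change: starting with TensorFlow 1.0, `tf.nn.sparse_softmax_cross_entropy_with_logits` requires its `labels` and `logits` arguments to be passed by keyword (a sentinel first parameter rejects positional calls), so the old positional call breaks. For reference, the quantity the op computes, per-example cross-entropy between `softmax(logits)` and integer class labels, can be sketched in pure Python (a simplified illustration, not TensorFlow's implementation; the function name here just mirrors the TF op):

```python
import math

def sparse_softmax_cross_entropy_with_logits(logits, labels):
    """Per-example cross-entropy between softmax(logits) and integer labels.

    logits: list of rows, one row of raw scores per example.
    labels: list of integer class indices, one per example.
    Returns one loss value per example, like the TF op (before reduce_mean).
    """
    losses = []
    for row, label in zip(logits, labels):
        m = max(row)  # subtract the max for numerical stability
        log_sum_exp = m + math.log(sum(math.exp(x - m) for x in row))
        # cross-entropy = -log(softmax(row)[label]) = log-sum-exp(row) - row[label]
        losses.append(log_sum_exp - row[label])
    return losses

logits = [[2.0, 0.5], [0.1, 3.0]]  # shape [2, 2]: two examples, two classes
labels = [0, 1]                    # correct class index for each example
per_example = sparse_softmax_cross_entropy_with_logits(logits, labels)
loss = sum(per_example) / len(per_example)  # analogous to tf.reduce_mean
print(per_example, loss)
```

Both examples here have the correct class strongly favored by the logits, so the per-example losses are small; `tf.reduce_mean(cross_entropy)` in the patched code averages these into the scalar logged by `tf.summary.scalar('loss', loss)`.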
