I'm pretty sure it is used correctly in my code and the documentation is unclear here. See e.g. this StackOverflow topic, where you can find a more detailed explanation.
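To illustrate the point, here is a minimal sketch (assuming TensorFlow 2.x eager execution and made-up values for "score" and "y"): "score" should be the raw, unnormalized output of the last layer, because the op applies the softmax and the log internally.

```python
import tensorflow as tf

# Hypothetical raw outputs of the last fully-connected layer
# for 3 samples and 2 classes (no softmax, no log applied).
score = tf.constant([[2.0, -1.0],
                     [0.5,  0.3],
                     [-1.2, 3.0]])

# One-hot labels: class 0 -> [1, 0], class 1 -> [0, 1]
y = tf.constant([[1.0, 0.0],
                 [0.0, 1.0],
                 [0.0, 1.0]])

# The op takes the raw logits directly.
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=score, labels=y))

# Equivalent manual computation: softmax first, then negative log-likelihood.
probs = tf.nn.softmax(score)
manual = tf.reduce_mean(-tf.reduce_sum(y * tf.math.log(probs), axis=1))

print(loss.numpy(), manual.numpy())  # the two values should match
```

If "score" were already passed through a softmax or a log before this call, the softmax inside the op would be applied on top of it and the reported loss would no longer be the true cross-entropy.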
I solved this problem. It was caused by incorrect one-hot encoding: for a two-class problem, the label of the first class needs to be "0" and the label of the other class needs to be "1". See the correct results below, and many thanks for the explanation.
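For reference, a minimal sketch of the encoding described above (using NumPy and made-up integer labels): class ids 0 and 1 map to the one-hot rows [1, 0] and [0, 1].

```python
import numpy as np

labels = np.array([0, 1, 1, 0])   # integer class ids, must be 0 or 1
one_hot = np.eye(2)[labels]       # shape (4, 2), rows are [1, 0] or [0, 1]
print(one_hot)
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]
#  [1. 0.]]
```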
Hi,
For a two-class problem, I got the following result.
The cross-entropy is very small, but I cannot get high accuracy.
I traced the code that computes the loss value (the line below),
"loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=score, labels=y))"
and found that "score" needs to be "unscaled log probabilities" (https://www.tensorflow.org/api_docs/python/tf/nn/softmax_cross_entropy_with_logits).
So it seems that you need to apply a log function to "score" before feeding it to the softmax cross-entropy function?
Please correct me if I'm wrong, and many thanks.