Bug after recent Tensorflow update #16
Comments
Thanks for this comment! I had this same issue too when loading for the first time.
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=LeNet(x), labels=one_hot_y)
Agreed: cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=one_hot_y)
I am a classroom mentor. New students are starting to hit this bug because they use TensorFlow 1.0. I think the Udacity code should follow the 1.0 standard as well.
@shaox058, the best way is to modify your code. It's trivial to do so; you just need keyword arguments (see the sketch below).
…On Sun, Mar 26, 2017 at 12:57 AM shaox058 ***@***.***> wrote:
Hi @CreatCodeBuild <https://github.com/CreatCodeBuild> @adai183
<https://github.com/adai183> @mwhayford <https://github.com/mwhayford>
I have the same issue here. Should I downgrade my TensorFlow? If yes,
which version is the best fit?
Thanks,
Harry
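To make the keyword-argument fix concrete, here is a minimal sketch of the loss and training setup, assuming a TensorFlow 1.x install and that LeNet(x) and one_hot_y are already defined as in the lab notebook (rate is a hypothetical learning-rate value):

import tensorflow as tf

logits = LeNet(x)  # LeNet and x are assumed to come from the lab code
# keyword arguments are required from TensorFlow 1.0 onward
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=one_hot_y)
loss_operation = tf.reduce_mean(cross_entropy)
optimizer = tf.train.AdamOptimizer(learning_rate=rate)  # rate is a hypothetical hyperparameter
training_operation = optimizer.minimize(loss_operation)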
Since we are still using v0.12.1, closing this. It will be updated when we move to the newer versions.
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits, one_hot_y)
needs to be:
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=one_hot_y)
If not, we get a "Raise an error if labels do not sum to 1" exception. This came up in the discussion forum.
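Since the repo is still on v0.12.1 (per the comment above) while many students run 1.0, one possible workaround is to branch on the installed version. This is only a sketch, assuming logits and one_hot_y are already defined as in the lab code:

import tensorflow as tf

if tf.__version__.startswith('0.'):
    # pre-1.0 releases accept positional arguments
    cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits, one_hot_y)
else:
    # 1.0 and later require keyword arguments
    cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=one_hot_y)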