A question about softmax regression in Chapter 1 #7
In softmax.py, in the line defining the cross entropy:

cross_entropy = tf.reduce_mean(-tf.reduce_sum((y_ * tf.log(y))))

Doesn't tf.reduce_sum need an axis argument here? Without one, doesn't it return the grand total as a scalar, leaving nothing for reduce_mean to average over? I'm not very familiar with Python; any pointers would be appreciated.

Comments

The TensorFlow documentation says: If

There is indeed a problem here; it really should be axis=1. At the time, the parameter was dropped to make the code easier to follow (facepalm). As @Wang-yaole said, with it added, the two results differ by a constant factor.

I see... And oddly, after adding axis=1 the accuracy actually dropped, to only about 87%.

You need to adjust the learning rate accordingly.
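To make the discussion above concrete, here is a small NumPy sketch (NumPy stands in for TensorFlow so it runs standalone; the batch data is made up). It shows that without axis, the inner sum collapses the whole batch to a scalar, so the outer mean is a no-op, and that the two versions differ by exactly the batch size, a constant factor that effectively rescales the gradient and hence forces a different learning rate. The corrected TensorFlow 1.x line would presumably read cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), axis=1)).

```python
import numpy as np

# Hypothetical small batch: 4 examples, 3 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 3))
y = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax probabilities
y_ = np.eye(3)[[0, 2, 1, 0]]                                    # one-hot labels

# Without axis: the sum runs over the whole batch and returns a scalar,
# so the subsequent mean averages a single number and changes nothing.
loss_no_axis = np.mean(-np.sum(y_ * np.log(y)))

# With axis=1: one cross-entropy value per example, then the mean
# averages over the batch as intended.
per_example = -np.sum(y_ * np.log(y), axis=1)
loss_axis1 = np.mean(per_example)

# The two losses differ exactly by the batch size (a constant factor),
# which is why the learning rate must be retuned after adding axis=1.
print(loss_no_axis, loss_axis1, loss_no_axis / loss_axis1)
```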