
A question about softmax regression in Chapter 1 #7

Closed
yyf941126 opened this issue Apr 14, 2018 · 4 comments

Comments

@yyf941126

In the line of softmax.py that defines the cross-entropy:
cross_entropy = tf.reduce_mean(-tf.reduce_sum((y_ * tf.log(y))))
Shouldn't tf.reduce_sum specify an axis here? Without it, doesn't it return the total as a scalar, and then what is there to take the mean of?
I'm not very familiar with Python, so I'd appreciate some pointers.

@Wang-yaole

The TensorFlow documentation says: "If axis has no entries, all dimensions are reduced, and a tensor with a single element is returned."
So after reduce_sum the result is already a scalar, and whether you take the mean or not makes no difference: the two losses differ only by a constant factor.
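For what it's worth, a minimal sketch of the two behaviors (TensorFlow 1.x, as used in this repo; the tensor values are made up for illustration):

import tensorflow as tf

y = tf.constant([[0.1, 0.9],
                 [0.8, 0.2]])
total = tf.reduce_sum(y)            # no axis: all dimensions reduced -> scalar 2.0
per_row = tf.reduce_sum(y, axis=1)  # axis=1: one sum per example -> [1.0, 1.0]

with tf.Session() as sess:
    print(sess.run(total))    # 2.0
    print(sess.run(per_row))  # [1. 1.]

With no axis, the tf.reduce_mean that follows acts on a scalar and is a no-op, so the loss becomes the total over the whole batch rather than the per-example mean.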

@hzy46
Owner

hzy46 commented Apr 16, 2018

There is indeed a problem here: it should really be axis=1. At the time I dropped the parameter to make the code easier to follow (facepalm). As @Wang-yaole said, adding it back only changes the loss by a constant factor.
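For reference, a sketch of the corrected line, assuming y_ and y are the label and prediction tensors from softmax.py:

# Sum the per-class terms within each example (axis=1), then average over the batch.
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), axis=1))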

@hzy46 hzy46 closed this as completed Apr 16, 2018
@yyf941126
Author

Ah, I see... Strangely, after adding axis=1 the accuracy actually dropped, to only about 87%.

@hzy46
Owner

hzy46 commented Apr 16, 2018

You need to adjust the learning rate as well.
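A minimal sketch of that fix: with the per-example mean loss, the gradients are smaller by a factor of the batch size than with the summed loss, so the learning rate has to be scaled up to match. The 0.5 below is an assumed value (the one from the classic TensorFlow MNIST softmax tutorial), not taken from this repo:

cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), axis=1))
# The mean loss scales gradients down by the batch size, so a much larger
# learning rate than before is needed to train at the same effective speed.
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)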
