
Sorry to bother you again, one more question #20

Closed
thu-zxs opened this issue May 31, 2016 · 0 comments

Comments

@thu-zxs

thu-zxs commented May 31, 2016

@pinguo-luhaofang
If I change the number of outputs of the fully connected layer right before norm2layer (so that it differs from the output count used during softmax loss training; I changed it from 3000+ to 10), I find that both ap and an become very large, on the order of 1e2. What causes this? Doesn't norm2layer already normalize the features? And is this kind of change reasonable in the first place?
Thanks!
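For context, here is a minimal numpy sketch (not from this repository) of the bound the question alludes to. It assumes ap and an are squared Euclidean distances between anchor-positive and anchor-negative embeddings, and that norm2layer performs per-sample L2 normalization: if the distances were computed on truly L2-normalized vectors, they could never exceed 4, so values around 1e2 would mean the distances are being taken on the raw FC outputs instead.

```python
import numpy as np

def l2_normalize(x, eps=1e-12):
    # Divide each row vector by its L2 norm, as an L2-normalization
    # layer (what norm2layer presumably does) would.
    return x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)

rng = np.random.default_rng(0)
# Hypothetical embeddings: 10-dimensional FC outputs, as in the question.
anchor = rng.normal(scale=10.0, size=(1, 10))
positive = rng.normal(scale=10.0, size=(1, 10))

# Squared distance on raw FC outputs is unbounded and can easily
# reach the 1e2 order of magnitude reported above.
raw_ap = np.sum((anchor - positive) ** 2)

# After L2 normalization both vectors lie on the unit sphere, so
# ||a - p||^2 = 2 - 2 * cos(a, p), which is always within [0, 4].
norm_ap = np.sum((l2_normalize(anchor) - l2_normalize(positive)) ** 2)

print(f"raw ap:        {raw_ap:.1f}")   # large, order 1e2
print(f"normalized ap: {norm_ap:.3f}")  # bounded by 4
```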

@thu-zxs thu-zxs closed this as completed Jun 15, 2016