@pinguo-luhaofang If I change the number of outputs of the fully connected layer that feeds into the norm2layer (i.e., different from the output count used during softmax loss training — I changed the 3000+ outputs to 10), I find that both ap and an become very large, on the order of 1e2. What causes this? Doesn't the norm2layer already normalize the features? And is this kind of modification sound in the first place? Thanks!
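One way to sanity-check this (a minimal sketch, not code from this repo — `l2_normalize` here is a hypothetical stand-in for whatever the norm2layer does): if the embeddings really were L2-normalized, then by the triangle inequality the Euclidean distance between any two of them is at most 2, so ap/an values on the order of 1e2 suggest the distances are being computed on features that were never actually normalized (e.g., the norm layer was bypassed or the distances were taken from the raw FC output).

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, eps=1e-12):
    # Divide each row by its L2 norm, as an L2-normalization layer would.
    return x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)

# Random 10-D embeddings with large raw activations
# (10 is the reduced FC output size from the question above).
a, p, n = rng.normal(size=(3, 4, 10)) * 100
a, p, n = l2_normalize(a), l2_normalize(p), l2_normalize(n)

ap = np.linalg.norm(a - p, axis=1)  # anchor-positive distances
an = np.linalg.norm(a - n, axis=1)  # anchor-negative distances

# ||u - v|| <= ||u|| + ||v|| = 2 for unit vectors, regardless of the
# embedding dimension or the magnitude of the raw activations.
assert ap.max() <= 2.0 and an.max() <= 2.0
```

If this bound holds in your setup but you still observe values near 1e2, the distances are almost certainly being measured before the normalization step.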