
finetuning, fc6 layer initialization #33

Open
westpilgrim opened this issue Sep 11, 2017 · 0 comments

Comments

@westpilgrim

First, I trained my model using the original softmax loss.
Then I want to fine-tune it with A-Softmax.
I intended to initialize the A-Softmax fc6 layer parameters from the original softmax model, but the log says the two layers do not have the same number of param blobs; it turns out the MarginInnerProduct layer has no bias blob.
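
One possible workaround for the blob-count mismatch (a sketch, not something confirmed in this thread) is to copy the pretrained weights with pycaffe instead of Caffe's name-based weight copy, so the source bias blob can simply be skipped. A minimal sketch, assuming pycaffe is importable, that both prototxts can be instantiated in TEST phase, and that the fc6 weight shapes match; all file names are placeholders:

```python
import caffe

# Net trained with the original softmax loss (fc6 is a plain InnerProduct).
src = caffe.Net('softmax_deploy.prototxt', 'softmax_pretrained.caffemodel', caffe.TEST)
# New net whose fc6 is a MarginInnerProduct layer (weight blob only, no bias).
dst = caffe.Net('asoftmax_deploy.prototxt', caffe.TEST)

for name, src_blobs in src.params.items():
    if name not in dst.params:
        continue
    # Copy blob by blob; for fc6 this copies only the weight matrix and skips
    # the source bias, since MarginInnerProduct has a single param blob.
    for s, d in zip(src_blobs, dst.params[name]):
        if s.data.shape == d.data.shape:
            d.data[...] = s.data

dst.save('asoftmax_init.caffemodel')
```

The saved caffemodel could then be passed to `caffe train --weights ...` for the fine-tuning run.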
I tried Xavier initialization instead, but when I freeze the layers before fc6 and train only fc6, the loss does not converge. Do I have to relax the learning rate of the layers before fc6?
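
For reference, "freezing" versus "relaxing" the layers before fc6 comes down to their `lr_mult` values in the train prototxt (0 freezes a blob, a small nonzero value lets it adapt). A minimal sketch of rewriting those values with the protobuf text format, assuming the prototxt already declares `param { lr_mult: ... }` entries for the learnable layers; the file names and the 0.1 value are only placeholders:

```python
from caffe.proto import caffe_pb2
from google.protobuf import text_format

net = caffe_pb2.NetParameter()
with open('asoftmax_train.prototxt') as f:
    text_format.Merge(f.read(), net)

for layer in net.layer:
    if layer.name == 'fc6' or not layer.param:
        continue  # fc6 keeps its normal learning rate
    for p in layer.param:
        # 0 would freeze the blob; a small nonzero value relaxes the layer so
        # the backbone can drift slightly toward the A-Softmax objective.
        p.lr_mult = 0.1

with open('asoftmax_train_relaxed.prototxt', 'w') as f:
    f.write(text_format.MessageToString(net))
```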
