Two questions about softmax #103

Open
FightForCS opened this issue Mar 30, 2018 · 0 comments
@FightForCS
Hi, thanks for your work. I have two questions about the softmax loss in your paper:

  1. The softmax classifier suffers from biased gradients, but this can be mitigated by using a larger batch size. Have you tried this setting? (In the experiments section you compare softmax and OIM, but you do not mention the batch size used for training; a sketch of the update-rule difference I mean follows below.)

  2. What does 'proper pretraining' for softmax involve? Is it possible to pretrain with the OIM loss as well?

Thank you.
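To make the biased-gradient point in question 1 concrete, here is a minimal PyTorch sketch of an OIM-style loss, not the paper's implementation; `feat_dim`, `num_ids`, the temperature, and the momentum are illustrative assumptions. A plain softmax classifier learns its weight matrix by SGD, so with many identities and a small batch only the few identities present in the batch receive a meaningful weight update; the OIM lookup table is instead refreshed by a running average outside backpropagation, and every stored identity enters the normalizing denominator at each step.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OIMLoss(nn.Module):
    """Minimal sketch of an Online Instance Matching (OIM) style loss.

    A lookup table (LUT) holds one running-average feature per labeled
    identity. Logits are cosine similarities between batch features and
    every LUT entry, scaled by a temperature, so all identities appear
    in the softmax denominator regardless of the mini-batch size.
    """

    def __init__(self, feat_dim, num_ids, temperature=0.1, momentum=0.5):
        super().__init__()
        self.temperature = temperature
        self.momentum = momentum
        # The LUT is a buffer, not a parameter: it is updated by a
        # running average rather than by SGD, which is the key
        # difference from a softmax classifier's weight matrix.
        self.register_buffer("lut", torch.zeros(num_ids, feat_dim))

    def forward(self, feats, targets):
        feats = F.normalize(feats, dim=1)
        logits = feats.mm(self.lut.t()) / self.temperature
        loss = F.cross_entropy(logits, targets)

        # Non-parametric LUT update; no gradient flows through this step.
        with torch.no_grad():
            for f, t in zip(feats, targets):
                updated = self.momentum * self.lut[t] + (1.0 - self.momentum) * f
                self.lut[t] = F.normalize(updated, dim=0)
        return loss


# Illustrative usage with arbitrary sizes (4-sample batch, 5000 identities).
criterion = OIMLoss(feat_dim=256, num_ids=5000)
feats = torch.randn(4, 256, requires_grad=True)
targets = torch.randint(0, 5000, (4,))
loss = criterion(feats, targets)
loss.backward()
```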
