
hyperparameters for resnet50 training #6

Closed
pengleigithub opened this issue Aug 7, 2018 · 7 comments

Comments

@pengleigithub

I am wondering if you could share the hyper-parameters for ResNet-50 training. I have trained a model, but it is 2% worse than yours.

@michuanhaohao
Owner

@pengleigithub The default settings are the hyper-parameters from the paper. If you run train_alignedreid.py directly, you will get results similar to those shown in the README.

@pengleigithub
Author

In the paper, it says "mini-batch size is set to 160, in which each identity has 4 images. Each epoch includes 2000 mini-batches." I am wondering if there is a typo. Each mini-batch has 40 identities and there are 2000 mini-batches, so each epoch covers 40 * 2000 = 80,000 identities. But none of the datasets has 80,000 identities.

@michuanhaohao
Owner

@pengleigithub Please wait for our new paper, AlignedReID++. The experiments in AlignedReID contain some mistakes.

@pengleigithub
Author

Can you share the details, since it might be a long time before the paper is posted? Thx.

@michuanhaohao
Owner

AlignedReID++ trains and tests on each dataset and adds some theoretical explanation.

@pengleigithub
Author

Thx. Can you share the training details on market1501?

@michuanhaohao
Owner

michuanhaohao commented Aug 10, 2018

All the details are in train_alignedreid.py. There are no tricks:

  1. Optimizer: Adam
  2. 300 epochs
  3. Learning rate: 0.0002 (epochs 0–150); 0.00002 (epochs 150–300)
  4. Loss: softmax + triplet loss
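For anyone reproducing this, the schedule above can be sketched as a simple step function. This is only an illustration of the numbers in this thread; the function name and the boundary behavior at epoch 150 are assumptions, not code from train_alignedreid.py:

```python
def learning_rate(epoch: int) -> float:
    # Staged schedule from the list above: 2e-4 for the first 150 epochs,
    # then decayed 10x for the remaining 150 epochs (boundary assumed exclusive).
    return 0.0002 if epoch < 150 else 0.00002

# Batching arithmetic questioned earlier in this thread:
# 160 images per mini-batch with 4 images per identity
# gives 160 // 4 = 40 identities per mini-batch.
identities_per_batch = 160 // 4
```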
