This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Checkpoint with 200 epochs? #18

Closed
lzyhha opened this issue Aug 27, 2021 · 7 comments

Comments

@lzyhha

lzyhha commented Aug 27, 2021

Hello, can you provide the checkpoint trained for 200 epochs with a 256 batch size on ImageNet-1k? It would be better if the checkpoint included the last fc and the predictor in addition to the backbone.

@endernewton
Contributor

Okay, I can train one on my side.

@lzyhha
Author

lzyhha commented Aug 27, 2021

Great, thank you.

@lzyhha
Author

lzyhha commented Sep 7, 2021

Hello, is there any progress?

@endernewton
Contributor

Yes, I trained the model a while back but did not get a chance to upload it. We are now having some trouble uploading the model due to the S3 bucket change. Are you able to train one on your side already?

@endernewton
Contributor

Okay, got it working on my side: uploaded to https://dl.fbaipublicfiles.com/simsiam/models/200ep-256bs/pretrain/checkpoint_0199.pth.tar.

I did a quick linear eval with the current code base and it gives 69.8% top-1 accuracy. That should be good enough to use. Closing.
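For anyone picking up this checkpoint: the pretrain checkpoints saved by the SimSiam training script store the full SimSiam module (encoder plus predictor) under a `state_dict` key, with `module.` prefixes from `DataParallel`. A minimal sketch of stripping those prefixes to recover just the backbone weights, assuming that key layout (verify against your actual file):

```python
# Sketch: remap a SimSiam pretrain checkpoint's keys so the encoder
# weights can be loaded into a plain backbone. Assumes weights live
# under "state_dict" with "module.encoder." prefixes, as in the
# SimSiam training script; check your checkpoint's keys to confirm.

def extract_backbone_state(state_dict):
    """Keep only encoder weights, dropping the projection fc head and
    the 'module.encoder.' prefix added during DataParallel training."""
    prefix = "module.encoder."
    backbone = {}
    for key, value in state_dict.items():
        # Skip the predictor and the encoder's projection-MLP fc head.
        if key.startswith(prefix) and not key.startswith(prefix + "fc"):
            backbone[key[len(prefix):]] = value
    return backbone

# In practice you would load the real file first, e.g.:
#   checkpoint = torch.load("checkpoint_0199.pth.tar", map_location="cpu")
#   backbone_state = extract_backbone_state(checkpoint["state_dict"])
# Illustrated here on a tiny fake state dict:
fake = {
    "module.encoder.conv1.weight": 0,
    "module.encoder.fc.0.weight": 1,
    "module.predictor.0.weight": 2,
}
print(sorted(extract_backbone_state(fake)))  # → ['conv1.weight']
```

Loading the result into a fresh ResNet-50 with `load_state_dict(..., strict=False)` is the usual route, since the randomly initialized classifier fc has no matching keys in the checkpoint.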

@kevinbro96

Could you share the 200 epoch training log?

@lzyhha
Copy link
Author

lzyhha commented Jan 25, 2022 via email


3 participants