
About training schedule #10

Closed
chenz97 opened this issue May 25, 2021 · 6 comments
@chenz97

chenz97 commented May 25, 2021

Thank you for your great work and code! I have a question about the training schedule. Is the pre-training protocol exactly identical to the main training stage? For example, is pre-training also run for 200 epochs? Thank you very much!

@chenz97
Author

chenz97 commented May 25, 2021

Sorry to bother you, but I have another question about pre-training. I noticed that you used stuff annotations for the COCO dataset. I'm wondering why you are not using the instance segmentation masks, which seem more reasonable since VOS targets are usually things rather than stuff. Thank you.

@hzxie
Owner

hzxie commented Jun 18, 2021

Sorry for the late reply.

  1. The number of epochs for pre-training is not 200. We stop pre-training when the loss no longer decreases.
  2. Actually, we are using instance segmentation masks. We gave the wrong URL in the README.
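The stop-when-the-loss-plateaus criterion in point 1 could be sketched as follows. This is only an illustration of the idea, not code from the repository; the function name, `patience`, and `min_delta` are all assumptions:

```python
# Hypothetical early-stopping check: stop pre-training once the loss has not
# improved for `patience` consecutive epochs. Names and thresholds are
# illustrative assumptions, not taken from the repository.
def should_stop(loss_history, patience=5, min_delta=1e-4):
    """Return True when the last `patience` epoch losses show no improvement
    over the best loss seen before them."""
    if len(loss_history) <= patience:
        return False  # not enough history to judge a plateau yet
    best_before = min(loss_history[:-patience])
    recent_best = min(loss_history[-patience:])
    # A plateau: none of the recent epochs beat the earlier best by min_delta.
    return recent_best > best_before - min_delta
```

With this shape, the training loop simply appends each epoch's loss and breaks out once `should_stop` returns True.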

@hzxie hzxie closed this as completed Jun 18, 2021
@chenz97
Author

chenz97 commented Jun 19, 2021

Thank you for your reply. Did you start from a schedule of 200 epochs and perform early stopping? Could you give a rough estimate of the number of epochs for pre-training? Thanks!

@hzxie
Owner

hzxie commented Jun 20, 2021

@chenz97
Sorry, I don't remember.
Actually, I randomly chose a small number of images from COCO for each epoch.
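Sampling a fresh random subset of COCO images per epoch could look like the sketch below. This is a guess at the idea described, not the repository's actual data loader; `all_image_ids`, `subset_size`, and the seeding scheme are assumptions:

```python
import random

# Hypothetical per-epoch subsampling: draw a small random subset of COCO
# image IDs for each pre-training epoch. All names here are illustrative.
def epoch_subset(all_image_ids, subset_size, seed=None):
    """Return `subset_size` image IDs sampled without replacement."""
    rng = random.Random(seed)  # seed per epoch for reproducible sampling
    k = min(subset_size, len(all_image_ids))
    return rng.sample(all_image_ids, k)
```

Drawing a new subset each epoch keeps per-epoch cost low while still exposing the model to the full dataset over many epochs.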

@chenz97
Author

chenz97 commented Jun 22, 2021

@hzxie Thank you for your reply. Could you give a rough estimation of the validation performance on DAVIS after the pre-training stage? Thanks.

@hzxie
Owner

hzxie commented Jul 8, 2021

@chenz97
Actually, the experiment was done a year ago, so I cannot remember the exact value.

@hzxie hzxie reopened this Jul 8, 2021
@hzxie hzxie closed this as completed Jul 8, 2021