About training schedule #10
Comments
Sorry to bother you, but I have another question about pre-training. I noticed that you used stuff annotations for the COCO dataset. I'm wondering why you are not using the instance segmentation masks, which seem more reasonable since VOS targets are usually things rather than stuff. Thank you.
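For context on the thing/stuff distinction being asked about, here is a minimal, self-contained sketch of partitioning COCO-style annotations into instance ("thing") and "stuff" annotations. The `isthing` flag follows COCO's panoptic category schema; the category ids and toy annotations below are made up for illustration, not taken from the repo.

```python
# Hedged sketch: split COCO-style annotations into "thing" (instance)
# and "stuff" annotations using the panoptic-style `isthing` flag.
# All ids and records below are hypothetical toy data.

def split_things_and_stuff(annotations, categories):
    """Partition annotations by whether their category is a thing or stuff."""
    thing_ids = {c["id"] for c in categories if c.get("isthing", 1)}
    things = [a for a in annotations if a["category_id"] in thing_ids]
    stuff = [a for a in annotations if a["category_id"] not in thing_ids]
    return things, stuff

# Toy data in COCO's annotation schema (hypothetical ids).
categories = [
    {"id": 1, "name": "person", "isthing": 1},
    {"id": 92, "name": "grass", "isthing": 0},
]
annotations = [
    {"id": 10, "category_id": 1, "segmentation": [[0, 0, 1, 0, 1, 1]]},
    {"id": 11, "category_id": 92, "segmentation": [[2, 2, 3, 2, 3, 3]]},
]
things, stuff = split_things_and_stuff(annotations, categories)
```

A pre-training pipeline that targets only things would keep `things` and drop `stuff`; the question above asks why the repo does the opposite.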
Sorry for the late reply.
Thank you for your reply. Did you start from a schedule of 200 epochs and perform early stopping? Could you give a rough estimate of the number of epochs needed for pre-training? Thanks!
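The early-stopping scheme being asked about can be sketched as follows. This is an assumption about what such a setup would look like, not the repo's actual training code; the patience value and the plateauing scores are invented for the toy usage below.

```python
# Minimal early-stopping sketch (an assumption, not the repo's code):
# train for up to the full schedule, but stop once the validation metric
# fails to improve for `patience` consecutive evaluations.

class EarlyStopper:
    def __init__(self, patience=10):
        self.patience = patience
        self.best = float("-inf")
        self.bad_epochs = 0

    def step(self, metric):
        """Record one validation result; return True when training should stop."""
        if metric > self.best:
            self.best = metric
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience

# Toy usage: validation scores that plateau after epoch 2.
scores = [0.60, 0.65, 0.70, 0.70, 0.70, 0.70, 0.70]
stopper = EarlyStopper(patience=3)
stopped_at = None
for epoch, score in enumerate(scores):
    if stopper.step(score):
        stopped_at = epoch
        break
```

With `patience=3`, training halts at the third consecutive epoch without improvement rather than running the full schedule.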
@chenz97
@hzxie Thank you for your reply. Could you give a rough estimate of the validation performance on DAVIS after the pre-training stage? Thanks.
@chenz97 |
Thank you for your great work and code! I have a question about the training schedule. In the pre-training stage, is the training protocol exactly identical to the main training stage? For example, is the number of epochs also 200 for pre-training? Thank you very much!