
Question about the lr_drop in DETR based experiment #41

Closed

4-0-4-notfound opened this issue Feb 17, 2022 · 1 comment
@4-0-4-notfound

It seems that lr_drop is missing from the pretraining and fine-tuning scripts.
The default lr_drop for DETR is 200 epochs, but pretraining runs for only 60 epochs, so the learning-rate drop would never take effect; it looks like lr_drop may have been omitted.

@amirbar
Owner

amirbar commented Feb 24, 2022

I think the defaults should be fine. For DETReg with Deformable DETR on IN1k I did not try dropping the learning rate (just 5 epochs of pretraining). Similarly, for DETReg with DETR on ImageNet I didn't drop the learning rate. On IN100, where I reported some results in the paper, I dropped the learning rate after 40 epochs and trained for 50 epochs.
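For reference, DETR-style training scripts feed `--lr_drop` into a PyTorch `StepLR` scheduler, which multiplies the learning rate by 0.1 once `step_size` epochs have elapsed. A minimal sketch of the IN100 schedule described above (drop after 40 epochs, train for 50); the model and optimizer here are placeholders, not the actual DETReg setup:

```python
import torch

model = torch.nn.Linear(10, 2)  # stand-in for the detector
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

lr_drop = 40   # drop the learning rate after 40 epochs (IN100 setting)
epochs = 50    # total training epochs

# StepLR scales the lr by gamma (default 0.1) every step_size epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=lr_drop)

for epoch in range(epochs):
    # ... run one training epoch ...
    scheduler.step()  # lr becomes 1e-5 once epoch >= lr_drop
```

Note that if `lr_drop` exceeds the total number of epochs (e.g., the DETR default of 200 against a 60-epoch pretraining run), the scheduler simply never fires, so leaving the default in place behaves the same as having no learning-rate drop at all.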

@amirbar amirbar closed this as completed Feb 24, 2022