It seems that `lr_drop` is missing from the pretraining and finetuning scripts.
The default `lr_drop` for DETR is 200, but pretraining runs for only 60 epochs, so the learning-rate drop never takes effect.
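For context, a minimal sketch of how DETR-style training scripts typically wire up `lr_drop`: a `StepLR` scheduler whose `step_size` is `args.lr_drop`. The flag names follow DETR's `main.py`; whether DETReg's scripts use exactly this wiring is an assumption here.

```python
import argparse
import torch

parser = argparse.ArgumentParser()
parser.add_argument('--lr', default=1e-4, type=float)
parser.add_argument('--epochs', default=60, type=int)
parser.add_argument('--lr_drop', default=200, type=int)
args = parser.parse_args()

model = torch.nn.Linear(10, 10)  # stand-in for the detector
optimizer = torch.optim.AdamW(model.parameters(), lr=args.lr)
# StepLR multiplies the lr by 0.1 every `lr_drop` epochs, so with
# epochs=60 and lr_drop=200 the drop never fires.
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, args.lr_drop)

for epoch in range(args.epochs):
    # ... train one epoch ...
    lr_scheduler.step()
```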
I think the defaults should be OK. For DETReg using Deformable DETR on IN1k I did not try lr dropping (just 5 epochs of pretraining). Similarly, for DETReg using DETR on IN I didn't lr drop. On IN100, where I reported some results in the paper, I dropped the lr after 40 epochs and trained for 50 epochs.
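For the IN100 schedule described above, the launch flags would presumably look like the following; the script name and dataset flag are assumptions, only `--epochs` and `--lr_drop` reflect the values stated in the reply.

```bash
# Hypothetical IN100 pretraining invocation: drop the lr at epoch 40, train for 50.
python main.py --dataset imagenet100 --epochs 50 --lr_drop 40
```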