Hi,
I am trying to reproduce the linear segmentation results obtained with the ViT-B IBOT pretrained model, which performs at 38.3 mIoU according to the paper.
With this model and the config file provided in:
I only reach ~18 mIoU on ADE20K.
I saw that the command in the README changes the learning rate and normalizes the output, so I tried with:
and I got ~20 mIoU.
The only difference is that I am not using apex and the custom distributed optimizer, so I basically commented out the corresponding lines in the config file.
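To be concrete, here is a sketch of the change I made (the hook names follow the usual mmcv/mmseg conventions; the exact lines in the iBOT config may differ):

```python
# Sketch of my edit to the segmentation config (hypothetical layout,
# based on common mmcv/mmseg conventions -- not the verbatim iBOT config).

# Original apex-based hook, which I commented out:
# optimizer_config = dict(
#     type='DistOptimizerHook',
#     update_interval=1,
#     grad_clip=None,
#     use_fp16=True,
# )

# What I run with instead: mmcv's default optimizer hook, no fp16.
optimizer_config = dict(grad_clip=None)
```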
I run my experiments on a single node with 8 GPUs. I was wondering if the performance gap could come from the fact that I am not using DistOptimizerHook and apex, or if there is something else I am missing.
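For reference, this is the linear scaling rule I assumed when sanity-checking my learning rate against the README (my own check, with hypothetical numbers; the repo may use a different reference batch size):

```python
# Linear learning-rate scaling sanity check (my assumption, not from the repo):
# effective lr = base lr * (total batch size / reference batch size).
def scaled_lr(base_lr: float, gpus: int, samples_per_gpu: int,
              ref_batch: int = 16) -> float:
    """Scale base_lr by the total batch size relative to a reference batch."""
    return base_lr * gpus * samples_per_gpu / ref_batch

# My setup: 1 node x 8 GPUs, 2 samples per GPU -> total batch 16,
# so the learning rate should be unchanged under this rule.
print(scaled_lr(3e-4, gpus=8, samples_per_gpu=2))
```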
Thanks for your help.