Can I download your SAM ViTs checkpoints? #159
Comments
You can download the SAM checkpoints from here: gs://vit_models/sam. They should load with @xiangning-chen FYI
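Checkpoints in a public GCS bucket like the one above are typically fetched with `gsutil cp`. A minimal sketch, assuming the checkpoints are stored as `.npz` files named after the model (the exact filenames in the bucket are an assumption; list them first with `gsutil ls gs://vit_models/sam`):

```python
# Sketch: build the gsutil command that copies one SAM checkpoint locally.
# The ".npz" filename convention is assumed, not confirmed by this thread.
BUCKET = "gs://vit_models/sam"

def download_command(model_name: str, dest_dir: str = ".") -> str:
    """Return a gsutil command copying one checkpoint into dest_dir."""
    return f"gsutil cp {BUCKET}/{model_name}.npz {dest_dir}/"

print(download_command("ViT-B_16"))
# gsutil cp gs://vit_models/sam/ViT-B_16.npz ./
```

Running the printed command requires the Google Cloud SDK (`gsutil`) to be installed; the bucket is publicly readable, so no credentials should be needed.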
I checked the model list and found only base and large pure-transformer models. Do you have SAM models for small or tiny?
@xiangning-chen, who added the SAM checkpoints in 30b1a16
Hi, I will prepare a ViT-Small soon. Thanks.
Thanks a lot for your consideration! According to my experiments, smaller ViTs may perform differently when trained with SAM. I would appreciate it if you could share a SAM ViT-Small or SAM ViT-Tiny, ideally trained with DeiT's training setup.
@Longday0923, we uploaded the ViT-S/16 model trained with SAM. I agree that the benefit brought by SAM is less significant on small models, because SAM mainly aims to improve generalization and reduce overfitting. On small models, the effects of SAM and the strong augmentations used in DeiT are not orthogonal; see the section discussing augmentations and SAM in our paper.
Thanks a lot for your careful consideration! "On small models, the effects of SAM and the strong augmentations used in DeiT are not orthogonal." This is an interesting and reasonable point of view.
Thank you for your exciting and promising work!
I noticed that you have recently released the SAM-pretrained ViTs. I'm also very interested in SAM's effect on vision transformers, but I don't have enough GPUs or TPUs. I'm wondering if and how I can download the checkpoints and use them in my own PyTorch code?
Thank you for your time and contribution.
You can email me via longday0923@gmail.com
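On the PyTorch question: the checkpoints from this repository are JAX/Flax `.npz` files, so one approach is to load the arrays with NumPy and convert each one with `torch.from_numpy`, then map the names onto your PyTorch module's parameter names. A minimal sketch of the loading step (the key name below is illustrative; the real checkpoint keys follow the Flax parameter tree and will differ):

```python
# Sketch: read a JAX/Flax .npz checkpoint into a plain dict of NumPy arrays.
# Each array can then be converted for PyTorch with torch.from_numpy(arr);
# mapping keys to your module's state_dict names is model-specific.
import io
import numpy as np

def load_npz_params(f) -> dict:
    """Return {parameter_name: numpy array} from a .npz checkpoint."""
    arrays = np.load(f)
    return {k: arrays[k] for k in arrays.files}

# Demo with a tiny fake checkpoint (real key names will differ).
buf = io.BytesIO()
np.savez(buf, **{"embedding/kernel": np.zeros((3, 4), np.float32)})
buf.seek(0)
params = load_npz_params(buf)
print(sorted(params))                    # ['embedding/kernel']
print(params["embedding/kernel"].shape)  # (3, 4)
```

Note that transposes are usually needed when mapping Flax kernels to PyTorch layers (e.g. dense kernels are stored as `(in, out)` in Flax but `(out, in)` in `torch.nn.Linear`), so a key-by-key conversion table is the safest route.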