
Can I download your SAM ViTs checkpoints? #159

Closed
Longday0923 opened this issue Dec 21, 2021 · 7 comments

Comments

@Longday0923

Thank you for your exciting and promising work!
I noticed that you recently released the SAM-pretrained ViTs. I'm also very interested in SAM's effect on vision transformers, but I don't have enough GPUs or TPUs. I'm wondering whether and how I can download them and use them in my own PyTorch code.
Thank you for your time and contribution.
You can email me via longday0923@gmail.com

@andsteing
Collaborator

You can download SAM checkpoints from here: gs://vit_models/sam

They should load with the timm library, as described here:
https://rwightman.github.io/pytorch-image-models/models/vision-transformer/

@xiangning-chen FYI
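
A minimal sketch of that loading path, assuming a recent timm release; the model tag and the checkpoint filename placeholder below are illustrative, not taken from this thread:

```python
# Rough sketch of loading a SAM-pretrained ViT via timm.
# The model/tag name is version-dependent (older timm releases used names like
# vit_base_patch16_sam_224). The raw JAX .npz checkpoints can also be listed and
# copied straight from the bucket, e.g.:
#   gsutil ls gs://vit_models/sam
#   gsutil cp gs://vit_models/sam/<checkpoint>.npz .
import timm
import torch

# Pulls timm's converted SAM-pretrained weights (name assumed, see above).
model = timm.create_model('vit_base_patch16_224.sam_in1k', pretrained=True)
model.eval()

with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # expected: torch.Size([1, 1000])
```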

@Longday0923
Author

I checked the model list and found only base and large pure-transformer models. Do you have SAM models for small or tiny?

@andsteing
Collaborator

andsteing commented Feb 8, 2022

@xiangning-chen who added the SAM checkpoints in 30b1a16

@xiangning-chen
Contributor

Hi,

I will prepare a ViT-Small soon. Thanks.

@Longday0923
Author

Thanks a lot for your consideration! According to my experiments, smaller ViTs may behave differently when trained with SAM. It would be appreciated if you could share a SAM ViT-Small or SAM ViT-Tiny, ideally trained with DeiT's training setup.

@xiangning-chen
Contributor

@Longday0923, we uploaded the ViT-S/16 model trained with SAM. I agree that the benefit from SAM is less significant on small models, since SAM mainly aims to improve generalization and reduce overfitting. On small models, the effects of SAM and the strong augmentations used in DeiT are not orthogonal; you can check the section discussing augmentations and SAM in our paper.
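
For context, here is a minimal PyTorch sketch of the two-step SAM update being discussed; the neighborhood radius rho, the loss, and the base optimizer are illustrative choices, not the settings used for the released checkpoints:

```python
import torch

def sam_step(model, loss_fn, inputs, targets, base_opt, rho=0.05):
    # 1) Gradient of the loss at the current weights w.
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    # 2) Ascend to the worst-case nearby point: w + rho * g / ||g||.
    params = [p for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(
        torch.stack([p.grad.norm(p=2) for p in params]), p=2) + 1e-12
    perturbations = []
    with torch.no_grad():
        for p in params:
            e = rho * p.grad / grad_norm
            p.add_(e)
            perturbations.append((p, e))
    base_opt.zero_grad()

    # 3) Gradient at the perturbed weights, then restore w and take the step.
    loss_fn(model(inputs), targets).backward()
    with torch.no_grad():
        for p, e in perturbations:
            p.sub_(e)
    base_opt.step()
    base_opt.zero_grad()
    return loss.item()
```

Any torch.optim optimizer can serve as base_opt; SAM only changes where the gradient used for the step is evaluated.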

@Longday0923
Author

Thanks a lot for your careful consideration! "On small models, the effects of SAM and the strong augmentations used in DeiT are not orthogonal." This is an interesting and reasonable point of view.
