torch.hub: add support for DOFA and Swin models #2052

Merged

3 commits merged into microsoft:main on May 12, 2024
Conversation

adamjstewart (Collaborator):

We forgot to add support for DOFA and Swin ViT to torch.hub.
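Once this lands, the new models should be loadable through torch.hub along these lines (a minimal sketch; the entrypoint name below is an assumption, check `torch.hub.list` for the actual names):

```python
import torch

# Discover the entrypoints exposed by torchgeo's hubconf.py.
print(torch.hub.list('microsoft/torchgeo'))

# Load one of the newly exposed models; 'dofa_base_patch16_224' is an assumed
# entrypoint name, not confirmed by this thread.
model = torch.hub.load('microsoft/torchgeo', 'dofa_base_patch16_224')
```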

@adamjstewart added this to the 0.6.0 milestone on May 7, 2024
@github-actions bot added the models (Models and pretrained weights) label on May 7, 2024
isaaccorley previously approved these changes on May 7, 2024
@github-actions bot added the testing (Continuous integration testing) label on May 7, 2024
        **kwargs: Additional keyword arguments to pass to :class:`DOFA`.

    Returns:
        A DOFA huge 16 model.
    """
-   model = DOFA(patch_size=14, embed_dim=1280, depth=32, num_heads=16, **kwargs)
+   kwargs |= {'patch_size': 14, 'embed_dim': 1280, 'depth': 32, 'num_heads': 16}
Member:

Should we warn the user if these overwrite kwargs?

adamjstewart (Collaborator, Author):

I think it's worth adding, just not in this PR. Torchvision does issue a warning as of pytorch/vision#5618.
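
For reference, a minimal sketch of what such a warning could look like (illustrative only, not this PR's code; the builder name is hypothetical and it assumes `DOFA` is importable from `torchgeo.models`):

```python
import warnings

from torchgeo.models import DOFA  # assumption: DOFA is importable from torchgeo.models


def dofa_huge_patch14(**kwargs):
    # Hypothetical builder: warn before fixed architecture hyperparameters
    # silently override anything the user passed in.
    fixed = {'patch_size': 14, 'embed_dim': 1280, 'depth': 32, 'num_heads': 16}
    clobbered = sorted(fixed.keys() & kwargs.keys())
    if clobbered:
        warnings.warn(f'Overriding user-supplied kwargs: {clobbered}')
    kwargs |= fixed
    return DOFA(**kwargs)
```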

@adamjstewart merged commit dbfe7fa into microsoft:main on May 12, 2024
16 checks passed
@adamjstewart deleted the models/hub branch on May 12, 2024 at 07:28