
Transformer block local window attention #7349

Open
vgrau98 opened this issue Dec 30, 2023 · 0 comments · May be fixed by #7348
Labels: enhancement (New feature or request), Feature request

Comments

vgrau98 (Contributor) commented Dec 30, 2023

Is your feature request related to a problem? Please describe.
Add local window attention to the transformer block, as used in transformer-based networks (for instance SAM, https://arxiv.org/abs/2304.02643).

Describe the solution you'd like
Add an int argument specifying the window size to the TransformerBlock constructor (https://github.com/Project-MONAI/MONAI/blob/b3d7a48afb15f6590e02302d3b048a4f62d1cdee/monai/networks/blocks/transformerblock.py#L26C1-L34C15). If 0, global attention is used instead (no behavior change).

See PR #7348 as a suggestion.
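The window partitioning behind this kind of local attention (as in SAM's image encoder) could be sketched roughly as below. This is a minimal PyTorch sketch, not MONAI's or PR #7348's actual implementation; the function names and shapes are illustrative assumptions.

```python
# Hypothetical sketch: split a (B, H, W, C) feature map into non-overlapping
# windows so attention runs per window, then reverse the split afterwards.
import torch
import torch.nn.functional as F


def window_partition(x: torch.Tensor, window_size: int):
    """Pad (B, H, W, C) to multiples of window_size, then split into
    (B * num_windows, window_size, window_size, C) windows."""
    B, H, W, C = x.shape
    pad_h = (window_size - H % window_size) % window_size
    pad_w = (window_size - W % window_size) % window_size
    if pad_h or pad_w:
        # F.pad pads from the last dim backwards: (C, W, H)
        x = F.pad(x, (0, 0, 0, pad_w, 0, pad_h))
    Hp, Wp = H + pad_h, W + pad_w
    x = x.view(B, Hp // window_size, window_size, Wp // window_size, window_size, C)
    windows = x.permute(0, 1, 3, 2, 4, 5).reshape(-1, window_size, window_size, C)
    return windows, (Hp, Wp)


def window_unpartition(windows: torch.Tensor, window_size: int,
                       padded_hw: tuple, original_hw: tuple) -> torch.Tensor:
    """Reverse window_partition and crop away the padding."""
    Hp, Wp = padded_hw
    H, W = original_hw
    B = windows.shape[0] // (Hp * Wp // window_size // window_size)
    x = windows.view(B, Hp // window_size, Wp // window_size,
                     window_size, window_size, -1)
    x = x.permute(0, 1, 3, 2, 4, 5).reshape(B, Hp, Wp, -1)
    return x[:, :H, :W, :]
```

With a window_size argument on the block, the forward pass would partition before self-attention and unpartition after it when window_size > 0, falling back to plain global attention when it is 0.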

Describe alternatives you've considered

Additional context
Should add flexibility to MONAI, making it easier to implement ViT variants with slightly different architectures. Could help with #6357.

@vgrau98 vgrau98 linked a pull request Dec 30, 2023 that will close this issue
@KumoLiu KumoLiu added enhancement New feature or request Feature request labels Jan 4, 2024