Is your feature request related to a problem? Please describe.
Add local window attention to the transformer block, as used in transformer-based networks (SAM, for instance: https://arxiv.org/abs/2304.02643).
Describe the solution you'd like
An int argument specifying the window size in the TransformerBlock constructor: https://github.com/Project-MONAI/MONAI/blob/b3d7a48afb15f6590e02302d3b048a4f62d1cdee/monai/networks/blocks/transformerblock.py#L26C1-L34C15. If 0, global attention is used instead (no change to current behavior).
See PR #7348 as a suggestion.
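For reference, the core of SAM-style window attention is partitioning the (B, H, W, C) token grid into fixed-size windows (padding H and W up to multiples of the window size), attending within each window, then reassembling. A minimal NumPy sketch of that partition/unpartition round trip, with function names chosen for illustration (the actual signatures in the linked PR may differ):

```python
import numpy as np

def window_partition(x: np.ndarray, window_size: int):
    """Split a (B, H, W, C) feature map into non-overlapping windows.

    Pads H and W up to multiples of window_size, then returns windows of
    shape (B * num_windows, window_size, window_size, C) and the padded
    spatial size, so the inverse can strip the padding again.
    """
    B, H, W, C = x.shape
    pad_h = (window_size - H % window_size) % window_size
    pad_w = (window_size - W % window_size) % window_size
    if pad_h or pad_w:
        x = np.pad(x, ((0, 0), (0, pad_h), (0, pad_w), (0, 0)))
    Hp, Wp = H + pad_h, W + pad_w
    x = x.reshape(B, Hp // window_size, window_size,
                  Wp // window_size, window_size, C)
    windows = x.transpose(0, 1, 3, 2, 4, 5).reshape(-1, window_size,
                                                    window_size, C)
    return windows, (Hp, Wp)

def window_unpartition(windows, window_size, pad_hw, hw):
    """Inverse of window_partition: reassemble windows and strip padding."""
    Hp, Wp = pad_hw
    H, W = hw
    B = windows.shape[0] // ((Hp // window_size) * (Wp // window_size))
    x = windows.reshape(B, Hp // window_size, Wp // window_size,
                        window_size, window_size, -1)
    x = x.transpose(0, 1, 3, 2, 4, 5).reshape(B, Hp, Wp, -1)
    return x[:, :H, :W, :]

# In TransformerBlock.forward, self-attention would run per window when
# window_size > 0, and globally when window_size == 0 (the proposed default).
x = np.arange(2 * 5 * 7 * 3, dtype=np.float32).reshape(2, 5, 7, 3)
windows, pad_hw = window_partition(x, 4)
y = window_unpartition(windows, 4, pad_hw, (5, 7))
assert np.allclose(x, y)  # round trip recovers the original feature map
```

Because attention cost is quadratic in sequence length, restricting it to `window_size**2` tokens per window is what makes ViT encoders like SAM's tractable at high resolution.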
Describe alternatives you've considered
Additional context
Should make MONAI flexible enough to more easily implement ViT variants with slightly different architectures. Could also help with #6357.