Transformer block local window attention #7349

Open
@vgrau98

Description

Is your feature request related to a problem? Please describe.
Add local window attention to the transformer block, as used in transformer-based networks (SAM, for instance: https://arxiv.org/abs/2304.02643).

Describe the solution you'd like
An int argument specifying the window size in the TransformerBlock constructor: https://github.com/Project-MONAI/MONAI/blob/b3d7a48afb15f6590e02302d3b048a4f62d1cdee/monai/networks/blocks/transformerblock.py#L26C1-L34C15. If 0, global attention is used instead (no change in behavior).

See PR #7348 as a suggestion.
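As an illustration of the idea (not the PR's actual code): local window attention partitions the spatial feature map into non-overlapping windows, runs attention within each window, and then reassembles the map. The reshape logic can be sketched with NumPy as below; the function names `window_partition`/`window_unpartition` and the zero-padding behavior follow the SAM-style pattern and are assumptions here, not MONAI API.

```python
import numpy as np

def window_partition(x: np.ndarray, window_size: int):
    """Split a (B, H, W, C) feature map into non-overlapping windows,
    zero-padding H and W up to multiples of window_size first.
    Returns (B * num_windows, window_size**2, C) tokens and the padded (Hp, Wp)."""
    B, H, W, C = x.shape
    pad_h = (-H) % window_size
    pad_w = (-W) % window_size
    x = np.pad(x, ((0, 0), (0, pad_h), (0, pad_w), (0, 0)))
    Hp, Wp = H + pad_h, W + pad_w
    x = x.reshape(B, Hp // window_size, window_size, Wp // window_size, window_size, C)
    windows = x.transpose(0, 1, 3, 2, 4, 5).reshape(-1, window_size * window_size, C)
    return windows, (Hp, Wp)

def window_unpartition(windows: np.ndarray, window_size: int, pad_hw, hw):
    """Inverse of window_partition: reassemble the windows and crop off the padding."""
    Hp, Wp = pad_hw
    H, W = hw
    B = windows.shape[0] // ((Hp // window_size) * (Wp // window_size))
    x = windows.reshape(B, Hp // window_size, Wp // window_size, window_size, window_size, -1)
    x = x.transpose(0, 1, 3, 2, 4, 5).reshape(B, Hp, Wp, -1)
    return x[:, :H, :W, :]
```

In a transformer block's forward, `window_size > 0` would mean partitioning before the attention call and unpartitioning after it, so each window of `window_size**2` tokens attends only within itself; `window_size == 0` would skip both steps and keep the current global attention.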

Describe alternatives you've considered

Additional context
Should add flexibility to MONAI, making it easier to implement ViT variants with slightly different architectures. Could also help with #6357.
