import { Callout, Tab, Tabs } from "nextra-theme-docs"
Relative position bias for the Swin Transformer.
```python
WindowAttention(
    dim=768,
    num_heads=12,
    plugins=[
        RelativePositionBiasViT(
            window_size=8,
            num_heads=12,
        )
    ],
)
```
The relative position bias is added to the attention logits before the softmax is applied. It improves the attention mechanism by incorporating the relative positions of the elements within the input window.
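A minimal sketch of where the bias enters the computation, using NumPy with hypothetical shapes (one head, a window of `n` tokens, head dimension `d`); this illustrates the pre-softmax addition, not the library's internals:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

n, d = 4, 8
rng = np.random.default_rng(0)
q = rng.standard_normal((n, d))
k = rng.standard_normal((n, d))
bias = rng.standard_normal((n, n))  # one bias value per pair of window positions

scores = q @ k.T / np.sqrt(d)  # raw attention logits
attn = softmax(scores + bias)  # the bias is added BEFORE the softmax
```

Because the bias is applied to the logits, each row of `attn` still sums to 1 after normalization.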
Note that the number of parameters is `O(window_size)`, not `O(window_size**2)`: the bias table stores one entry per distinct relative offset, not one per pair of positions.
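One way to see this along a single axis: a window of `window_size` positions has `window_size**2` ordered pairs, but only `2 * window_size - 1` distinct relative offsets. A small check (plain Python, not library code):

```python
window_size = 8
positions = range(window_size)

# Distinct relative offsets j - i range over [-(W-1), W-1].
offsets = {j - i for i in positions for j in positions}

print(len(offsets))       # 2 * window_size - 1 = 15 table entries per axis
print(window_size ** 2)   # vs. 64 ordered position pairs
```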
- `window_size`: The window size for which attention is computed.
- `num_heads`: Number of heads for the multi-head attention mechanism.
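To make the offset-indexed table concrete, here is a simplified 1-D sketch of how a bias matrix can be gathered from a per-offset table. The function name and the 1-D setting are illustrative assumptions, not the library's implementation (which handles 2-D windows and multiple heads):

```python
import numpy as np

def relative_position_bias_1d(table, window_size):
    """Gather a (window_size, window_size) bias matrix from a per-offset table.

    `table` holds one value per relative offset (2 * window_size - 1 entries);
    hypothetical 1-D sketch of the lookup, not the library's implementation.
    """
    coords = np.arange(window_size)
    rel = coords[:, None] - coords[None, :]  # offsets in [-(W-1), W-1]
    index = rel + window_size - 1            # shift to [0, 2W-2] for indexing
    return table[index]

W = 4
table = np.arange(2 * W - 1, dtype=float)    # stand-in for learned parameters
bias = relative_position_bias_1d(table, W)
# bias[i, j] == table[i - j + W - 1]; the diagonal is the zero-offset entry
```

Every pair of positions with the same relative offset shares a single table entry, which is what keeps the parameter count linear per axis.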