[Inductor] Flex attention supports dynamic shape #204774

CI job: linux-focal-cuda12.1-py3.10-gcc9-sm86 / test (default, 5, 5, linux.g5.4xlarge.nvidia.gpu)

Failed May 14, 2024 in 5m 41s