DISABLED test_flex_attention (__main__.TestCompiledAutograd) #144912

@jeffdaily

Description

Platforms: rocm

This test was disabled because it is failing on the main branch (see recent examples).
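For context, PyTorch's CI skips a disabled test by matching the issue title (`DISABLED test_name (TestClass)`) and the listed platforms. A minimal sketch of the equivalent manual skip is below; `IS_ROCM` here is a hypothetical stand-in for the real detection flag (`TEST_WITH_ROCM` in `torch.testing._internal.common_utils`), and the test body is a placeholder:

```python
import os
import unittest

# Hypothetical stand-in for PyTorch's ROCm detection; the real flag is
# TEST_WITH_ROCM in torch.testing._internal.common_utils.
IS_ROCM = os.environ.get("PYTORCH_TEST_WITH_ROCM", "0") == "1"

class TestCompiledAutograd(unittest.TestCase):
    # Skip only on ROCm, mirroring the "Platforms: rocm" line in the issue.
    @unittest.skipIf(IS_ROCM, "flex attention failing on ROCm MI300; see this issue")
    def test_flex_attention(self):
        # Placeholder body standing in for the real compiled-autograd test.
        self.assertTrue(True)
```

On a ROCm runner (with the flag set) the test reports as skipped rather than failed, which keeps the CI signal green while the regression is investigated.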

Caused by this PR: #144533

The MI200 CI runners were passing all inductor UTs prior to the merge. Post-merge, we see this failure on MI300. Hopefully it is just the one test.

cc @sunway513 @jithunnair-amd @pruthvistony @ROCmSupport @dllehr-amd @jataylo @hongxiayang @naromero77amd @chauhang @penguinwu @zou3519 @ydwu4 @xmfan @yf225 @bdhirsh @Chillee @drisspg @yanboliang @BoyuanFeng

Metadata

Labels

- module: compiled autograd (compiled_autograd)
- module: flex attention
- module: higher order operators (torch.cond and similar)
- module: pt2-dispatcher (PT2 dispatcher-related issues, e.g., aotdispatch, functionalization, faketensor, custom-op)
- module: rocm (AMD GPU support for PyTorch)
- oncall: pt2
- skipped (denotes a (flaky) test currently skipped in CI)
- triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
