
Fix import error when flash attention 3 is installed #1913

Closed

HollowMan6 wants to merge 1 commit into NVIDIA:main from HollowMan6:flash_attn_3

Conversation

@HollowMan6

Description

Referring to https://github.com/Dao-AILab/flash-attention/blob/7661781d001e0900121c000a0aaf21b3f94337d6/README.md?plain=1#L61-L62

`flash_attn_interface` should be imported directly rather than from `flash_attn_3`; otherwise an import error occurs.
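For illustration, a minimal sketch of the intended import pattern (the `try`/`except` wrapper and the flag name are assumptions for this example, not the exact code in this repository):

```python
# Before (raises ImportError when FA3 is installed per the upstream README):
# from flash_attn_3 import flash_attn_interface

# After: import the module name that flash-attention's hopper/setup.py
# actually installs.
try:
    import flash_attn_interface  # Flash Attention 3
    _flash_attn_3_is_installed = True  # hypothetical flag for illustration
except ImportError:
    _flash_attn_3_is_installed = False
```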

Type of change

  • Documentation change (change only to the documentation, either a fix or a new content)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Infra/Build change
  • Code refactoring

Changes

Please list the changes introduced in this PR:

  • Remove the `flash_attn_3.` prefix when importing flash attention 3's `flash_attn_interface`

Checklist:

  • I have read and followed the contributing guidelines
  • The functionality is complete
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

Referring to https://github.com/Dao-AILab/flash-attention/blob/7661781d001e0900121c000a0aaf21b3f94337d6/README.md?plain=1#L61-L62

`flash_attn_interface` should be imported directly rather than from
`flash_attn_3`; otherwise an import error occurs.

Signed-off-by: Hollow Man <hollowman@opensuse.org>
@ptrendx ptrendx requested a review from cyanguwa August 1, 2025 23:18
@ptrendx
Member

ptrendx commented Aug 1, 2025

@cyanguwa could you take a look? Is this a new breaking API change from FA3?

@gugarosa

gugarosa commented Sep 2, 2025

Could you please merge this PR? FA3 is indeed imported through `import flash_attn_interface`; we are hitting the same issue.

@HollowMan6
Author

Closing this one as I saw some customized instructions for installing flash attn v3:

```
(1) git clone https://github.com/Dao-AILab/flash-attention.git
(2) cd flash-attention/ && git checkout 3ba6f82 && git submodule update --init && cd hopper/ && python setup.py install
(3) python_path=`python -c "import site; print(site.getsitepackages()[0])"`
(4) mkdir -p $python_path/flash_attn_3
(5) cp flash_attn_interface.py $python_path/flash_attn_3/flash_attn_interface.py
```
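Steps (4)-(5) are what make the package-qualified import resolve, which is why the existing code works with this setup; a minimal check (purely illustrative) would be:

```python
# After the manual steps above, flash_attn_interface.py lives inside a
# flash_attn_3/ directory in site-packages, so this import now succeeds.
from flash_attn_3 import flash_attn_interface
print(flash_attn_interface.__file__)
```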

@HollowMan6 HollowMan6 closed this Sep 29, 2025