
[Examples] Add disable_flash_attn #22

Merged
comaniac merged 1 commit into awslabs:main from chhzh123:disable_flash_attn on Jan 27, 2023

Conversation

@chhzh123 (Contributor)

Description

This PR adds a disable_flash_attn flag that falls back to the native PyTorch attention implementation instead of the flash-attention kernel.
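For reference, a minimal sketch of what such a flag controls (the function name, signature, and tensor layout below are illustrative assumptions, not the repository's actual API): when disable_flash_attn is set, attention is computed with the plain PyTorch formulation softmax(QK^T / sqrt(d)) V rather than a flash-attention kernel.

```python
import math
import torch


def attention(q, k, v, flash_attn_impl=None, disable_flash_attn=False):
    # `flash_attn_impl` stands in for a flash-attention kernel wrapper
    # (hypothetical here). With `disable_flash_attn=True`, or when no
    # kernel is provided, fall back to native PyTorch attention.
    if flash_attn_impl is not None and not disable_flash_attn:
        return flash_attn_impl(q, k, v)
    d = q.size(-1)
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d)
    return torch.matmul(torch.softmax(scores, dim=-1), v)
```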

Checklist

  • PR's title starts with a category (e.g., [Bugfix], [Model], [Tutorial], etc.)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

chhzh123 requested a review from comaniac on January 27, 2023 at 02:10

@comaniac (Contributor) left a comment

LGTM

comaniac merged commit 7c5b4f6 into awslabs:main on Jan 27, 2023
@comaniac (Contributor)

Thanks @chhzh123
