LlamaFlashAttention2 __init__ is undefined. #60

Closed
Luke20000429 opened this issue Jun 13, 2024 · 2 comments

Comments

@Luke20000429

LlamaFlashAttention2 has no __init__(). As a result, running the system with flash-attn will crash.

class LlamaFlashAttention2(LlamaAttention):

@deepcs233
Collaborator

Hi!

Sorry for the late reply. I've been very busy lately.

It's an experimental feature; you can simply replace the forward function and the other related functions from LlamaAttention with those from LlamaFlashAttention2. I've also updated the code.
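For reference, a minimal sketch of what the missing __init__ could look like, mirroring the upstream transformers implementation of LlamaFlashAttention2 (the is_flash_attn_greater_or_equal_2_10 helper lives in transformers.utils; whether it is available depends on the pinned transformers version):

from transformers.models.llama.modeling_llama import LlamaAttention
from transformers.utils import is_flash_attn_greater_or_equal_2_10

class LlamaFlashAttention2(LlamaAttention):
    def __init__(self, *args, **kwargs):
        # Reuse LlamaAttention's config/weight setup.
        super().__init__(*args, **kwargs)
        # flash_attn < 2.1 produces a top-left-aligned causal mask, while
        # flash_attn >= 2.1 uses a bottom-right-aligned one; the forward
        # pass copied over from LlamaFlashAttention2 checks this flag.
        self._flash_attn_uses_top_left_mask = not is_flash_attn_greater_or_equal_2_10()

The forward (and any helper methods it calls) would then be copied from the corresponding LlamaFlashAttention2 in the transformers version the repo targets.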

@Luke20000429
Author

I've included that in my local repo, but it wouldn't hurt to include it in your public repo as well.
