
Add support for paged attention v2 and update flash attention v2 #54

Merged: 5 commits into main, Nov 22, 2023

Conversation

tgaddair
Contributor

No description provided.

@magdyksaleh (Collaborator) left a comment


Looks good.

Comment on lines 30 to 31
from lorax_server.utils.flash_attn import attention
from lorax_server.utils import paged_attn
Collaborator


Simple design question: why not use the flash_attn prefix, as with paged_attn? That would make it easier to see that flash attention is being used. If the call is just attention, it could be unclear which implementation it refers to.

Contributor Author


Good point! I'll update it so it's called as flash_attn.attention instead.
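The suggested change amounts to importing the module rather than the bare function, so every call site names its backend. A minimal sketch of the idea (the `lorax_server` paths are taken from the diff above; the call signatures are not shown in this PR, so the stdlib example below is used to demonstrate the trade-off with runnable code):

```python
# Review suggestion, applied to the imports shown on lines 30-31 of the diff:
#
#   Before: from lorax_server.utils.flash_attn import attention
#           ... attention(...)             # which attention backend is this?
#   After:  from lorax_server.utils import flash_attn
#           ... flash_attn.attention(...)  # backend is explicit at the call
#                                          # site, mirroring paged_attn
#
# The same readability trade-off, demonstrated with a stdlib module:
from json import dumps  # bare name: its origin is invisible at the call site
import json             # module-qualified: json.dumps is self-describing

# Both spellings invoke the same function; only readability differs.
assert dumps({"v": 2}) == json.dumps({"v": 2})
print(json.dumps({"v": 2}))
```

Module-qualified calls cost a few extra characters per call site but remove ambiguity when two backends expose a function with the same name, which is exactly the situation with `flash_attn` and `paged_attn` here.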

Contributor Author


Updated.

Collaborator


LGTM

@tgaddair tgaddair merged commit f27a61c into main Nov 22, 2023
1 check failed
@tgaddair tgaddair deleted the paged-att-v2 branch November 22, 2023 19:49

3 participants