
Fix the attention bug caused by upgrading vLLM #555

Merged

ajtejankar merged 4 commits into main from fix-attention-bug on Jul 26, 2024
Conversation

@ajtejankar
Contributor

As the title says, upgrading vLLM changed the APIs of a few paged-attention-related operators, but I didn't properly test the newly built image. This PR fixes that.
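
For context, the kind of breakage involved looks roughly like the sketch below: the paged attention kernels are invoked positionally, so an upgrade that adds or reorders a parameter fails (or silently misbehaves) at call time, and a dispatch shim keyed on the installed version is one common fix. The operator and module names follow vLLM's custom-op layout, but the version boundary, the trailing cache-dtype-style argument, and the exact signatures are assumptions for illustration; they are not taken from this PR's diff.

```python
# Hypothetical compatibility shim for a paged attention kernel whose
# signature changed across vLLM releases. Names and the version boundary
# are illustrative assumptions, not the actual change fixed by this PR.
from typing import Optional

from packaging import version

import torch
import vllm

try:
    # Newer vLLM releases expose the custom CUDA kernels under vllm._C.
    from vllm._C import ops
except ImportError:
    # Older releases shipped them as a top-level extension module.
    from vllm import attention_ops as ops

# Assumed boundary: releases past this point expect an extra trailing
# kv_cache_dtype-style argument on the paged attention kernel.
NEW_PAGED_ATTENTION_API = version.parse(vllm.__version__) >= version.parse("0.4.0")


def paged_attention_v1(
    out: torch.Tensor,
    query: torch.Tensor,
    key_cache: torch.Tensor,
    value_cache: torch.Tensor,
    num_kv_heads: int,
    scale: float,
    block_tables: torch.Tensor,
    context_lens: torch.Tensor,
    block_size: int,
    max_context_len: int,
    alibi_slopes: Optional[torch.Tensor],
) -> None:
    """Dispatch to whichever operator signature the installed vLLM expects."""
    args = (
        out, query, key_cache, value_cache, num_kv_heads, scale,
        block_tables, context_lens, block_size, max_context_len, alibi_slopes,
    )
    if NEW_PAGED_ATTENTION_API:
        # Assumed newer signature with a trailing cache-dtype argument.
        ops.paged_attention_v1(*args, "auto")
    else:
        ops.paged_attention_v1(*args)
```

Routing every call site through one wrapper like this keeps the version check in a single place, so the next upstream signature change only has to be handled once.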

@ajtejankar merged commit 2e81331 into main on Jul 26, 2024
@ajtejankar deleted the fix-attention-bug branch on Jul 26, 2024 at 03:07