
Conversation

@tjtanaa (Contributor) commented Dec 13, 2023

  • Upgraded the xformers version for ROCm to match the one used in CUDA (a rough version-check sketch follows below).
  • Updated the documentation.
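
As a loose illustration only (not part of this PR's diff), the sketch below shows how a ROCm environment could sanity-check that the installed xformers build reports the same version that the CUDA requirements pin. The expected-version string is a placeholder, since the exact pin is not shown in this thread.

```python
# Hypothetical sanity check, not taken from the PR: verify that the xformers
# package installed in a ROCm environment reports the version pinned for CUDA.
# The expected version below is a placeholder, not the value changed in this PR.
import importlib.metadata

EXPECTED_XFORMERS_VERSION = "<same version as the CUDA pin>"  # placeholder

try:
    installed = importlib.metadata.version("xformers")
except importlib.metadata.PackageNotFoundError:
    raise SystemExit("xformers is not installed in this environment")

if installed != EXPECTED_XFORMERS_VERSION:
    raise SystemExit(
        f"xformers {installed} is installed, but {EXPECTED_XFORMERS_VERSION} is expected"
    )
print(f"xformers {installed} matches the expected pin")
```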

@WoosukKwon WoosukKwon self-requested a review December 13, 2023 07:27
@WoosukKwon (Collaborator) left a comment

Hi @tjtanaa, thanks for submitting the PR! Left some minor comments. Please take a look at them.

@WoosukKwon WoosukKwon added the rocm Related to AMD ROCm label Dec 13, 2023
@WoosukKwon (Collaborator) left a comment

LGTM! Thanks for the quick update!

@WoosukKwon WoosukKwon merged commit f375ec8 into vllm-project:main Dec 13, 2023
@tjtanaa tjtanaa deleted the upgrade-xformers branch December 13, 2023 15:35
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024

Co-authored-by: miloice <jeffaw99@hotmail.com>