
Conversation

@gshtras (Collaborator) commented Sep 11, 2025

Bringing the ROCm installation part of the docs up to date

- Updated the dependency repositories and recommended commits to match Dockerfile.rocm_base
- Removed some parts that are no longer relevant, namely the deprecated rocm/vllm-dev:navi_base image
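
For context, a minimal sketch of building that base image locally (the Dockerfile path and image tag below are assumptions about the current repo layout, not something this PR specifies):

# Sketch only; verify the Dockerfile location in the vLLM tree before building.
git clone https://github.com/vllm-project/vllm.git
cd vllm
docker build -f docker/Dockerfile.rocm_base -t vllm-rocm-base .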

Signed-off-by: Gregory Shtrasberg <Gregory.Shtrasberg@amd.com>
@gshtras gshtras requested a review from hmellor as a code owner September 11, 2025 20:14
@mergify mergify bot added documentation Improvements or additions to documentation rocm Related to AMD ROCm labels Sep 11, 2025
@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request updates the ROCm installation documentation to align with newer dependency versions and practices, which is a valuable improvement. I've identified a couple of areas for enhancement to ensure clarity and adherence to best practices. Specifically, there's a minor version inconsistency in an example that could confuse users, and a deprecated installation command is being introduced. Addressing these points will make the documentation more robust and user-friendly.

Comment on lines +56 to +57
if [ ! -f setup.py ]; then cd python; fi
python3 setup.py install
@gemini-code-assist bot (Contributor)

Severity: high

The command python3 setup.py install is a legacy command and is deprecated. It's recommended to use pip install . instead, which is the modern standard for installing packages from source. This will correctly handle dependencies and use the PEP 517 build process. The previous version of the documentation correctly used pip3 install ., so this change is a regression.

Suggested change
- if [ ! -f setup.py ]; then cd python; fi
- python3 setup.py install
+ if [ ! -f setup.py ]; then cd python; fi
+ pip install .

@gshtras (Collaborator, Author)

This mimics the way it is installed in the ROCm docker image for consistency.
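
For illustration, the two forms discussed in this thread, run from a source checkout (the cd python guard covers repos whose setup.py lives in a python/ subdirectory; the package itself is not named in the thread, so treat this as a sketch):

cd <package-source-checkout>   # placeholder path, not taken from this PR
if [ ! -f setup.py ]; then cd python; fi
python3 setup.py install       # legacy form kept in the docs to match the ROCm docker image; newer setuptools warns that it is deprecated
# pip install .                # PEP 517 alternative suggested by the review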

Install ROCm's flash attention (v2.7.2) following the instructions from [ROCm/flash-attention](https://github.com/ROCm/flash-attention#amd-rocm-support)
Alternatively, wheels intended for vLLM use can be accessed under the releases.

For example, for ROCm 6.3, suppose your gfx arch is `gfx90a`. To get your gfx architecture, run `rocminfo |grep gfx`.
@gemini-code-assist bot (Contributor)

Severity: high

The documentation has been updated to use ROCm 6.4 in the PyTorch installation step, but this example for flash attention still refers to ROCm 6.3. To maintain consistency throughout the document and avoid confusion for users, please update this to refer to ROCm 6.4.

Suggested change
- For example, for ROCm 6.3, suppose your gfx arch is `gfx90a`. To get your gfx architecture, run `rocminfo |grep gfx`.
+ For example, for ROCm 6.4, suppose your gfx arch is `gfx90a`. To get your gfx architecture, run `rocminfo |grep gfx`.

@gshtras (Collaborator, Author)

ROCm support goes two versions back, so 6.3 and above are all currently supported. The installation instructions assume the lowest supported version.
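
As a concrete illustration of the arch detection mentioned above, a hedged sketch of selecting the target before building ROCm's flash-attention (GPU_ARCHS usage follows the ROCm/flash-attention README; the exact commit is whatever Dockerfile.rocm_base pins and is left as a placeholder here):

rocminfo | grep gfx                     # e.g. reports gfx90a on MI200-series GPUs
export GPU_ARCHS=gfx90a                 # set to the arch reported above
git clone https://github.com/ROCm/flash-attention.git
cd flash-attention
git checkout <commit pinned in Dockerfile.rocm_base>   # placeholder
python3 setup.py install                # or pip install ., per the earlier thread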

@hmellor (Member) left a comment

LGTM overall, but Gemini's comments seem valid

@gshtras (Collaborator, Author) commented Sep 11, 2025

> LGTM overall, but Gemini's comments seem valid

Added explanations

@vllm-bot vllm-bot merged commit 6a50eaa into vllm-project:main Sep 12, 2025
16 checks passed
skyloevil pushed a commit to skyloevil/vllm that referenced this pull request Sep 13, 2025
Signed-off-by: Gregory Shtrasberg <Gregory.Shtrasberg@amd.com>
dsxsteven pushed a commit to dsxsteven/vllm_splitPR that referenced this pull request Sep 15, 2025
Signed-off-by: Gregory Shtrasberg <Gregory.Shtrasberg@amd.com>
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025
Signed-off-by: Gregory Shtrasberg <Gregory.Shtrasberg@amd.com>