Conversation

tdoublep
Member

@tdoublep tdoublep commented Sep 23, 2025

Purpose

Remove placeholder attention backend. It is no longer needed for Mamba models in V1, since each mamba/linear attention layer has its own "real" attention backend.
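
For context, here is a minimal, self-contained sketch of the idea this change relies on. It is not vLLM's actual API; the class and attribute names below are hypothetical and only illustrate why a model-level placeholder backend becomes redundant once every mamba/linear attention layer carries its own backend.

```python
# Illustrative sketch only -- not vLLM code. All names here are hypothetical.
from dataclasses import dataclass


@dataclass
class AttentionBackend:
    """Stands in for a concrete per-layer backend (e.g. a Mamba kernel)."""
    name: str

    def forward(self, hidden_states):
        # A real backend would dispatch to fused kernels here.
        return hidden_states


class MambaLayer:
    def __init__(self):
        # Assumed V1 behaviour: the layer instantiates its own "real" backend,
        # so the model no longer needs a global placeholder attention backend.
        self.attn_backend = AttentionBackend(name="mamba")

    def forward(self, hidden_states):
        return self.attn_backend.forward(hidden_states)


layer = MambaLayer()
print(layer.attn_backend.name)  # "mamba"
```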

Test Plan

Let's see if CI passes

Test Result



Signed-off-by: Thomas Parnell <tpa@zurich.ibm.com>
@mergify mergify bot added the kv-connector label Sep 23, 2025
@tdoublep tdoublep changed the title from "Remove placeholder attn" to "[V0 Deprecation] Remove placeholder attn" Sep 23, 2025
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request removes the placeholder attention backend and the associated is_attention_free flag, which is a good cleanup as it's no longer needed for Mamba models in V1. The changes are consistent across the modified files. However, I've found a broken test case that needs to be addressed to ensure the integrity of the test suite.

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) September 23, 2025 19:58
@github-actions github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Sep 23, 2025
Signed-off-by: Thomas Parnell <tpa@zurich.ibm.com>
Collaborator

@WoosukKwon WoosukKwon left a comment


Thanks for doing this!

Member

@yewentao256 yewentao256 left a comment


LGTM, thanks for the work!

Member

@tlrmchlsmth tlrmchlsmth left a comment


Nice

@DarkLight1337 DarkLight1337 merged commit 969b4da into vllm-project:main Sep 23, 2025
47 checks passed
xuechendi added a commit to vllm-project/vllm-gaudi that referenced this pull request Sep 24, 2025
slokesha pushed a commit to slokesha/vllm-gaudi that referenced this pull request Sep 24, 2025
vllm-project/vllm#25510

Signed-off-by: Chendi Xue <Chendi.Xue@intel.com>
Signed-off-by: slokesha <slokeshappa@habana.ai>
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025
Signed-off-by: Thomas Parnell <tpa@zurich.ibm.com>
iboiko-habana pushed a commit to iboiko-habana/vllm-gaudi that referenced this pull request Oct 2, 2025
vllm-project/vllm#25510

Signed-off-by: Chendi Xue <Chendi.Xue@intel.com>
Signed-off-by: Iryna Boiko <iboiko@habana.ai>
yewentao256 pushed a commit that referenced this pull request Oct 3, 2025
Signed-off-by: Thomas Parnell <tpa@zurich.ibm.com>
Signed-off-by: yewentao256 <zhyanwentao@126.com>
Labels: kv-connector, ready (ONLY add when PR is ready to merge/full CI is needed)
Projects: Status: Done
5 participants