vllm-project/vllm PR #33948 (merged)
[Bugfix]: Fix ROCm fusion attn test; use AttentionBackend utils to create kv cache
ProExpertProg merged 6 commits into vllm-project:main from ROCm:fix_fusion_attn_test_rocm
Commit df2a3f25: fix ROCm fusion attn test; use AttentionBackend utils to create kv cache
mergify added the rocm label
mergify added the bug label
gemini-code-assist commented on 2026-02-05
Commit daf1d320: fix lint
ProExpertProg approved these changes on 2026-02-05
mergify added the needs-rebase label
Commit 9c1a88ba: Merge branch 'main' into fix_fusion_attn_test_rocm
mergify removed the needs-rebase label
ProExpertProg approved these changes on 2026-02-09
ProExpertProg enabled auto-merge (squash) 90 days ago
github-actions added the ready label
Commit e54517e6: Merge branch 'main' into fix_fusion_attn_test_rocm
Commit d6147463: fix kv cache layout for FI attn+quant fusion on Blackwell
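The PR title and the FlashInfer layout-fix commit point at the same underlying idea: a test should ask the attention backend for its kv cache shape instead of hardcoding one, because different backends (e.g. paged attention vs. FlashInfer) lay out the same cache differently. The sketch below illustrates that pattern with hypothetical stand-in backend classes — it mirrors the shape-helper style of vLLM's `AttentionBackend.get_kv_cache_shape`, but these classes, shapes, and the `make_kv_cache` helper are illustrative assumptions, not vLLM's actual code.

```python
# Hypothetical stand-ins for two attention backends with different
# kv cache layouts. Real backends expose a similar shape helper; a test
# that hardcodes one layout silently breaks on the other backend.

class PagedBackendStandIn:
    """Assumed paged-attention-style layout: (2, num_blocks, flattened block)."""
    @staticmethod
    def get_kv_cache_shape(num_blocks, block_size, num_kv_heads, head_size):
        return (2, num_blocks, block_size * num_kv_heads * head_size)

class FlashInferLikeStandIn:
    """Assumed FlashInfer-style layout: (num_blocks, 2, block_size, heads, head_size)."""
    @staticmethod
    def get_kv_cache_shape(num_blocks, block_size, num_kv_heads, head_size):
        return (num_blocks, 2, block_size, num_kv_heads, head_size)

def make_kv_cache(backend, num_blocks=4, block_size=16,
                  num_kv_heads=8, head_size=64):
    """Build the cache from the backend's own shape helper.

    In a real test this would allocate a tensor; here we just return the
    shape and its element count to show the layouts differ.
    """
    shape = backend.get_kv_cache_shape(
        num_blocks, block_size, num_kv_heads, head_size)
    numel = 1
    for dim in shape:
        numel *= dim
    return shape, numel

paged_shape, paged_numel = make_kv_cache(PagedBackendStandIn)
fi_shape, fi_numel = make_kv_cache(FlashInferLikeStandIn)
```

Both layouts hold the same number of elements, so a hardcoded-shape test can pass on one backend while reading garbage on the other; deriving the shape from the backend makes the test portable.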
Commit e3f1d6fc: Merge branch 'fix_fusion_attn_test_rocm' of https://github.com/ROCm/v…
auto-merge disabled 89 days ago (head branch was pushed to by a user without write access)
ProExpertProg approved these changes on 2026-02-11
ProExpertProg merged fd618871 into main 88 days ago
Rohan138 deleted the fix_fusion_attn_test_rocm branch 75 days ago
Reviewers: ProExpertProg, gemini-code-assist
Assignees: no one assigned
Labels: bug, rocm, ready
Milestone: no milestone