vllm
[CI/Build][AMD] Skip test on test_hybrid_attention_mamba_tensor_shapes on ROCm, requires FLASHINFER
#29995
Merged
tjtanaa merged 2 commits into vllm-project:main from rasmith:ransmith_fix_test_hybrid_attention_mamba_tensor_shapes

Skip test on ROCm, requires FLASHINFER (221df5ac)
mergify added the rocm and v1 labels
gemini-code-assist commented on 2025-12-03
tjtanaa approved these changes on 2025-12-04
tjtanaa added the ready label
tjtanaa enabled auto-merge (squash) 43 days ago
Merge branch 'main' into ransmith_fix_test_hybrid_attention_mamba_ten… (8f1a928f)
tjtanaa merged f2f4cea6 into main 43 days ago
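The change itself is not reproduced on this page. As a rough illustration only, a ROCm-conditional skip of this kind is typically written as a pytest `skipif` marker; the sketch below assumes the platform check goes through `torch.version.hip` and uses a placeholder test body and reason string, so the helper names and wording may differ from the actual PR.

```python
# Illustrative sketch only; the real test and any platform helpers live in
# the vLLM test suite and may differ from what is shown here.
import pytest
import torch

# Assumption: a ROCm build of PyTorch exposes a non-None torch.version.hip.
IS_ROCM = torch.version.hip is not None


@pytest.mark.skipif(
    IS_ROCM,
    reason="Requires the FLASHINFER backend, which is not available on ROCm.",
)
def test_hybrid_attention_mamba_tensor_shapes():
    # Placeholder body standing in for the real hybrid attention/Mamba
    # tensor-shape checks.
    ...
```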
Reviewers: tjtanaa, gemini-code-assist
Assignees: No one assigned
Labels: rocm, ready, v1
Milestone: No milestone