vllm PR #31153: [Chore] Update more locations to use `attention_config.backend`
Status: Merged
Author: DarkLight1337
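The change the title describes is a refactor toward reading the attention backend from a nested attention config rather than from scattered top-level fields. A minimal sketch of that access pattern, using hypothetical stand-in classes (`AttentionConfig`, `VllmConfig`, and `select_backend` are illustrative names, not vLLM's actual API; only the `attention_config.backend` attribute path comes from the PR title):

```python
from dataclasses import dataclass


@dataclass
class AttentionConfig:
    # Backend identifier, e.g. a string naming the attention implementation.
    backend: str = "FLASH_ATTN"


@dataclass
class VllmConfig:
    # The attention settings live under a dedicated sub-config.
    attention_config: AttentionConfig


def select_backend(config: VllmConfig) -> str:
    # Call sites read the backend via the attention config,
    # the pattern this PR applies to more locations.
    return config.attention_config.backend


config = VllmConfig(attention_config=AttentionConfig())
print(select_backend(config))  # prints the configured backend name
```

Centralizing the backend under one sub-config means call sites stay correct even if the surrounding config object grows or is reorganized.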
DarkLight1337 requested reviews from LucasWilkinson, ProExpertProg, and tjtanaa 2 days ago
DarkLight1337 added the ready label
mergify added the performance label
chatgpt-codex-connector commented on 2025-12-22
gemini-code-assist commented on 2025-12-22
DarkLight1337 pushed commit 18e97873: [Chore] Update more locations to use `attention_config.backend`
DarkLight1337 force-pushed from 9624a781 to 18e97873 2 days ago
gemini-code-assist commented on 2025-12-22
ProExpertProg approved these changes on 2025-12-22
vllm-bot merged 8cef1376 into main 1 day ago
DarkLight1337 deleted the remove-attention-backend branch 1 day ago