vllm
[TPU] Update torch version to include paged attention kernel change
#19706
Merged

gemini-code-assist commented on 2025-06-16
mergify added the ci/build label
yaochengji added the tpu label
yaochengji added the ready label
mergify removed the tpu label
yaochengji approved these changes on 2025-06-16
yaochengji enabled auto-merge (squash) 327 days ago
Auto-merge disabled 327 days ago (manually disabled by user)
Chenyaaang marked this pull request as draft 327 days ago
Chenyaaang committed: update torch version to include paged attention kernel change (d00eb38e)
Chenyaaang force-pushed to d00eb38e 326 days ago
Chenyaaang marked this pull request as ready for review 326 days ago
yaochengji enabled auto-merge (squash) 326 days ago
yaochengji merged dac8cc49 into main 326 days ago
