vllm
[TPU] Update torch-xla version to include paged attention tuned block change
#19813
Merged


yaochengji merged 1 commit into vllm-project:main from QiliangCui:tune
QiliangCui: Update xla package to use new tuned blocks. (be009da8)
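The single commit bumps the pinned torch-xla wheel so vLLM's TPU backend picks up the retuned paged-attention block sizes. A sketch of what such a pin change typically looks like in vLLM's TPU requirements file; the wheel dates and path below are illustrative placeholders, not taken from this PR:

```diff
 # requirements/tpu.txt (illustrative; the actual wheel versions differ)
-torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-nightly.OLD_DATE-cp310-cp310-linux_x86_64.whl
+torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-nightly.NEW_DATE-cp310-cp310-linux_x86_64.whl
```

Because the tuned block-size tables live inside torch-xla's Pallas paged-attention kernel, no vLLM source change is needed beyond moving the pin forward.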
QiliangCui marked this pull request as draft 227 days ago
gemini-code-assist commented on 2025-06-18
mergify added ci/build
mergify added qwen
QiliangCui marked this pull request as ready for review 227 days ago
yaochengji added ready
yaochengji approved these changes on 2025-06-18
yaochengji enabled auto-merge (squash) 227 days ago
yaochengji merged 04fefe7c into main 227 days ago
