vllm
Remove torch_xla.tpu.version() from pallas.py.
#21065
Merged
yaochengji merged 1 commit into vllm-project:main from QiliangCui:fix-test1
Commit d28e8b1c: Remove torch_xla.tpu.version() from pallas.py.
QiliangCui requested reviews from WoosukKwon, robertgshaw2-redhat, njhill, ywang96, comaniac, and alexm-redhat 177 days ago
mergify added the v1 and tpu labels
gemini-code-assist commented on 2025-07-16
yaochengji approved these changes on 2025-07-16
yaochengji enabled auto-merge (squash) 177 days ago
github-actions added the ready label
bythew3i approved these changes on 2025-07-16
yaochengji merged 72ad2735 into main 177 days ago
Reviewers: yaochengji, bythew3i, gemini-code-assist, WoosukKwon, robertgshaw2-redhat, njhill, ywang96, comaniac, alexm-redhat
Assignees: No one assigned
Labels: tpu, ready, v1
Milestone: No milestone