vllm
[Docker] Allow FlashInfer to be built in the ARM CUDA Dockerfile
#21013
Merged
vllm-bot merged 2 commits into vllm-project:main from neuralmagic:arm-dockerfile-flashinfer
Allow FlashInfer to be built in the ARM CUDA Dockerfile (2bd00be9)
mergify added the ci/build label
gemini-code-assist commented on 2025-07-15
Indent (625706d0)
mgoin added the ready label
DarkLight1337 approved these changes on 2025-07-17
vllm-bot merged a50d9182 into main 159 days ago
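For context, a minimal sketch of the kind of change this PR enables: compiling FlashInfer from source inside an aarch64 (ARM) CUDA image, where a prebuilt wheel may not be available. The base image tag, CUDA arch list, and install flags below are illustrative assumptions, not the exact contents of vLLM's Dockerfile.

```dockerfile
# Illustrative sketch only, not the actual vLLM Dockerfile stage.
ARG CUDA_VERSION=12.8.1
FROM nvcr.io/nvidia/cuda:${CUDA_VERSION}-devel-ubuntu22.04 AS flashinfer-builder

# Limit compilation to the GPU architectures expected on ARM hosts
# (e.g. Hopper on GH200); adjust this list for your hardware.
ARG TORCH_CUDA_ARCH_LIST="9.0a"
ENV TORCH_CUDA_ARCH_LIST=${TORCH_CUDA_ARCH_LIST}

RUN apt-get update && apt-get install -y --no-install-recommends \
        git python3 python3-pip ninja-build \
    && rm -rf /var/lib/apt/lists/*

# FlashInfer's build needs PyTorch; assumes an aarch64 CUDA torch wheel is installable.
RUN python3 -m pip install torch

# No prebuilt FlashInfer wheel is assumed for aarch64, so build it from source.
RUN git clone --recursive https://github.com/flashinfer-ai/flashinfer.git /flashinfer \
    && python3 -m pip install --no-build-isolation -v /flashinfer
```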
Reviewers: DarkLight1337, gemini-code-assist
Assignees: No one assigned
Labels: ready, ci/build
Milestone: No milestone