lighteval
Upgrade vLLM from 0.10.1.1 to 0.14.1 #1173
Merged
NathanHB merged 22 commits into main from upgrade/vllm-0.14.1
NathanHB Upgrade vLLM from 0.10.1.1 to 0.14.1
e438e2d4
NathanHB Fix vLLM slow test OOM by reducing GPU memory utilization and improvi…
f54496a0
NathanHB Fix vLLM CI test by increasing gpu_memory_utilization to 0.4
3f606c5c
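The commit above raises `gpu_memory_utilization` to 0.4 to stop the CI OOM. A minimal sketch of the kind of kwargs involved — the model name and `max_model_len` value here are hypothetical placeholders; only the 0.4 figure comes from the commit message:

```python
# Hedged sketch: model-construction kwargs of the kind the CI fix adjusts.
# Only gpu_memory_utilization=0.4 is taken from the commit above; the model
# name and max_model_len are hypothetical.
vllm_model_args = {
    "model": "HuggingFaceTB/SmolLM2-135M",   # hypothetical small CI model
    "gpu_memory_utilization": 0.4,           # fraction of GPU memory vLLM may reserve
    "max_model_len": 2048,                   # hypothetical sequence-length cap
}

# On a GPU machine these would be splatted into vllm.LLM(**vllm_model_args).
print(vllm_model_args["gpu_memory_utilization"])
```

Lowering (or carefully raising) this fraction trades KV-cache headroom against memory left for other processes on the shared CI GPU.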
NathanHB Fix vLLM CI test and add GPU memory monitoring
9d598f4e
NathanHB Fix vLLM CI: Add CUDA environment setup for FlashInfer JIT compilation
faecdff0
NathanHB Fix vLLM CI: Pass CUDA environment variables to test subprocess
5d30f87c
NathanHB Install CUDA Toolkit 12.8 in CI for vLLM FlashInfer JIT compilation
6de62f22
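The three CI commits above install the CUDA toolkit and forward CUDA environment variables to the test subprocess so FlashInfer's JIT compiler can find `nvcc`. A minimal sketch, assuming a hypothetical install path (`/usr/local/cuda-12.8`); the actual CI paths may differ:

```python
import os
import subprocess
import sys

# Hypothetical CUDA toolkit location; the real CI path may differ.
cuda_home = "/usr/local/cuda-12.8"

# Build the environment for the subprocess, prepending the CUDA paths.
env = os.environ.copy()
env["CUDA_HOME"] = cuda_home
env["PATH"] = f"{cuda_home}/bin:" + env.get("PATH", "")
env["LD_LIBRARY_PATH"] = f"{cuda_home}/lib64:" + env.get("LD_LIBRARY_PATH", "")

# Run the test in a subprocess with the CUDA variables passed explicitly,
# so a JIT compiler invoked inside it can locate nvcc and the CUDA libraries.
result = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['CUDA_HOME'])"],
    env=env, capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```

Passing `env=` explicitly matters because `subprocess.run` does not otherwise guarantee the child sees variables exported only in a CI step's shell.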
NathanHB Fix vLLM v0.15.x API compatibility: use max_model_len instead of max_…
604ad83b
NathanHB Fix vLLM v0.15.x generate() API: use prompts parameter instead of pro…
62e28f41
NathanHB NathanHB force pushed from 6d4c9eae to 62e28f41 13 days ago
NathanHB Fix vLLM v0.15.x prompt_logprobs API: increase top-k and handle dict …
045b5cdd
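The commit above handles the newer vLLM `prompt_logprobs` shape, where each prompt position is a dict mapping token id to a `Logprob`-like object rather than a bare float. A hedged compatibility helper — the function name and shapes are assumptions for illustration, not lighteval's actual code:

```python
def extract_prompt_logprob(entry, token_id):
    """Return the log-probability of token_id from one prompt_logprobs entry.

    Handles both the newer dict form ({token_id: object with a .logprob
    attribute}) and an older plain-float form. This is a sketch, not
    lighteval's actual implementation.
    """
    if entry is None:          # vLLM emits None for the first prompt position
        return None
    if isinstance(entry, dict):
        value = entry.get(token_id)
        # Logprob-like objects carry .logprob; fall back to the raw value.
        return getattr(value, "logprob", value)
    return entry               # already a float


# Minimal stand-in for vLLM's Logprob object, for demonstration only.
class FakeLogprob:
    def __init__(self, logprob):
        self.logprob = logprob


print(extract_prompt_logprob({42: FakeLogprob(-1.5)}, 42))
print(extract_prompt_logprob(-0.25, 42))
```

Branching on the entry's type keeps one code path working across both vLLM API generations instead of pinning the test suite to a single version.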
NathanHB Fix vLLM v0.15.x logprobs API compatibility
2ee90722
NathanHB working omg
ee892f6b
NathanHB revert
cffac15c
NathanHB commented on 2026-03-03
NathanHB revert
19e159b6
NathanHB revert
ae94ff8e
NathanHB Fix slow_tests workflow: update Python dev headers from 3.10 to 3.12
356779aa
NathanHB lower memory need
5f93c8b5
NathanHB add debug prints
bf82ffce
NathanHB upgrade ruff
07a58462
NathanHB upgrade ruff
ebe256d8
NathanHB fix dependencies
4713f49e
NathanHB fix dependencies
5bf85388
NathanHB NathanHB merged 0a74a170 into main 1 day ago

Reviewers: no reviews
Assignees: no one assigned