Add test on main branch of vllm #1175
e438e2d4 Upgrade vLLM from 0.10.1.1 to 0.14.1
f54496a0 Fix vLLM slow test OOM by reducing GPU memory utilization and improvi…
3f606c5c Fix vLLM CI test by increasing gpu_memory_utilization to 0.4
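The two OOM-fix commits above tune how much GPU memory vLLM preallocates. A minimal sketch of the relevant engine settings, kept as a plain dict since constructing the real engine needs a GPU; the model name and `max_model_len` value are illustrative assumptions, not values taken from this PR:

```python
# Engine settings of the kind the OOM-fix commits adjust. In real use
# these kwargs would be passed as vllm.LLM(**engine_kwargs).
engine_kwargs = {
    "model": "HuggingFaceTB/SmolLM-135M",  # hypothetical small CI model
    # Cap vLLM's GPU memory preallocation at 40%, as in the
    # "increasing gpu_memory_utilization to 0.4" commit (default: 0.9).
    "gpu_memory_utilization": 0.4,
    # A short context length further shrinks the preallocated KV cache.
    "max_model_len": 2048,
}

print(engine_kwargs["gpu_memory_utilization"])
```

Lowering `gpu_memory_utilization` trades throughput (smaller KV cache, fewer concurrent sequences) for headroom on shared CI GPUs.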
9d598f4e Fix vLLM CI test and add GPU memory monitoring
faecdff0 Fix vLLM CI: Add CUDA environment setup for FlashInfer JIT compilation
5d30f87c Fix vLLM CI: Pass CUDA environment variables to test subprocess
6de62f22 Install CUDA Toolkit 12.8 in CI for vLLM FlashInfer JIT compilation
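The FlashInfer commits above describe installing CUDA Toolkit 12.8 and exposing it to the test process so JIT compilation can find `nvcc`. A hedged sketch of that environment setup; the install path is the conventional toolkit location, not quoted from the PR's workflow file:

```shell
# Assumed toolkit location after installing CUDA Toolkit 12.8 in CI.
export CUDA_HOME=/usr/local/cuda-12.8
# Put nvcc on PATH so FlashInfer's JIT compilation can invoke it.
export PATH="$CUDA_HOME/bin:$PATH"
# Make the CUDA runtime libraries visible to the test subprocess.
export LD_LIBRARY_PATH="$CUDA_HOME/lib64:${LD_LIBRARY_PATH:-}"
```

Per the "Pass CUDA environment variables to test subprocess" commit, these variables also have to be forwarded to any subprocess the tests spawn, since a child process does not inherit variables set only inside a workflow step that launched a different shell.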
604ad83b Fix vLLM v0.15.x API compatibility: use max_model_len instead of max_…
62e28f41 Fix vLLM v0.15.x generate() API: use prompts parameter instead of pro…
045b5cdd Fix vLLM v0.15.x prompt_logprobs API: increase top-k and handle dict …
2ee90722 Fix vLLM v0.15.x logprobs API compatibility
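The logprobs-compatibility commits above deal with vLLM changing the shape of logprob entries across versions: newer releases wrap each value in a `Logprob` object exposing a `.logprob` attribute, while older releases stored a bare float. A minimal sketch of a shape-tolerant accessor; this helper and the `FakeLogprob` stand-in are ours, not code from the PR:

```python
def extract_logprob(entry):
    """Return a float logprob from either API shape.

    Accepts vLLM's newer Logprob-object entries (via .logprob) as well
    as older bare-float entries; returns None for missing positions.
    """
    if entry is None:
        return None
    # Object shape -> .logprob attribute; bare float -> the value itself.
    return getattr(entry, "logprob", entry)


class FakeLogprob:
    """Stand-in for vLLM's Logprob object, for illustration only."""

    def __init__(self, logprob):
        self.logprob = logprob


# One position's candidates, keyed by token id, in both shapes.
old_style = {42: -1.5}
new_style = {42: FakeLogprob(-1.5)}
print(extract_logprob(old_style[42]), extract_logprob(new_style[42]))
```

Normalizing at the access point like this lets the rest of the evaluation code stay version-agnostic instead of branching on the installed vLLM release.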
ee892f6b working omg
cffac15c revert
19e159b6 revert
ae94ff8e revert
356779aa Fix slow_tests workflow: update Python dev headers from 3.10 to 3.12
5f93c8b5 lower memory need
bf82ffce add debug prints
07a58462 upgrade ruff
ebe256d8 upgrade ruff
4713f49e fix dependencies
5bf85388 fix dependencies
e00a6b8b Merge remote-tracking branch 'origin/main' into upgrade/vllm-0.14.1
56629606 workflow test against vllm nightly
5970a2c1 workflow test against vllm nightly
NathanHB changed the title from "Upgrade/vllm 0.14.1" to "Add test on main branch of vllm" 41 days ago
NathanHB merged commit 33acf35f into main 41 days ago