Add support for vllm >= 0.19.0 #1211
Fix vLLM 0.11 compatibility and restore hellaswag_cf (e7730009)
Support vLLM 0.19 prompt schema (4caaeb32)
lewtun commented on 2026-04-12
Address vLLM PR review feedback (9a110250)
Remove temporary hellaswag_cf task (2bc5af73)
lewtun commented on 2026-04-12
Clarify vLLM compatibility branches (7106e88a)
lewtun commented on 2026-04-13
NathanHB approved these changes on 2026-04-13
Handle tied MCQ logits in slow sample comparisons (bcb902ff)
Handle flat VLM token outputs in tie checks (e8356067)
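The two commits above deal with tied multiple-choice logits when comparing sample outputs across vLLM versions. The PR does not show the implementation, but the general technique is to treat any choice within floating-point tolerance of the maximum logit as an acceptable prediction, rather than comparing raw argmax indices. A minimal sketch (the function name, arguments, and tolerance are hypothetical, not taken from the PR):

```python
import math


def mcq_predictions_match(logits_a, logits_b, atol=1e-5):
    """Compare two sets of MCQ choice logits, treating near-ties as equivalent.

    Comparing argmax indices directly is brittle: two choices can score
    within floating-point noise of each other, and different backends may
    break the tie differently. Instead, accept any choice whose logit is
    within `atol` of the maximum, and pass if the two runs share at least
    one top choice. (Hypothetical helper, not code from the PR.)
    """
    def near_max_indices(logits):
        top = max(logits)
        return {i for i, x in enumerate(logits) if math.isclose(x, top, abs_tol=atol)}

    return bool(near_max_indices(logits_a) & near_max_indices(logits_b))


# Exact tie between choices 0 and 2 in the first run, noisy near-tie in the
# second: either argmax is acceptable, so the comparison passes.
print(mcq_predictions_match([1.0, 0.2, 1.0], [1.0, 0.2, 0.999999]))  # True
```

A comparison like this only loosens the check at genuine ties; runs that clearly disagree on the top choice still fail.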
NathanHB merged 34889df3 into main 2 days ago
lewtun deleted the vllm-0.19-compat branch 2 days ago