lighteval
af6b5b4f - Update LiteLLM configuration for hosted_vllm provider (#1060)

Commit · 34 days ago
Update LiteLLM configuration for hosted_vllm provider (#1060): although vLLM exposes an OpenAI-compatible endpoint, LiteLLM does not route to it automatically. To make it work, you have to set the provider to `hosted_vllm` and add a `hosted_vllm/` prefix before the model name.
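A minimal sketch of what this looks like in a LiteLLM-style model config. The field names, model name, and endpoint URL here are illustrative assumptions, not taken from the commit; only the `hosted_vllm/` prefix convention comes from the commit message:

```yaml
# Hypothetical model config (field names assumed for illustration):
model:
  model_name: "hosted_vllm/meta-llama/Llama-3.1-8B-Instruct"  # hosted_vllm/ prefix selects the hosted_vllm provider
  base_url: "http://localhost:8000/v1"                         # the vLLM server's OpenAI-compatible endpoint
```

Without the prefix, LiteLLM would try to resolve the bare model name against its built-in provider list instead of the self-hosted vLLM endpoint.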