lighteval
af6b5b4f
- Update LiteLLM configuration for hosted_vllm provider (#1060)
34 days ago
Update LiteLLM configuration for hosted_vllm provider (#1060): even though vLLM serves an OpenAI-compatible endpoint, to make it work with LiteLLM you must set the provider to hosted_vllm and add a hosted_vllm/ prefix before the model name.
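As a minimal sketch of the naming convention the commit describes, LiteLLM routes a request to the hosted_vllm provider based on the prefix in the model identifier. The base model name below is illustrative, not taken from the commit:

```python
# LiteLLM selects the provider from the prefix of the model string.
# For a self-hosted vLLM server, the prefix must be "hosted_vllm/".
base_model = "meta-llama/Llama-3.1-8B-Instruct"  # illustrative model name
litellm_model = f"hosted_vllm/{base_model}"      # provider prefix required by LiteLLM
print(litellm_model)
```

The resulting string is what gets passed as the model identifier to LiteLLM, alongside an `api_base` pointing at the vLLM server's OpenAI-compatible endpoint.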
References
#1060 - Update LiteLLM configuration for hosted_vllm provider
Author
abhiram1809
Parents
391d5b40