vllm
f790ad3c
- [Frontend][OpenAI] Support for returning max_model_len on /v1/models response (#4643)
Committed 1 year ago
References
#4643 - [Frontend][OpenAI] Support for returning max_model_len on /v1/models response
Author
Avinash-Raj
Parents
ed59a7ed
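After this change, each model object returned by the OpenAI-compatible `/v1/models` endpoint carries a `max_model_len` field alongside the standard fields. A minimal sketch of reading it client-side, assuming a hypothetical response body with illustrative model id and length values:

```python
import json


def max_model_lens(models_response: dict) -> dict:
    """Map model id -> max_model_len from a /v1/models response body."""
    return {m["id"]: m.get("max_model_len") for m in models_response["data"]}


# Illustrative /v1/models response shape after this commit
# (model id and length are example values, not from the source).
sample = json.loads("""
{
  "object": "list",
  "data": [
    {
      "id": "example-org/example-model",
      "object": "model",
      "max_model_len": 4096
    }
  ]
}
""")

print(max_model_lens(sample))
```

A client can use this to size prompts up front instead of discovering the context limit through a rejected request.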