vllm
[Frontend][OpenAI] Support for returning max_model_len on /v1/models response #4643
Merged
Avinash-Raj: support for returning max_model_len on openai /v1/model response (commit a3a0cbfc)
DarkLight1337 approved these changes on 2024-05-31
DarkLight1337 enabled auto-merge (squash) 1 year ago
DarkLight1337 merged f790ad3c into main 1 year ago
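As a hedged sketch of how a client might consume the new field: after this change, each entry in the `/v1/models` response from vLLM's OpenAI-compatible server carries a `max_model_len` value. The sample payload below is illustrative (model id and length are assumptions, not taken from the PR); the parsing logic is the part being demonstrated.

```python
import json

# Illustrative /v1/models response body; max_model_len per model is the
# field this PR adds. Model id and value here are made-up examples.
sample_body = json.dumps({
    "object": "list",
    "data": [
        {
            "id": "meta-llama/Llama-2-7b-hf",
            "object": "model",
            "owned_by": "vllm",
            "max_model_len": 4096,
        }
    ],
})

def max_model_lens(body: str) -> dict:
    """Map each model id to its max_model_len from a /v1/models response."""
    payload = json.loads(body)
    return {m["id"]: m["max_model_len"] for m in payload.get("data", [])}

print(max_model_lens(sample_body))
```

A client could use such a lookup to size prompts before submitting requests, instead of hard-coding a context length per deployment.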
