vllm-project/vllm
[Frontend][OpenAI] Support for returning max_model_len on /v1/models response #4643 (Merged)
DarkLight1337 merged 1 commit into vllm-project:main from Avinash-Raj:return-max-model-len-on-openai-models-response.

Commit a3a0cbfc: support for returning max_model_len on openai /v1/model response
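The change is small in scope: the model card returned by the OpenAI-compatible frontend gains a max_model_len field, populated from the engine's configured maximum context length. The sketch below is a minimal illustration of that shape, not the exact vLLM code; the fields other than max_model_len are simplified assumptions.

```python
from typing import Optional

from pydantic import BaseModel


class ModelCard(BaseModel):
    """Simplified sketch of an OpenAI-style model card served at /v1/models."""

    id: str
    object: str = "model"
    owned_by: str = "vllm"
    # Field added by this PR: the maximum context length (prompt plus
    # generated tokens) the served model supports, taken from the engine's
    # model configuration.
    max_model_len: Optional[int] = None
```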
DarkLight1337 approved these changes on 2024-05-31.
DarkLight1337 enabled auto-merge (squash).
DarkLight1337 merged commit f790ad3c into main.
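With the change merged, a client can discover a served model's context window from the API itself instead of out-of-band configuration. A minimal sketch, assuming a vLLM OpenAI-compatible server running at localhost:8000 (the base URL is an assumption for illustration):

```python
import requests

# List the served models and read the new max_model_len field from each card.
resp = requests.get("http://localhost:8000/v1/models")
resp.raise_for_status()
for model in resp.json()["data"]:
    # /v1/models returns an OpenAI-style model list; after this PR each
    # entry also carries the model's maximum context length.
    print(model["id"], model.get("max_model_len"))
```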
Reviewers: DarkLight1337
Assignees: No one assigned
Labels: None yet
Milestone: No milestone