vllm
[Bugfix] Treat generation_config max_tokens as default not ceiling #34063 · Merged
vllm-bot merged 6 commits into vllm-project:main from almogtavor:almogtavor/fix-generation-config-max-tokens
f103fb04  fix: treat generation_config max_tokens as default, not ceiling (#34005)
almogtavor requested reviews from DarkLight1337, robertgshaw2-redhat, aarnphm, NickLucche, and chaunceyjiang 94 days ago
mergify added the frontend and bug labels
gemini-code-assist commented on 2026-02-07
DarkLight1337 commented on 2026-02-08
a22bcc49  refactor: rename variables in get_max_tokens for clarity
DarkLight1337 approved these changes on 2026-02-09
DarkLight1337 enabled auto-merge (squash) 92 days ago
github-actions added the ready label
65927a8a  Merge branch 'main' into almogtavor/fix-generation-config-max-tokens
52f6fc17  Merge branch 'main' into almogtavor/fix-generation-config-max-tokens
3d2d0168  Merge branch 'main' into almogtavor/fix-generation-config-max-tokens
b25fadbc  fix: keep the support for the max_tokens override
Auto-merge disabled 90 days ago (head branch was pushed to by a user without write access)
almogtavor requested a review from DarkLight1337 90 days ago
DarkLight1337 commented on 2026-02-12
vllm-bot merged 72d5951d into main 85 days ago
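The behavior the PR title describes — treating generation_config's max_tokens as a default applied when the request omits max_tokens, rather than a ceiling that silently caps an explicit request — can be sketched as follows. The name get_max_tokens comes from the refactor commit above; the parameter names and surrounding logic are illustrative assumptions, not vLLM's actual implementation.

```python
from typing import Optional


def get_max_tokens(max_model_len: int, prompt_len: int,
                   requested: Optional[int],
                   config_default: Optional[int]) -> int:
    """Resolve the effective max_tokens for a request (illustrative sketch).

    The remaining room in the context window is always a hard ceiling.
    Before the fix, generation_config's max_tokens also acted as a ceiling,
    silently truncating requests that explicitly asked for more; after the
    fix it is only a fallback used when the request omits max_tokens.
    """
    room = max_model_len - prompt_len
    if requested is not None:
        # An explicit request wins; only the context window caps it.
        return min(requested, room)
    if config_default is not None:
        # No explicit value: fall back to generation_config's default.
        return min(config_default, room)
    return room


# A request asking for 512 tokens is honored even though the
# generation_config default is 128:
print(get_max_tokens(4096, 1000, 512, 128))   # → 512
# With no explicit max_tokens, the config default applies:
print(get_max_tokens(4096, 1000, None, 128))  # → 128
```

Under the pre-fix ceiling semantics, the first call would have returned 128 despite the explicit request for 512; only the context window (here 4096 − 1000 = 3096 tokens) remains a hard cap in the fixed version.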
Reviewers: DarkLight1337, gemini-code-assist, robertgshaw2-redhat, aarnphm, NickLucche, chaunceyjiang
Assignees: no one assigned
Labels: bug, frontend, ready
Milestone: no milestone