vllm
b692e9cd
- [Misc] Fix skipped max-model-len validation when deriving max model length from tokenizer config (#19660)
Commit
235 days ago
[Misc] Fix skipped max-model-len validation when deriving max model length from tokenizer config (#19660)

Signed-off-by: Ye (Charlotte) Qi <yeq@meta.com>
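A minimal sketch of the kind of check this commit restores: when the model config carries no explicit maximum length, the limit can be derived from the tokenizer config, but a user-supplied `max_model_len` must still be validated against that derived cap rather than silently accepted. All function and key names here (`derive_max_len`, `validate_max_model_len`, `model_max_length`) are illustrative assumptions, not vLLM's actual internals.

```python
from typing import Optional


def derive_max_len(tokenizer_config: dict, default: int = 2048) -> int:
    # Hypothetical fallback: use the tokenizer config's limit if present.
    return int(tokenizer_config.get("model_max_length", default))


def validate_max_model_len(
    user_max_len: Optional[int], tokenizer_config: dict
) -> int:
    derived = derive_max_len(tokenizer_config)
    if user_max_len is None:
        # No user override: the derived value becomes the effective limit.
        return derived
    # The bug being fixed: this validation was skipped when the limit
    # came from the tokenizer config, letting an oversized value through.
    if user_max_len > derived:
        raise ValueError(
            f"max_model_len ({user_max_len}) exceeds the maximum "
            f"({derived}) derived from the tokenizer config."
        )
    return user_max_len
```

With this shape, `validate_max_model_len(8192, {"model_max_length": 4096})` raises instead of quietly running with a context length the tokenizer config does not support.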
References
#19660 - [Misc] Fix skipped max-model-len validation when deriving max model length from tokenizer config
Author
yeqcharlotte
Parents
367871a4