vllm
Commit 2ffb9b6e: [Bugfix] model_max_length should consider max_model_len in tokenizer_config (#19201)
Committed 322 days ago
References
#19201 - [Bugfix] model_max_length should consider max_model_len in tokenizer_config
Author: noooop
Parent: cda10fa3