vllm
2ffb9b6e - [Bugfix] model_max_length should consider max_model_len in tokenizer_config (#19201)

Committed 322 days ago