text-generation-inference
22ed5703
- Update server/text_generation_server/models/flash_causal_lm.py
Commit
1 year ago
Update server/text_generation_server/models/flash_causal_lm.py

Co-authored-by: Daniël de Kok <me@github.danieldk.eu>
References
#2903 - Baichuan2-13B does not have max_position_embeddings in config
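The referenced issue is that some model configs (Baichuan2-13B among them) ship without a `max_position_embeddings` field, so code that reads it unconditionally fails. A common defensive pattern is to probe the config with `getattr` and fall back to an alternative field or a default. The sketch below is illustrative only; the function name, fallback field, and default value are assumptions, not the actual text-generation-inference implementation:

```python
# Hypothetical sketch of guarding against configs that lack
# max_position_embeddings (e.g. Baichuan2-13B). Names and the
# fallback chain are illustrative assumptions.

class DummyConfig:
    """Stand-in for a model config missing max_position_embeddings."""
    model_max_length = 4096  # some configs expose this field instead


def resolve_max_position_embeddings(config, default=2048):
    # Prefer max_position_embeddings when present, then fall back
    # to model_max_length, then to a conservative default.
    value = getattr(config, "max_position_embeddings", None)
    if value is None:
        value = getattr(config, "model_max_length", None)
    return value if value is not None else default


print(resolve_max_position_embeddings(DummyConfig()))  # → 4096
```

With this guard, a config that omits the attribute no longer raises `AttributeError`; it simply falls through to the next available source.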
Author
sywangyi
Parents
5ad8c9a4