text-generation-inference
Baichuan2-13B does not have max_position_embeddings in config #2903
Merged

Narsil merged 3 commits into huggingface:main from sywangyi:baichuan2-13b
sywangyi: Baichuan2-13B does not have max_position_embeddings in config (5ad8c9a4)
danieldk commented on 2025-01-13
sywangyi: Update server/text_generation_server/models/flash_causal_lm.py (22ed5703)
sywangyi: fmt (48067e4a)
Narsil approved these changes on 2025-01-15
Narsil merged cc8b9650 into main 1 year ago
sywangyi deleted the baichuan2-13b branch 349 days ago
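
The issue behind this PR is that Baichuan2-13B's config does not expose a `max_position_embeddings` field, so code that reads the sequence-length limit from the config needs a fallback. Below is a minimal sketch of that kind of fallback, assuming `model_max_length` as the alternate attribute and 2048 as a conservative default; these names and values are illustrative assumptions, not the exact change merged into `flash_causal_lm.py`.

```python
# A minimal sketch, not the PR's actual diff: derive a maximum sequence
# length when a model config (e.g. Baichuan2-13B's) lacks
# `max_position_embeddings`. The fallback attribute `model_max_length`
# and the default of 2048 are illustrative assumptions.
from transformers import AutoConfig, PretrainedConfig


def resolve_max_position_embeddings(config: PretrainedConfig, default: int = 2048) -> int:
    # Standard case: most configs define `max_position_embeddings` directly.
    value = getattr(config, "max_position_embeddings", None)
    if value is not None:
        return value

    # Baichuan2-13B-style configs publish the limit under a different name.
    value = getattr(config, "model_max_length", None)
    if value is not None:
        return value

    # Conservative fallback when neither attribute is present.
    return default


if __name__ == "__main__":
    # Baichuan configs require trust_remote_code to load their custom config class.
    config = AutoConfig.from_pretrained(
        "baichuan-inc/Baichuan2-13B-Chat", trust_remote_code=True
    )
    print(resolve_max_position_embeddings(config))
```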
